Apr 16 20:11:57.029786 ip-10-0-137-142 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 20:11:57.501182 ip-10-0-137-142 kubenswrapper[2572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:11:57.501182 ip-10-0-137-142 kubenswrapper[2572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 20:11:57.501182 ip-10-0-137-142 kubenswrapper[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:11:57.501182 ip-10-0-137-142 kubenswrapper[2572]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 20:11:57.501182 ip-10-0-137-142 kubenswrapper[2572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
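The deprecation warnings above all point at the same migration: each of these kubelet command-line flags should move into the KubeletConfiguration file named by --config (here /etc/kubernetes/kubelet.conf). A minimal sketch of the equivalent fields follows; the field names are from the kubelet config API (kubelet.config.k8s.io/v1beta1), but the concrete values are illustrative assumptions, not read from this node:

```yaml
# KubeletConfiguration equivalents for the deprecated flags warned about above.
# Values are illustrative, not taken from this log.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock  # replaces --container-runtime-endpoint
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec  # replaces --volume-plugin-dir
systemReserved:            # replaces --system-reserved
  cpu: 500m
  memory: 1Gi
evictionHard:              # eviction thresholds replace --minimum-container-ttl-duration
  memory.available: 100Mi
```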
Apr 16 20:11:57.503015 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.502888 2572 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 20:11:57.506206 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506180 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:11:57.506206 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506200 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:11:57.506206 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506203 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:11:57.506206 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506207 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:11:57.506206 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506210 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:11:57.506206 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506213 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:11:57.506449 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506217 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:11:57.506449 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506220 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:11:57.506449 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506223 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:11:57.506449 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506225 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:11:57.506449 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506228 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:11:57.506449 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506231 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:11:57.506449 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506233 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:11:57.506449 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506236 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:11:57.506449 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506238 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:11:57.506449 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506240 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:11:57.506449 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506243 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:11:57.506449 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506246 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:11:57.506449 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506248 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:11:57.506449 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506251 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:11:57.506449 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506253 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:11:57.506449 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506256 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:11:57.506449 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506259 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:11:57.506449 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506261 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:11:57.506449 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506264 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:11:57.506904 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506269 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:11:57.506904 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506272 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:11:57.506904 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506274 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:11:57.506904 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506277 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:11:57.506904 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506279 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:11:57.506904 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506282 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:11:57.506904 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506284 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:11:57.506904 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506287 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:11:57.506904 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506289 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:11:57.506904 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506292 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:11:57.506904 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506295 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:11:57.506904 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506297 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:11:57.506904 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506300 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:11:57.506904 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506302 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:11:57.506904 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506307 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:11:57.506904 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506309 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:11:57.506904 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506312 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:11:57.506904 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506314 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:11:57.506904 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506317 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:11:57.506904 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506320 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:11:57.507429 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506322 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:11:57.507429 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506325 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:11:57.507429 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506328 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:11:57.507429 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506330 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:11:57.507429 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506333 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:11:57.507429 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506336 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:11:57.507429 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506339 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:11:57.507429 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506341 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:11:57.507429 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506344 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:11:57.507429 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506348 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:11:57.507429 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506352 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:11:57.507429 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506355 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:11:57.507429 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506357 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:11:57.507429 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506360 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:11:57.507429 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506362 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:11:57.507429 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506365 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:11:57.507429 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506367 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:11:57.507429 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506370 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:11:57.507429 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506390 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:11:57.507429 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506393 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:11:57.507915 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506396 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:11:57.507915 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506399 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:11:57.507915 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506401 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:11:57.507915 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506404 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:11:57.507915 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506406 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:11:57.507915 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506409 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:11:57.507915 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506412 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:11:57.507915 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506415 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:11:57.507915 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506417 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:11:57.507915 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506421 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:11:57.507915 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506424 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:11:57.507915 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506426 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:11:57.507915 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506429 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:11:57.507915 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506440 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:11:57.507915 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506443 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:11:57.507915 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506446 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:11:57.507915 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506450 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:11:57.507915 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506454 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:11:57.507915 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506458 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:11:57.508433 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506461 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:11:57.508433 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506464 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:11:57.508433 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506900 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:11:57.508433 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506904 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:11:57.508433 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506909 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:11:57.508433 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506912 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:11:57.508433 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506915 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:11:57.508433 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506918 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:11:57.508433 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506920 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:11:57.508433 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506923 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:11:57.508433 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506926 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:11:57.508433 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506928 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:11:57.508433 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506930 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:11:57.508433 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506933 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:11:57.508433 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506936 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:11:57.508433 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506938 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:11:57.508433 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506941 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:11:57.508433 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506943 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:11:57.508433 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506946 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:11:57.508884 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506950 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:11:57.508884 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506952 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:11:57.508884 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506955 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:11:57.508884 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506957 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:11:57.508884 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506960 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:11:57.508884 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506963 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:11:57.508884 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506965 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:11:57.508884 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506975 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:11:57.508884 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506979 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:11:57.508884 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506982 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:11:57.508884 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506985 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:11:57.508884 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506987 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:11:57.508884 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506990 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:11:57.508884 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506993 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:11:57.508884 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506996 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:11:57.508884 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.506998 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:11:57.508884 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507001 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:11:57.508884 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507003 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:11:57.508884 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507006 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:11:57.509360 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507008 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:11:57.509360 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507011 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:11:57.509360 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507013 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:11:57.509360 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507015 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:11:57.509360 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507020 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:11:57.509360 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507022 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:11:57.509360 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507025 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:11:57.509360 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507027 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:11:57.509360 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507029 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:11:57.509360 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507032 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:11:57.509360 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507035 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:11:57.509360 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507037 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:11:57.509360 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507040 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:11:57.509360 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507042 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:11:57.509360 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507045 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:11:57.509360 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507047 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:11:57.509360 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507050 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:11:57.509360 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507052 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:11:57.509360 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507055 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:11:57.509360 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507057 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:11:57.509875 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507060 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:11:57.509875 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507068 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:11:57.509875 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507071 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:11:57.509875 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507073 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:11:57.509875 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507076 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:11:57.509875 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507078 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:11:57.509875 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507081 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:11:57.509875 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507083 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:11:57.509875 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507085 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:11:57.509875 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507088 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:11:57.509875 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507090 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:11:57.509875 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507093 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:11:57.509875 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507095 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:11:57.509875 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507097 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:11:57.509875 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507100 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:11:57.509875 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507102 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:11:57.509875 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507105 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:11:57.509875 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507107 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:11:57.509875 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507110 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:11:57.509875 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507112 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:11:57.510393 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507115 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:11:57.510393 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507117 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:11:57.510393 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507120 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:11:57.510393 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507122 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:11:57.510393 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507125 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:11:57.510393 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507127 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:11:57.510393 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507130 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:11:57.510393 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507132 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:11:57.510393 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507135 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:11:57.510393 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507137 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:11:57.510393 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507222 2572 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 20:11:57.510393 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507229 2572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 20:11:57.510393 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507244 2572 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 20:11:57.510393 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507248 2572 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 20:11:57.510393 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507256 2572 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 20:11:57.510393 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507260 2572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 20:11:57.510393 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507264 2572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 20:11:57.510393 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507269 2572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 20:11:57.510393 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507273 2572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 20:11:57.510393 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507276 2572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 20:11:57.510393 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507280 2572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 20:11:57.510913 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507283 2572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 20:11:57.510913 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507286 2572 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 20:11:57.510913 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507290 2572 flags.go:64] FLAG: --cgroup-root=""
Apr 16 20:11:57.510913 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507292 2572 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 20:11:57.510913 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507295 2572 flags.go:64] FLAG: --client-ca-file=""
Apr 16 20:11:57.510913 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507303 2572 flags.go:64] FLAG: --cloud-config=""
Apr 16 20:11:57.510913 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507305 2572 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 20:11:57.510913 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507308 2572 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 20:11:57.510913 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507315 2572 flags.go:64] FLAG: --cluster-domain=""
Apr 16 20:11:57.510913 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507317 2572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 20:11:57.510913 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507321 2572 flags.go:64] FLAG: --config-dir=""
Apr 16 20:11:57.510913 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507323 2572 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 20:11:57.510913 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507326 2572 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 20:11:57.510913 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507331 2572 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 20:11:57.510913 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507333 2572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 20:11:57.510913 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507337 2572 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 20:11:57.510913 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507340 2572 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 20:11:57.510913 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507343 2572 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 20:11:57.510913 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507346 2572 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 20:11:57.510913 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507349 2572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 20:11:57.510913 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507352 2572 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 20:11:57.510913 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507355 2572 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 20:11:57.510913 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507359 2572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 20:11:57.510913 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507362 2572 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 20:11:57.510913 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507365 2572 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 20:11:57.511534 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507368 2572 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 20:11:57.511534 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507390 2572 flags.go:64] FLAG: --enable-server="true"
Apr 16 20:11:57.511534 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507396 2572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 20:11:57.511534 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507408 2572 flags.go:64] FLAG: --event-burst="100"
Apr 16 20:11:57.511534 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507411 2572 flags.go:64] FLAG: --event-qps="50"
Apr 16 20:11:57.511534 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507414 2572 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 20:11:57.511534 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507417 2572 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 20:11:57.511534 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507420 2572 flags.go:64] FLAG: --eviction-hard=""
Apr 16 20:11:57.511534 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507423 2572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 20:11:57.511534 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507426 2572 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 20:11:57.511534 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507430 2572 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 20:11:57.511534 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507433 2572 flags.go:64] FLAG: --eviction-soft=""
Apr 16 20:11:57.511534 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507437 2572 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 20:11:57.511534 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507440 2572 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 20:11:57.511534 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507443 2572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 20:11:57.511534 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507446 2572 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 20:11:57.511534 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507449 2572 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 20:11:57.511534 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507452 2572 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 20:11:57.511534 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507455 2572 flags.go:64] FLAG: --feature-gates=""
Apr 16 20:11:57.511534 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507458 2572 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 20:11:57.511534 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507462 2572 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 20:11:57.511534 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507465 2572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 20:11:57.511534 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507468 2572 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 20:11:57.511534 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507471 2572 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 20:11:57.511534
ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507474 2572 flags.go:64] FLAG: --help="false" Apr 16 20:11:57.512143 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507477 2572 flags.go:64] FLAG: --hostname-override="ip-10-0-137-142.ec2.internal" Apr 16 20:11:57.512143 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507480 2572 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 20:11:57.512143 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507483 2572 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 20:11:57.512143 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507485 2572 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 20:11:57.512143 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507489 2572 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 20:11:57.512143 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507492 2572 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 20:11:57.512143 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507494 2572 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 20:11:57.512143 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507497 2572 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 20:11:57.512143 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507500 2572 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 20:11:57.512143 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507503 2572 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 20:11:57.512143 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507507 2572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 20:11:57.512143 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507510 2572 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 20:11:57.512143 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507513 2572 
flags.go:64] FLAG: --kube-reserved="" Apr 16 20:11:57.512143 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507516 2572 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 20:11:57.512143 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507518 2572 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 20:11:57.512143 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507521 2572 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 20:11:57.512143 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507524 2572 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 20:11:57.512143 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507527 2572 flags.go:64] FLAG: --lock-file="" Apr 16 20:11:57.512143 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507529 2572 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 20:11:57.512143 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507533 2572 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 20:11:57.512143 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507536 2572 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 20:11:57.512143 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507541 2572 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 20:11:57.512143 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507544 2572 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 20:11:57.512706 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507547 2572 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 20:11:57.512706 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507550 2572 flags.go:64] FLAG: --logging-format="text" Apr 16 20:11:57.512706 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507553 2572 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 20:11:57.512706 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507556 2572 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 20:11:57.512706 
ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507559 2572 flags.go:64] FLAG: --manifest-url="" Apr 16 20:11:57.512706 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507562 2572 flags.go:64] FLAG: --manifest-url-header="" Apr 16 20:11:57.512706 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507566 2572 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 20:11:57.512706 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507570 2572 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 20:11:57.512706 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507574 2572 flags.go:64] FLAG: --max-pods="110" Apr 16 20:11:57.512706 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507577 2572 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 20:11:57.512706 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507580 2572 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 20:11:57.512706 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507582 2572 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 20:11:57.512706 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507585 2572 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 20:11:57.512706 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507588 2572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 20:11:57.512706 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507591 2572 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 20:11:57.512706 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507594 2572 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 20:11:57.512706 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507602 2572 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 20:11:57.512706 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507605 2572 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 20:11:57.512706 ip-10-0-137-142 kubenswrapper[2572]: I0416 
20:11:57.507608 2572 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 20:11:57.512706 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507612 2572 flags.go:64] FLAG: --pod-cidr="" Apr 16 20:11:57.512706 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507616 2572 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 20:11:57.512706 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507621 2572 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 20:11:57.512706 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507624 2572 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 20:11:57.512706 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507627 2572 flags.go:64] FLAG: --pods-per-core="0" Apr 16 20:11:57.513300 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507630 2572 flags.go:64] FLAG: --port="10250" Apr 16 20:11:57.513300 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507633 2572 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 20:11:57.513300 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507636 2572 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0d7f124c9fe1512bc" Apr 16 20:11:57.513300 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507639 2572 flags.go:64] FLAG: --qos-reserved="" Apr 16 20:11:57.513300 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507642 2572 flags.go:64] FLAG: --read-only-port="10255" Apr 16 20:11:57.513300 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507646 2572 flags.go:64] FLAG: --register-node="true" Apr 16 20:11:57.513300 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507648 2572 flags.go:64] FLAG: --register-schedulable="true" Apr 16 20:11:57.513300 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507651 2572 flags.go:64] FLAG: --register-with-taints="" Apr 16 20:11:57.513300 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507655 2572 flags.go:64] FLAG: 
--registry-burst="10" Apr 16 20:11:57.513300 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507658 2572 flags.go:64] FLAG: --registry-qps="5" Apr 16 20:11:57.513300 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507660 2572 flags.go:64] FLAG: --reserved-cpus="" Apr 16 20:11:57.513300 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507663 2572 flags.go:64] FLAG: --reserved-memory="" Apr 16 20:11:57.513300 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507666 2572 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 20:11:57.513300 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507669 2572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 20:11:57.513300 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507673 2572 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 20:11:57.513300 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507675 2572 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 20:11:57.513300 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507678 2572 flags.go:64] FLAG: --runonce="false" Apr 16 20:11:57.513300 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507681 2572 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 20:11:57.513300 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507684 2572 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 20:11:57.513300 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507687 2572 flags.go:64] FLAG: --seccomp-default="false" Apr 16 20:11:57.513300 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507689 2572 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 20:11:57.513300 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507692 2572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 20:11:57.513300 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507695 2572 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 20:11:57.513300 ip-10-0-137-142 kubenswrapper[2572]: I0416 
20:11:57.507698 2572 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 20:11:57.513300 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507701 2572 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 20:11:57.513300 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507704 2572 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 20:11:57.513953 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507707 2572 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 20:11:57.513953 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507712 2572 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 20:11:57.513953 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507715 2572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 20:11:57.513953 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507718 2572 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 20:11:57.513953 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507721 2572 flags.go:64] FLAG: --system-cgroups="" Apr 16 20:11:57.513953 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507724 2572 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 20:11:57.513953 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507729 2572 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 20:11:57.513953 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507732 2572 flags.go:64] FLAG: --tls-cert-file="" Apr 16 20:11:57.513953 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507735 2572 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 20:11:57.513953 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507739 2572 flags.go:64] FLAG: --tls-min-version="" Apr 16 20:11:57.513953 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507741 2572 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 20:11:57.513953 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507745 2572 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 
20:11:57.513953 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507748 2572 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 20:11:57.513953 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507751 2572 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 20:11:57.513953 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507753 2572 flags.go:64] FLAG: --v="2" Apr 16 20:11:57.513953 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507758 2572 flags.go:64] FLAG: --version="false" Apr 16 20:11:57.513953 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507762 2572 flags.go:64] FLAG: --vmodule="" Apr 16 20:11:57.513953 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507766 2572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 20:11:57.513953 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.507769 2572 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 20:11:57.513953 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507873 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 20:11:57.513953 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507878 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 20:11:57.513953 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507882 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 20:11:57.513953 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507885 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 20:11:57.514544 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507888 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 20:11:57.514544 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507890 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 20:11:57.514544 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507893 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 20:11:57.514544 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507895 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 20:11:57.514544 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507898 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 20:11:57.514544 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507900 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 20:11:57.514544 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507903 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 20:11:57.514544 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507905 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 20:11:57.514544 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507907 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 20:11:57.514544 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507910 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 20:11:57.514544 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507917 2572 
feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 20:11:57.514544 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507920 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 20:11:57.514544 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507924 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 20:11:57.514544 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507926 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 20:11:57.514544 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507929 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 20:11:57.514544 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507932 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 20:11:57.514544 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507934 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 20:11:57.514544 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507937 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 20:11:57.514544 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507940 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 16 20:11:57.514544 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507942 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 20:11:57.515074 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507945 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 20:11:57.515074 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507948 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 20:11:57.515074 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507951 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 20:11:57.515074 ip-10-0-137-142 kubenswrapper[2572]: W0416 
20:11:57.507953 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 20:11:57.515074 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507955 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 20:11:57.515074 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507958 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 20:11:57.515074 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507961 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 20:11:57.515074 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507963 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 20:11:57.515074 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507966 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 20:11:57.515074 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507968 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 20:11:57.515074 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507971 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 20:11:57.515074 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507973 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 20:11:57.515074 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507976 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 20:11:57.515074 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507978 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 20:11:57.515074 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507980 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 20:11:57.515074 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507983 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 
20:11:57.515074 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507985 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 20:11:57.515074 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507988 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 20:11:57.515074 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507990 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 20:11:57.515559 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507992 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 20:11:57.515559 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507995 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 20:11:57.515559 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.507997 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 20:11:57.515559 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.508000 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 20:11:57.515559 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.508003 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 20:11:57.515559 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.508006 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 20:11:57.515559 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.508009 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 20:11:57.515559 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.508011 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 20:11:57.515559 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.508013 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 20:11:57.515559 ip-10-0-137-142 kubenswrapper[2572]: 
W0416 20:11:57.508016 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 20:11:57.515559 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.508018 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 20:11:57.515559 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.508021 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 20:11:57.515559 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.508023 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 20:11:57.515559 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.508027 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 20:11:57.515559 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.508029 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 20:11:57.515559 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.508032 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 20:11:57.515559 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.508034 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 20:11:57.515559 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.508037 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 20:11:57.515559 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.508039 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 20:11:57.515559 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.508042 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 20:11:57.516047 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.508044 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 20:11:57.516047 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.508046 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 20:11:57.516047 
ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.508049 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 20:11:57.516047 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.508051 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 20:11:57.516047 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.508054 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 20:11:57.516047 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.508056 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 20:11:57.516047 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.508059 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 20:11:57.516047 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.508061 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 20:11:57.516047 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.508064 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 20:11:57.516047 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.508066 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 20:11:57.516047 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.508068 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 20:11:57.516047 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.508071 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 20:11:57.516047 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.508073 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 20:11:57.516047 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.508075 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 20:11:57.516047 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.508078 2572 
feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 20:11:57.516047 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.508082 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 20:11:57.516047 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.508086 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 20:11:57.516047 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.508089 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 20:11:57.516047 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.508091 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 20:11:57.516047 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.508094 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 20:11:57.516565 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.508097 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 20:11:57.516565 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.508099 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 20:11:57.516565 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.508102 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 20:11:57.516565 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.509248 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 20:11:57.516565 
ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.515837 2572 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 20:11:57.516565 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.515855 2572 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 20:11:57.516565 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.515918 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 20:11:57.516565 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.515923 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 20:11:57.516565 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.515927 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 20:11:57.516565 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.515930 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 20:11:57.516565 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.515933 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 20:11:57.516565 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.515935 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 20:11:57.516565 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.515938 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 20:11:57.516565 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.515941 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 20:11:57.516565 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.515944 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 20:11:57.516982 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.515948 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 20:11:57.516982 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.515953 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 16 20:11:57.516982 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.515956 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 20:11:57.516982 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.515958 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 20:11:57.516982 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.515961 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 20:11:57.516982 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.515963 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 20:11:57.516982 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.515966 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 20:11:57.516982 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.515968 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 20:11:57.516982 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.515971 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 20:11:57.516982 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.515973 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 20:11:57.516982 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.515976 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 20:11:57.516982 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.515978 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 20:11:57.516982 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.515981 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 20:11:57.516982 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.515983 2572 feature_gate.go:328] unrecognized feature 
gate: AzureClusterHostedDNSInstall Apr 16 20:11:57.516982 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.515985 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 20:11:57.516982 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.515988 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 20:11:57.516982 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.515991 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 20:11:57.516982 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.515994 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 20:11:57.516982 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.515996 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 20:11:57.516982 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.515998 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 20:11:57.517495 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516001 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 20:11:57.517495 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516004 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 20:11:57.517495 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516007 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 20:11:57.517495 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516010 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 20:11:57.517495 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516013 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 20:11:57.517495 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516015 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 20:11:57.517495 ip-10-0-137-142 kubenswrapper[2572]: W0416 
20:11:57.516018 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 20:11:57.517495 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516021 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 20:11:57.517495 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516024 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 20:11:57.517495 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516026 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 20:11:57.517495 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516029 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 20:11:57.517495 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516032 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 20:11:57.517495 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516035 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 20:11:57.517495 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516037 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 20:11:57.517495 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516040 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 20:11:57.517495 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516043 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 20:11:57.517495 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516045 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 20:11:57.517495 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516048 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 20:11:57.517495 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516051 2572 
feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 20:11:57.518070 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516054 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 20:11:57.518070 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516056 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 20:11:57.518070 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516059 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 20:11:57.518070 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516062 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 20:11:57.518070 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516064 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 20:11:57.518070 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516067 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 20:11:57.518070 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516069 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 20:11:57.518070 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516072 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 20:11:57.518070 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516074 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 20:11:57.518070 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516076 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 20:11:57.518070 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516079 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 20:11:57.518070 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516082 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 20:11:57.518070 ip-10-0-137-142 
kubenswrapper[2572]: W0416 20:11:57.516085 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 20:11:57.518070 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516087 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 20:11:57.518070 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516090 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 20:11:57.518070 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516093 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 20:11:57.518070 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516096 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 20:11:57.518070 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516098 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 20:11:57.518070 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516101 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 20:11:57.518070 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516104 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 20:11:57.518597 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516106 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 20:11:57.518597 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516109 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 20:11:57.518597 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516111 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 20:11:57.518597 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516114 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 20:11:57.518597 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516118 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. 
It will be removed in a future release. Apr 16 20:11:57.518597 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516121 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 20:11:57.518597 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516125 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 20:11:57.518597 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516127 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 20:11:57.518597 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516132 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 20:11:57.518597 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516135 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 20:11:57.518597 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516138 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 20:11:57.518597 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516140 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 20:11:57.518597 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516142 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 20:11:57.518597 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516145 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 20:11:57.518597 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516147 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 20:11:57.518597 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516150 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 20:11:57.518597 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516152 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 20:11:57.518597 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516155 2572 
feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 20:11:57.519048 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.516160 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 20:11:57.519048 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516267 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 20:11:57.519048 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516271 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 20:11:57.519048 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516274 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 20:11:57.519048 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516277 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 20:11:57.519048 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516280 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 20:11:57.519048 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516283 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 20:11:57.519048 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516285 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 20:11:57.519048 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516288 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 20:11:57.519048 ip-10-0-137-142 kubenswrapper[2572]: 
W0416 20:11:57.516292 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 20:11:57.519048 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516294 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 20:11:57.519048 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516297 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 20:11:57.519048 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516299 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 20:11:57.519048 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516302 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 20:11:57.519048 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516304 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 20:11:57.519048 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516307 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 20:11:57.519463 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516309 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 20:11:57.519463 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516312 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 20:11:57.519463 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516314 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 20:11:57.519463 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516317 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 20:11:57.519463 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516319 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 20:11:57.519463 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516322 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 
20:11:57.519463 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516324 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 20:11:57.519463 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516327 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 20:11:57.519463 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516330 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 20:11:57.519463 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516332 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 20:11:57.519463 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516335 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 20:11:57.519463 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516337 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 20:11:57.519463 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516339 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 20:11:57.519463 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516342 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 20:11:57.519463 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516346 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 20:11:57.519463 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516349 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 20:11:57.519463 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516352 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 20:11:57.519463 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516355 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 20:11:57.519463 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516358 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 20:11:57.519929 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516361 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 20:11:57.519929 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516363 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 20:11:57.519929 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516366 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 20:11:57.519929 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516370 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 20:11:57.519929 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516396 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 20:11:57.519929 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516399 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 20:11:57.519929 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516402 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 20:11:57.519929 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516405 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 20:11:57.519929 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516408 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 20:11:57.519929 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516411 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 20:11:57.519929 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516413 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 20:11:57.519929 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516416 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 20:11:57.519929 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516418 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 20:11:57.519929 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516421 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 20:11:57.519929 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516424 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 20:11:57.519929 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516426 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 20:11:57.519929 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516429 2572 
feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 20:11:57.519929 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516431 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 20:11:57.519929 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516434 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 20:11:57.520417 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516436 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 20:11:57.520417 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516439 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 20:11:57.520417 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516442 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 20:11:57.520417 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516445 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 20:11:57.520417 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516447 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 20:11:57.520417 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516449 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 20:11:57.520417 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516452 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 20:11:57.520417 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516454 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 20:11:57.520417 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516457 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 20:11:57.520417 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516459 2572 feature_gate.go:328] unrecognized feature gate: 
AdditionalRoutingCapabilities Apr 16 20:11:57.520417 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516462 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 20:11:57.520417 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516464 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 20:11:57.520417 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516467 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 20:11:57.520417 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516469 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 20:11:57.520417 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516472 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 20:11:57.520417 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516474 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 20:11:57.520417 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516476 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 20:11:57.520417 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516479 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 20:11:57.520417 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516482 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 20:11:57.520417 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516485 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 20:11:57.520966 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516487 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 20:11:57.520966 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516490 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 20:11:57.520966 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516492 2572 
feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 20:11:57.520966 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516494 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 20:11:57.520966 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516497 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 20:11:57.520966 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516499 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 16 20:11:57.520966 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516502 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 20:11:57.520966 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516504 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 20:11:57.520966 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516506 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 20:11:57.520966 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516509 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 20:11:57.520966 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516511 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 20:11:57.520966 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516514 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 20:11:57.520966 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:57.516516 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 20:11:57.520966 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.516521 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true 
RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 20:11:57.520966 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.517276 2572 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 20:11:57.521337 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.520731 2572 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 20:11:57.521801 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.521790 2572 server.go:1019] "Starting client certificate rotation" Apr 16 20:11:57.521913 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.521885 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 20:11:57.521952 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.521937 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 20:11:57.549852 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.549820 2572 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 20:11:57.551638 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.551615 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 20:11:57.568333 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.568306 2572 log.go:25] "Validated CRI v1 runtime API" Apr 16 20:11:57.573969 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.573940 2572 log.go:25] "Validated CRI v1 image API" Apr 16 20:11:57.575140 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.575121 2572 server.go:1452] "Using cgroup driver setting received from the CRI runtime" 
cgroupDriver="systemd"
Apr 16 20:11:57.575484 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.575467 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 20:11:57.578214 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.578190 2572 fs.go:135] Filesystem UUIDs: map[4e747c3d-2a16-4b36-83bd-77340b9d661c:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 9afb39e4-1289-478a-8f88-45829c48bf18:/dev/nvme0n1p3]
Apr 16 20:11:57.578302 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.578212 2572 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 20:11:57.584602 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.584493 2572 manager.go:217] Machine: {Timestamp:2026-04-16 20:11:57.582510299 +0000 UTC m=+0.423410680 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101275 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec235fed264f800c19194f65f969dce8 SystemUUID:ec235fed-264f-800c-1919-4f65f969dce8 BootID:e548277b-1bb0-4084-bb45-a4dd397e1a75 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:ef:bb:7d:00:71 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:ef:bb:7d:00:71 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:fe:cb:f4:fb:78:5d Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 20:11:57.584602 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.584596 2572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 20:11:57.584735 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.584723 2572 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 20:11:57.585689 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.585662 2572 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 20:11:57.585837 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.585692 2572 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-137-142.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 20:11:57.585880 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.585847 2572 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 20:11:57.585880 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.585855 2572 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 20:11:57.585880 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.585874 2572 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 20:11:57.586802 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.586791 2572 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 20:11:57.588126 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.588114 2572 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 20:11:57.588242 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.588233 2572 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 20:11:57.590575 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.590565 2572 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 20:11:57.590613 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.590582 2572 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 20:11:57.590613 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.590596 2572 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 20:11:57.590613 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.590605 2572 kubelet.go:397] "Adding apiserver pod source"
Apr 16 20:11:57.590613 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.590614 2572 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 20:11:57.591769 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.591757 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 20:11:57.591825 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.591774 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 20:11:57.595145 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.595128 2572 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 20:11:57.596503 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.596483 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-58fhj"
Apr 16 20:11:57.596704 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.596689 2572 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 20:11:57.598968 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.598951 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 20:11:57.599051 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.598972 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 20:11:57.599051 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.598982 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 20:11:57.599051 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.598989 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 20:11:57.599051 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.598997 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 20:11:57.599051 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.599005 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 20:11:57.599051 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.599013 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 20:11:57.599051 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.599022 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 20:11:57.599051 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.599044 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 20:11:57.599051 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.599053 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 20:11:57.599315 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.599072 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 20:11:57.599315 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.599087 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 20:11:57.599872 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.599860 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 20:11:57.599931 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.599874 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 20:11:57.601267 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:11:57.601235 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-137-142.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 20:11:57.601267 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:11:57.601235 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 20:11:57.602948 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.602929 2572 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-137-142.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 20:11:57.603605 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.603591 2572 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 20:11:57.603680 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.603636 2572 server.go:1295] "Started kubelet"
Apr 16 20:11:57.603731 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.603678 2572 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 20:11:57.607487 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.603901 2572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 20:11:57.607710 ip-10-0-137-142 systemd[1]: Started Kubernetes Kubelet.
Apr 16 20:11:57.607897 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.607708 2572 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 20:11:57.608431 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.608408 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-58fhj"
Apr 16 20:11:57.608847 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.608809 2572 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 20:11:57.611568 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.611552 2572 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 20:11:57.616061 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.616025 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 20:11:57.616760 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.616745 2572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 20:11:57.617391 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.617356 2572 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 20:11:57.617510 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.617361 2572 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 20:11:57.617510 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.617512 2572 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 20:11:57.617655 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.617483 2572 factory.go:55] Registering systemd factory
Apr 16 20:11:57.617655 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.617582 2572 factory.go:223] Registration of the systemd container factory successfully
Apr 16 20:11:57.618075 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.617784 2572 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 20:11:57.618075 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.617830 2572 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 20:11:57.618290 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.618277 2572 factory.go:153] Registering CRI-O factory
Apr 16 20:11:57.618363 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.618354 2572 factory.go:223] Registration of the crio container factory successfully
Apr 16 20:11:57.618507 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.618494 2572 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 20:11:57.618596 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.618587 2572 factory.go:103] Registering Raw factory
Apr 16 20:11:57.618661 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.618654 2572 manager.go:1196] Started watching for new ooms in manager
Apr 16 20:11:57.619104 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.619082 2572 manager.go:319] Starting recovery of all containers
Apr 16 20:11:57.620510 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.619521 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 20:11:57.620510 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:11:57.619082 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-142.ec2.internal\" not found"
Apr 16 20:11:57.622816 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:11:57.622779 2572 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 20:11:57.623058 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:11:57.623040 2572 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-137-142.ec2.internal\" not found" node="ip-10-0-137-142.ec2.internal"
Apr 16 20:11:57.629911 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.629885 2572 manager.go:324] Recovery completed
Apr 16 20:11:57.631495 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:11:57.631469 2572 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 16 20:11:57.636108 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.636093 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 20:11:57.638347 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.638330 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-142.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 20:11:57.638441 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.638361 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-142.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 20:11:57.638441 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.638393 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-142.ec2.internal" event="NodeHasSufficientPID"
Apr 16 20:11:57.638845 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.638834 2572 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 20:11:57.638876 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.638845 2572 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 20:11:57.638876 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.638862 2572 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 20:11:57.641139 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.641128 2572 policy_none.go:49] "None policy: Start"
Apr 16 20:11:57.641181 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.641144 2572 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 20:11:57.641181 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.641156 2572 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 20:11:57.679067 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.679050 2572 manager.go:341] "Starting Device Plugin manager"
Apr 16 20:11:57.691570 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:11:57.679087 2572 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 20:11:57.691570 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.679097 2572 server.go:85] "Starting device plugin registration server"
Apr 16 20:11:57.691570 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.679406 2572 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 20:11:57.691570 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.679422 2572 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 20:11:57.691570 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.679527 2572 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 20:11:57.691570 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.679632 2572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 20:11:57.691570 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.679642 2572 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 20:11:57.691570 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:11:57.680130 2572 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 20:11:57.691570 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:11:57.680358 2572 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-142.ec2.internal\" not found"
Apr 16 20:11:57.756643 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.756556 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 20:11:57.757966 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.757944 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 20:11:57.758052 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.757975 2572 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 20:11:57.758052 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.758001 2572 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 20:11:57.758052 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.758014 2572 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 20:11:57.758161 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:11:57.758061 2572 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 20:11:57.762507 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.762483 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 20:11:57.779584 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.779562 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 20:11:57.780504 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.780487 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-142.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 20:11:57.780581 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.780518 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-142.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 20:11:57.780581 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.780527 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-142.ec2.internal" event="NodeHasSufficientPID"
Apr 16 20:11:57.780581 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.780551 2572 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-142.ec2.internal"
Apr 16 20:11:57.789763 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.789748 2572 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-142.ec2.internal"
Apr 16 20:11:57.789821 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:11:57.789771 2572 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-137-142.ec2.internal\": node \"ip-10-0-137-142.ec2.internal\" not found"
Apr 16 20:11:57.804189 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:11:57.804170 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-142.ec2.internal\" not found"
Apr 16 20:11:57.859058 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.859011 2572 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-142.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-137-142.ec2.internal"]
Apr 16 20:11:57.859138 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.859117 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 20:11:57.860121 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.860107 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-142.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 20:11:57.860195 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.860135 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-142.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 20:11:57.860195 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.860146 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-142.ec2.internal" event="NodeHasSufficientPID"
Apr 16 20:11:57.861433 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.861421 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 20:11:57.861514 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.861500 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-142.ec2.internal"
Apr 16 20:11:57.861556 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.861534 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 20:11:57.862311 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.862290 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-142.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 20:11:57.862426 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.862321 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-142.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 20:11:57.862426 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.862331 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-142.ec2.internal" event="NodeHasSufficientPID"
Apr 16 20:11:57.862426 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.862293 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-142.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 20:11:57.862426 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.862410 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-142.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 20:11:57.862426 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.862424 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-142.ec2.internal" event="NodeHasSufficientPID"
Apr 16 20:11:57.863755 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.863741 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-142.ec2.internal"
Apr 16 20:11:57.863831 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.863767 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 20:11:57.864796 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.864783 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-142.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 20:11:57.864861 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.864809 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-142.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 20:11:57.864861 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.864821 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-142.ec2.internal" event="NodeHasSufficientPID"
Apr 16 20:11:57.890040 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:11:57.890019 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-142.ec2.internal\" not found" node="ip-10-0-137-142.ec2.internal"
Apr 16 20:11:57.894557 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:11:57.894537 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-142.ec2.internal\" not found" node="ip-10-0-137-142.ec2.internal"
Apr 16 20:11:57.904277 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:11:57.904250 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-142.ec2.internal\" not found"
Apr 16 20:11:57.919299 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.919270 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/80a7b203f53c53f317b0eec338157b8d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-142.ec2.internal\" (UID: \"80a7b203f53c53f317b0eec338157b8d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-142.ec2.internal"
Apr 16 20:11:57.919501 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.919303 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/80a7b203f53c53f317b0eec338157b8d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-142.ec2.internal\" (UID: \"80a7b203f53c53f317b0eec338157b8d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-142.ec2.internal"
Apr 16 20:11:57.919501 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:57.919328 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/df511bf53f5da06c359fd97d0761730a-config\") pod \"kube-apiserver-proxy-ip-10-0-137-142.ec2.internal\" (UID: \"df511bf53f5da06c359fd97d0761730a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-142.ec2.internal"
Apr 16 20:11:58.004604 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:11:58.004572 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-142.ec2.internal\" not found"
Apr 16 20:11:58.019863 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:58.019784 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/80a7b203f53c53f317b0eec338157b8d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-142.ec2.internal\" (UID: \"80a7b203f53c53f317b0eec338157b8d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-142.ec2.internal"
Apr 16 20:11:58.019863 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:58.019815 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/80a7b203f53c53f317b0eec338157b8d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-142.ec2.internal\" (UID: \"80a7b203f53c53f317b0eec338157b8d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-142.ec2.internal"
Apr 16 20:11:58.019863 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:58.019833 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/df511bf53f5da06c359fd97d0761730a-config\") pod \"kube-apiserver-proxy-ip-10-0-137-142.ec2.internal\" (UID: \"df511bf53f5da06c359fd97d0761730a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-142.ec2.internal"
Apr 16 20:11:58.020069 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:58.019896 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/80a7b203f53c53f317b0eec338157b8d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-142.ec2.internal\" (UID: \"80a7b203f53c53f317b0eec338157b8d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-142.ec2.internal"
Apr 16 20:11:58.020069 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:58.019905 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/df511bf53f5da06c359fd97d0761730a-config\") pod \"kube-apiserver-proxy-ip-10-0-137-142.ec2.internal\" (UID: \"df511bf53f5da06c359fd97d0761730a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-142.ec2.internal"
Apr 16 20:11:58.020069 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:58.019908 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/80a7b203f53c53f317b0eec338157b8d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-142.ec2.internal\" (UID: \"80a7b203f53c53f317b0eec338157b8d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-142.ec2.internal"
Apr 16 20:11:58.105205 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:11:58.105160 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-142.ec2.internal\" not found"
Apr 16 20:11:58.191726 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:58.191700 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-142.ec2.internal"
Apr 16 20:11:58.197266 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:58.197249 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-142.ec2.internal"
Apr 16 20:11:58.205833 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:11:58.205816 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-142.ec2.internal\" not found"
Apr 16 20:11:58.306432 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:11:58.306316 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-142.ec2.internal\" not found"
Apr 16 20:11:58.406917 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:11:58.406888 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-142.ec2.internal\" not found"
Apr 16 20:11:58.507434 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:11:58.507392 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-142.ec2.internal\" not found"
Apr 16 20:11:58.521708 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:58.521679 2572 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 20:11:58.521846 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:58.521822 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 20:11:58.521898 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:58.521852 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 20:11:58.564940 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:58.564871 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 20:11:58.608178 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:11:58.608145 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-142.ec2.internal\" not found"
Apr 16 20:11:58.610837 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:58.610803 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 20:06:57 +0000 UTC" deadline="2028-01-30 18:08:27.8052454 +0000 UTC"
Apr 16 20:11:58.610837 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:58.610829 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15693h56m29.194419461s"
Apr 16 20:11:58.616835 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:58.616817 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 20:11:58.629138 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:58.629114 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 20:11:58.631213 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:58.631193 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 20:11:58.654685 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:58.654663 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-b4rqw"
Apr 16 20:11:58.662962 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:58.662942 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-b4rqw"
Apr 16 20:11:58.694193 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:58.694167 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf511bf53f5da06c359fd97d0761730a.slice/crio-a83c23eafd03a12c4dd2c020f240a4261ba61902ca747897a81c5fd5fa42c206 WatchSource:0}: Error finding container a83c23eafd03a12c4dd2c020f240a4261ba61902ca747897a81c5fd5fa42c206: Status 404 returned error can't find the container with id a83c23eafd03a12c4dd2c020f240a4261ba61902ca747897a81c5fd5fa42c206
Apr 16 20:11:58.694421 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:11:58.694399 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80a7b203f53c53f317b0eec338157b8d.slice/crio-afc1549f618b764e1e8857c6e08942eb56b5c5fcdcb1033ec54e05c1f55602ce WatchSource:0}: Error finding container afc1549f618b764e1e8857c6e08942eb56b5c5fcdcb1033ec54e05c1f55602ce: Status 404 returned error can't find the container with id afc1549f618b764e1e8857c6e08942eb56b5c5fcdcb1033ec54e05c1f55602ce
Apr 16 20:11:58.700966 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:58.700945 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 20:11:58.717748 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:58.717729 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-142.ec2.internal"
Apr 16 20:11:58.730337 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:58.730320 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 20:11:58.731143 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:58.731132 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-142.ec2.internal"
Apr 16 20:11:58.737828 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:58.737812 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 20:11:58.761796 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:58.761754 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-142.ec2.internal" event={"ID":"80a7b203f53c53f317b0eec338157b8d","Type":"ContainerStarted","Data":"afc1549f618b764e1e8857c6e08942eb56b5c5fcdcb1033ec54e05c1f55602ce"}
Apr 16 20:11:58.762713 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:58.762693 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-142.ec2.internal" event={"ID":"df511bf53f5da06c359fd97d0761730a","Type":"ContainerStarted","Data":"a83c23eafd03a12c4dd2c020f240a4261ba61902ca747897a81c5fd5fa42c206"}
Apr 16 20:11:59.591700 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.591665 2572 apiserver.go:52] "Watching apiserver"
Apr 16 20:11:59.597908 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.597886 2572 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 20:11:59.600045 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.600017 2572
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-142.ec2.internal","openshift-multus/multus-additional-cni-plugins-zq5ft","openshift-multus/network-metrics-daemon-622d4","openshift-network-diagnostics/network-check-target-2xqx8","openshift-ovn-kubernetes/ovnkube-node-mgjgp","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mghnf","openshift-cluster-node-tuning-operator/tuned-9r8n4","openshift-dns/node-resolver-fjbmm","openshift-image-registry/node-ca-r8gdb","openshift-multus/multus-hqvbt","openshift-network-operator/iptables-alerter-4g8jb","kube-system/konnectivity-agent-skfgt","kube-system/kube-apiserver-proxy-ip-10-0-137-142.ec2.internal"] Apr 16 20:11:59.601519 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.601500 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9r8n4" Apr 16 20:11:59.603816 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.603794 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zq5ft" Apr 16 20:11:59.603935 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.603925 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 20:11:59.604165 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.604152 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 20:11:59.604596 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.604581 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-zmnkz\"" Apr 16 20:11:59.605119 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.605096 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-622d4" Apr 16 20:11:59.605216 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.605175 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2xqx8" Apr 16 20:11:59.605216 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:11:59.605189 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-622d4" podUID="d285ba82-dded-4707-87cb-35b755280286" Apr 16 20:11:59.605322 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:11:59.605217 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2xqx8" podUID="5ea99809-04f6-4ff1-adef-7bf9eb98c772" Apr 16 20:11:59.606169 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.606147 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-zj42d\"" Apr 16 20:11:59.606254 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.606206 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 20:11:59.606315 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.606206 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 20:11:59.606754 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.606736 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp" Apr 16 20:11:59.606866 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.606848 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 20:11:59.606946 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.606928 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 20:11:59.607002 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.606854 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 20:11:59.607842 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.607824 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mghnf" Apr 16 20:11:59.609154 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.609134 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-fjbmm" Apr 16 20:11:59.609246 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.609208 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 20:11:59.609246 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.609228 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 20:11:59.610087 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.610066 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 20:11:59.610804 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.610370 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 20:11:59.611048 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.611021 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 20:11:59.611276 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.611256 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 20:11:59.611500 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.611479 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 20:11:59.611733 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.611714 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 20:11:59.612036 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.612014 2572 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-qhfrq\"" Apr 16 20:11:59.612286 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.612269 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 20:11:59.614706 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.613130 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-rn2s6\"" Apr 16 20:11:59.614706 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.613958 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-r8gdb" Apr 16 20:11:59.614910 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.614884 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 20:11:59.615041 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.615024 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 20:11:59.615672 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.615655 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-v9s6x\"" Apr 16 20:11:59.615880 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.615866 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 20:11:59.615997 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.615972 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 20:11:59.615997 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.615991 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 20:11:59.616121 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.616088 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-hqvbt" Apr 16 20:11:59.616272 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.616246 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4g8jb" Apr 16 20:11:59.616462 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.616443 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-mlz62\"" Apr 16 20:11:59.617514 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.617494 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-skfgt" Apr 16 20:11:59.618168 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.618151 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 20:11:59.618168 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.618167 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 20:11:59.618305 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.618185 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-c8fnz\"" Apr 16 20:11:59.618361 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.618346 2572 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 20:11:59.618361 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.618351 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-4j5tt\"" Apr 16 20:11:59.618524 ip-10-0-137-142 
kubenswrapper[2572]: I0416 20:11:59.618511 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 20:11:59.618648 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.618633 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 20:11:59.619458 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.619438 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 20:11:59.620025 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.620006 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-99qj6\"" Apr 16 20:11:59.620474 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.620458 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 20:11:59.629355 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.629330 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-etc-kubernetes\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt" Apr 16 20:11:59.629444 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.629368 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2ecca5d6-192a-4507-b71b-c8d9e9099230-host-slash\") pod \"iptables-alerter-4g8jb\" (UID: \"2ecca5d6-192a-4507-b71b-c8d9e9099230\") " pod="openshift-network-operator/iptables-alerter-4g8jb" Apr 16 20:11:59.629499 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.629460 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66-etc-sysctl-conf\") pod \"tuned-9r8n4\" (UID: \"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66\") " pod="openshift-cluster-node-tuning-operator/tuned-9r8n4" Apr 16 20:11:59.629545 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.629497 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h6ss\" (UniqueName: \"kubernetes.io/projected/d285ba82-dded-4707-87cb-35b755280286-kube-api-access-8h6ss\") pod \"network-metrics-daemon-622d4\" (UID: \"d285ba82-dded-4707-87cb-35b755280286\") " pod="openshift-multus/network-metrics-daemon-622d4" Apr 16 20:11:59.629545 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.629525 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-systemd-units\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp" Apr 16 20:11:59.629619 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.629551 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-cni-binary-copy\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt" Apr 16 20:11:59.629619 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.629580 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66-etc-modprobe-d\") pod \"tuned-9r8n4\" (UID: \"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66\") " 
pod="openshift-cluster-node-tuning-operator/tuned-9r8n4" Apr 16 20:11:59.629619 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.629604 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66-etc-kubernetes\") pod \"tuned-9r8n4\" (UID: \"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66\") " pod="openshift-cluster-node-tuning-operator/tuned-9r8n4" Apr 16 20:11:59.629756 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.629627 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b05f1f8c-380a-4929-8648-4854289d7f72-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mghnf\" (UID: \"b05f1f8c-380a-4929-8648-4854289d7f72\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mghnf" Apr 16 20:11:59.629756 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.629671 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b05f1f8c-380a-4929-8648-4854289d7f72-device-dir\") pod \"aws-ebs-csi-driver-node-mghnf\" (UID: \"b05f1f8c-380a-4929-8648-4854289d7f72\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mghnf" Apr 16 20:11:59.629756 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.629708 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kks8g\" (UniqueName: \"kubernetes.io/projected/b05f1f8c-380a-4929-8648-4854289d7f72-kube-api-access-kks8g\") pod \"aws-ebs-csi-driver-node-mghnf\" (UID: \"b05f1f8c-380a-4929-8648-4854289d7f72\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mghnf" Apr 16 20:11:59.629756 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.629731 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-j2n97\" (UniqueName: \"kubernetes.io/projected/8ee16202-241d-45ac-9219-3363704a708e-kube-api-access-j2n97\") pod \"node-resolver-fjbmm\" (UID: \"8ee16202-241d-45ac-9219-3363704a708e\") " pod="openshift-dns/node-resolver-fjbmm" Apr 16 20:11:59.629922 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.629760 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqkfl\" (UniqueName: \"kubernetes.io/projected/bfde86ea-03d1-4cf4-90b7-76b04a98def5-kube-api-access-zqkfl\") pod \"node-ca-r8gdb\" (UID: \"bfde86ea-03d1-4cf4-90b7-76b04a98def5\") " pod="openshift-image-registry/node-ca-r8gdb" Apr 16 20:11:59.629922 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.629796 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6b1cd9c4-abb0-4659-b9b8-0b263412063c-ovnkube-script-lib\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp" Apr 16 20:11:59.629922 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.629823 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-multus-conf-dir\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt" Apr 16 20:11:59.629922 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.629909 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-multus-daemon-config\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt" Apr 16 20:11:59.630052 ip-10-0-137-142 kubenswrapper[2572]: I0416 
20:11:59.629934 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66-host\") pod \"tuned-9r8n4\" (UID: \"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66\") " pod="openshift-cluster-node-tuning-operator/tuned-9r8n4" Apr 16 20:11:59.630052 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.629960 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/10f2b7ab-1884-4d1c-8207-ad7844c2b18f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zq5ft\" (UID: \"10f2b7ab-1884-4d1c-8207-ad7844c2b18f\") " pod="openshift-multus/multus-additional-cni-plugins-zq5ft" Apr 16 20:11:59.630052 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.629982 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-run-ovn\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp" Apr 16 20:11:59.630052 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.629996 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6b1cd9c4-abb0-4659-b9b8-0b263412063c-ovnkube-config\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp" Apr 16 20:11:59.630052 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630020 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b05f1f8c-380a-4929-8648-4854289d7f72-socket-dir\") pod \"aws-ebs-csi-driver-node-mghnf\" (UID: \"b05f1f8c-380a-4929-8648-4854289d7f72\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mghnf" Apr 16 20:11:59.630052 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630041 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b05f1f8c-380a-4929-8648-4854289d7f72-sys-fs\") pod \"aws-ebs-csi-driver-node-mghnf\" (UID: \"b05f1f8c-380a-4929-8648-4854289d7f72\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mghnf" Apr 16 20:11:59.630244 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630071 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8ee16202-241d-45ac-9219-3363704a708e-hosts-file\") pod \"node-resolver-fjbmm\" (UID: \"8ee16202-241d-45ac-9219-3363704a708e\") " pod="openshift-dns/node-resolver-fjbmm" Apr 16 20:11:59.630244 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630094 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-cnibin\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt" Apr 16 20:11:59.630244 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630112 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-host-run-multus-certs\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt" Apr 16 20:11:59.630244 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630126 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66-etc-sysctl-d\") 
pod \"tuned-9r8n4\" (UID: \"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66\") " pod="openshift-cluster-node-tuning-operator/tuned-9r8n4" Apr 16 20:11:59.630244 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630139 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66-tmp\") pod \"tuned-9r8n4\" (UID: \"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66\") " pod="openshift-cluster-node-tuning-operator/tuned-9r8n4" Apr 16 20:11:59.630244 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630153 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grs8d\" (UniqueName: \"kubernetes.io/projected/10f2b7ab-1884-4d1c-8207-ad7844c2b18f-kube-api-access-grs8d\") pod \"multus-additional-cni-plugins-zq5ft\" (UID: \"10f2b7ab-1884-4d1c-8207-ad7844c2b18f\") " pod="openshift-multus/multus-additional-cni-plugins-zq5ft" Apr 16 20:11:59.630244 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630173 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-etc-openvswitch\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp" Apr 16 20:11:59.630244 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630190 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-host-kubelet\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp" Apr 16 20:11:59.630244 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630211 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-host-cni-netd\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp"
Apr 16 20:11:59.630244 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630238 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhkv4\" (UniqueName: \"kubernetes.io/projected/2ecca5d6-192a-4507-b71b-c8d9e9099230-kube-api-access-jhkv4\") pod \"iptables-alerter-4g8jb\" (UID: \"2ecca5d6-192a-4507-b71b-c8d9e9099230\") " pod="openshift-network-operator/iptables-alerter-4g8jb"
Apr 16 20:11:59.630645 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630263 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b05f1f8c-380a-4929-8648-4854289d7f72-etc-selinux\") pod \"aws-ebs-csi-driver-node-mghnf\" (UID: \"b05f1f8c-380a-4929-8648-4854289d7f72\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mghnf"
Apr 16 20:11:59.630645 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630284 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-run-openvswitch\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp"
Apr 16 20:11:59.630645 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630304 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-node-log\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp"
Apr 16 20:11:59.630645 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630328 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-log-socket\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp"
Apr 16 20:11:59.630645 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630347 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-multus-cni-dir\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt"
Apr 16 20:11:59.630645 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630367 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-host-var-lib-cni-bin\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt"
Apr 16 20:11:59.630645 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630416 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2ecca5d6-192a-4507-b71b-c8d9e9099230-iptables-alerter-script\") pod \"iptables-alerter-4g8jb\" (UID: \"2ecca5d6-192a-4507-b71b-c8d9e9099230\") " pod="openshift-network-operator/iptables-alerter-4g8jb"
Apr 16 20:11:59.630645 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630438 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b05f1f8c-380a-4929-8648-4854289d7f72-registration-dir\") pod \"aws-ebs-csi-driver-node-mghnf\" (UID: \"b05f1f8c-380a-4929-8648-4854289d7f72\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mghnf"
Apr 16 20:11:59.630645 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630462 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-host-run-netns\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt"
Apr 16 20:11:59.630645 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630482 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m8ws\" (UniqueName: \"kubernetes.io/projected/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-kube-api-access-6m8ws\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt"
Apr 16 20:11:59.630645 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630497 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66-sys\") pod \"tuned-9r8n4\" (UID: \"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66\") " pod="openshift-cluster-node-tuning-operator/tuned-9r8n4"
Apr 16 20:11:59.630645 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630511 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66-var-lib-kubelet\") pod \"tuned-9r8n4\" (UID: \"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66\") " pod="openshift-cluster-node-tuning-operator/tuned-9r8n4"
Apr 16 20:11:59.630645 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630530 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4zqd\" (UniqueName: \"kubernetes.io/projected/f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66-kube-api-access-r4zqd\") pod \"tuned-9r8n4\" (UID: \"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66\") " pod="openshift-cluster-node-tuning-operator/tuned-9r8n4"
Apr 16 20:11:59.630645 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630549 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9clf\" (UniqueName: \"kubernetes.io/projected/5ea99809-04f6-4ff1-adef-7bf9eb98c772-kube-api-access-h9clf\") pod \"network-check-target-2xqx8\" (UID: \"5ea99809-04f6-4ff1-adef-7bf9eb98c772\") " pod="openshift-network-diagnostics/network-check-target-2xqx8"
Apr 16 20:11:59.630645 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630570 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-host-cni-bin\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp"
Apr 16 20:11:59.630645 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630594 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-host-var-lib-cni-multus\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt"
Apr 16 20:11:59.631251 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630622 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-host-var-lib-kubelet\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt"
Apr 16 20:11:59.631251 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630644 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-hostroot\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt"
Apr 16 20:11:59.631251 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630659 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66-etc-sysconfig\") pod \"tuned-9r8n4\" (UID: \"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66\") " pod="openshift-cluster-node-tuning-operator/tuned-9r8n4"
Apr 16 20:11:59.631251 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630674 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/10f2b7ab-1884-4d1c-8207-ad7844c2b18f-system-cni-dir\") pod \"multus-additional-cni-plugins-zq5ft\" (UID: \"10f2b7ab-1884-4d1c-8207-ad7844c2b18f\") " pod="openshift-multus/multus-additional-cni-plugins-zq5ft"
Apr 16 20:11:59.631251 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630688 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bfde86ea-03d1-4cf4-90b7-76b04a98def5-host\") pod \"node-ca-r8gdb\" (UID: \"bfde86ea-03d1-4cf4-90b7-76b04a98def5\") " pod="openshift-image-registry/node-ca-r8gdb"
Apr 16 20:11:59.631251 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630707 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-run-systemd\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp"
Apr 16 20:11:59.631251 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630724 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-var-lib-openvswitch\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp"
Apr 16 20:11:59.631251 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630738 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6b1cd9c4-abb0-4659-b9b8-0b263412063c-ovn-node-metrics-cert\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp"
Apr 16 20:11:59.631251 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630753 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52hv2\" (UniqueName: \"kubernetes.io/projected/6b1cd9c4-abb0-4659-b9b8-0b263412063c-kube-api-access-52hv2\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp"
Apr 16 20:11:59.631251 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630765 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66-run\") pod \"tuned-9r8n4\" (UID: \"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66\") " pod="openshift-cluster-node-tuning-operator/tuned-9r8n4"
Apr 16 20:11:59.631251 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630784 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66-lib-modules\") pod \"tuned-9r8n4\" (UID: \"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66\") " pod="openshift-cluster-node-tuning-operator/tuned-9r8n4"
Apr 16 20:11:59.631251 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630809 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8ee16202-241d-45ac-9219-3363704a708e-tmp-dir\") pod \"node-resolver-fjbmm\" (UID: \"8ee16202-241d-45ac-9219-3363704a708e\") " pod="openshift-dns/node-resolver-fjbmm"
Apr 16 20:11:59.631251 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630825 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-host-slash\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp"
Apr 16 20:11:59.631251 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630839 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-host-run-ovn-kubernetes\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp"
Apr 16 20:11:59.631251 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630856 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6b1cd9c4-abb0-4659-b9b8-0b263412063c-env-overrides\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp"
Apr 16 20:11:59.631251 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630877 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f067f531-c6e2-4ac4-a01e-8aa1872f6296-agent-certs\") pod \"konnectivity-agent-skfgt\" (UID: \"f067f531-c6e2-4ac4-a01e-8aa1872f6296\") " pod="kube-system/konnectivity-agent-skfgt"
Apr 16 20:11:59.631251 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630899 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bfde86ea-03d1-4cf4-90b7-76b04a98def5-serviceca\") pod \"node-ca-r8gdb\" (UID: \"bfde86ea-03d1-4cf4-90b7-76b04a98def5\") " pod="openshift-image-registry/node-ca-r8gdb"
Apr 16 20:11:59.631930 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630914 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp"
Apr 16 20:11:59.631930 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630930 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-host-run-k8s-cni-cncf-io\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt"
Apr 16 20:11:59.631930 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630945 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/10f2b7ab-1884-4d1c-8207-ad7844c2b18f-cnibin\") pod \"multus-additional-cni-plugins-zq5ft\" (UID: \"10f2b7ab-1884-4d1c-8207-ad7844c2b18f\") " pod="openshift-multus/multus-additional-cni-plugins-zq5ft"
Apr 16 20:11:59.631930 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.630966 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/10f2b7ab-1884-4d1c-8207-ad7844c2b18f-cni-binary-copy\") pod \"multus-additional-cni-plugins-zq5ft\" (UID: \"10f2b7ab-1884-4d1c-8207-ad7844c2b18f\") " pod="openshift-multus/multus-additional-cni-plugins-zq5ft"
Apr 16 20:11:59.631930 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.631002 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-host-run-netns\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp"
Apr 16 20:11:59.631930 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.631029 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f067f531-c6e2-4ac4-a01e-8aa1872f6296-konnectivity-ca\") pod \"konnectivity-agent-skfgt\" (UID: \"f067f531-c6e2-4ac4-a01e-8aa1872f6296\") " pod="kube-system/konnectivity-agent-skfgt"
Apr 16 20:11:59.631930 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.631051 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66-etc-systemd\") pod \"tuned-9r8n4\" (UID: \"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66\") " pod="openshift-cluster-node-tuning-operator/tuned-9r8n4"
Apr 16 20:11:59.631930 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.631066 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66-etc-tuned\") pod \"tuned-9r8n4\" (UID: \"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66\") " pod="openshift-cluster-node-tuning-operator/tuned-9r8n4"
Apr 16 20:11:59.631930 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.631082 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/10f2b7ab-1884-4d1c-8207-ad7844c2b18f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zq5ft\" (UID: \"10f2b7ab-1884-4d1c-8207-ad7844c2b18f\") " pod="openshift-multus/multus-additional-cni-plugins-zq5ft"
Apr 16 20:11:59.631930 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.631098 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d285ba82-dded-4707-87cb-35b755280286-metrics-certs\") pod \"network-metrics-daemon-622d4\" (UID: \"d285ba82-dded-4707-87cb-35b755280286\") " pod="openshift-multus/network-metrics-daemon-622d4"
Apr 16 20:11:59.631930 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.631114 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-system-cni-dir\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt"
Apr 16 20:11:59.631930 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.631135 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/10f2b7ab-1884-4d1c-8207-ad7844c2b18f-os-release\") pod \"multus-additional-cni-plugins-zq5ft\" (UID: \"10f2b7ab-1884-4d1c-8207-ad7844c2b18f\") " pod="openshift-multus/multus-additional-cni-plugins-zq5ft"
Apr 16 20:11:59.631930 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.631151 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/10f2b7ab-1884-4d1c-8207-ad7844c2b18f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zq5ft\" (UID: \"10f2b7ab-1884-4d1c-8207-ad7844c2b18f\") " pod="openshift-multus/multus-additional-cni-plugins-zq5ft"
Apr 16 20:11:59.631930 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.631165 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-os-release\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt"
Apr 16 20:11:59.631930 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.631187 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-multus-socket-dir-parent\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt"
Apr 16 20:11:59.663942 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.663908 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 20:06:58 +0000 UTC" deadline="2027-09-22 05:16:03.6708961 +0000 UTC"
Apr 16 20:11:59.663942 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.663941 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12561h4m4.006958466s"
Apr 16 20:11:59.723178 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.723146 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 20:11:59.731978 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.731953 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b05f1f8c-380a-4929-8648-4854289d7f72-etc-selinux\") pod \"aws-ebs-csi-driver-node-mghnf\" (UID: \"b05f1f8c-380a-4929-8648-4854289d7f72\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mghnf"
Apr 16 20:11:59.731978 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.731982 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-run-openvswitch\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp"
Apr 16 20:11:59.732174 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732003 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-node-log\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp"
Apr 16 20:11:59.732174 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732026 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-log-socket\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp"
Apr 16 20:11:59.732174 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732046 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-multus-cni-dir\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt"
Apr 16 20:11:59.732174 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732066 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-host-var-lib-cni-bin\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt"
Apr 16 20:11:59.732174 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732086 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2ecca5d6-192a-4507-b71b-c8d9e9099230-iptables-alerter-script\") pod \"iptables-alerter-4g8jb\" (UID: \"2ecca5d6-192a-4507-b71b-c8d9e9099230\") " pod="openshift-network-operator/iptables-alerter-4g8jb"
Apr 16 20:11:59.732174 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732110 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b05f1f8c-380a-4929-8648-4854289d7f72-registration-dir\") pod \"aws-ebs-csi-driver-node-mghnf\" (UID: \"b05f1f8c-380a-4929-8648-4854289d7f72\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mghnf"
Apr 16 20:11:59.732174 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732126 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b05f1f8c-380a-4929-8648-4854289d7f72-etc-selinux\") pod \"aws-ebs-csi-driver-node-mghnf\" (UID: \"b05f1f8c-380a-4929-8648-4854289d7f72\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mghnf"
Apr 16 20:11:59.732174 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732135 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-host-run-netns\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt"
Apr 16 20:11:59.732174 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732159 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6m8ws\" (UniqueName: \"kubernetes.io/projected/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-kube-api-access-6m8ws\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt"
Apr 16 20:11:59.732622 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732176 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-log-socket\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp"
Apr 16 20:11:59.732622 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732181 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-run-openvswitch\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp"
Apr 16 20:11:59.732622 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732185 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66-sys\") pod \"tuned-9r8n4\" (UID: \"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66\") " pod="openshift-cluster-node-tuning-operator/tuned-9r8n4"
Apr 16 20:11:59.732622 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732227 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66-sys\") pod \"tuned-9r8n4\" (UID: \"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66\") " pod="openshift-cluster-node-tuning-operator/tuned-9r8n4"
Apr 16 20:11:59.732622 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732241 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66-var-lib-kubelet\") pod \"tuned-9r8n4\" (UID: \"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66\") " pod="openshift-cluster-node-tuning-operator/tuned-9r8n4"
Apr 16 20:11:59.732622 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732272 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r4zqd\" (UniqueName: \"kubernetes.io/projected/f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66-kube-api-access-r4zqd\") pod \"tuned-9r8n4\" (UID: \"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66\") " pod="openshift-cluster-node-tuning-operator/tuned-9r8n4"
Apr 16 20:11:59.732622 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732278 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b05f1f8c-380a-4929-8648-4854289d7f72-registration-dir\") pod \"aws-ebs-csi-driver-node-mghnf\" (UID: \"b05f1f8c-380a-4929-8648-4854289d7f72\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mghnf"
Apr 16 20:11:59.732622 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732298 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h9clf\" (UniqueName: \"kubernetes.io/projected/5ea99809-04f6-4ff1-adef-7bf9eb98c772-kube-api-access-h9clf\") pod \"network-check-target-2xqx8\" (UID: \"5ea99809-04f6-4ff1-adef-7bf9eb98c772\") " pod="openshift-network-diagnostics/network-check-target-2xqx8"
Apr 16 20:11:59.732622 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732312 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-host-run-netns\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt"
Apr 16 20:11:59.732622 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732323 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-host-cni-bin\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp"
Apr 16 20:11:59.732622 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732347 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-host-var-lib-cni-multus\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt"
Apr 16 20:11:59.732622 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732395 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-host-var-lib-kubelet\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt"
Apr 16 20:11:59.732622 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732441 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-host-var-lib-cni-multus\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt"
Apr 16 20:11:59.732622 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732458 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-host-cni-bin\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp"
Apr 16 20:11:59.732622 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732507 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-host-var-lib-cni-bin\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt"
Apr 16 20:11:59.732622 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732540 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-hostroot\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt"
Apr 16 20:11:59.732622 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732562 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66-etc-sysconfig\") pod \"tuned-9r8n4\" (UID: \"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66\") " pod="openshift-cluster-node-tuning-operator/tuned-9r8n4"
Apr 16 20:11:59.732622 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732583 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66-var-lib-kubelet\") pod \"tuned-9r8n4\" (UID: \"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66\") " pod="openshift-cluster-node-tuning-operator/tuned-9r8n4"
Apr 16 20:11:59.733352 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732586 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/10f2b7ab-1884-4d1c-8207-ad7844c2b18f-system-cni-dir\") pod \"multus-additional-cni-plugins-zq5ft\" (UID: \"10f2b7ab-1884-4d1c-8207-ad7844c2b18f\") " pod="openshift-multus/multus-additional-cni-plugins-zq5ft"
Apr 16 20:11:59.733352 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732596 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-hostroot\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt"
Apr 16 20:11:59.733352 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732464 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-node-log\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp"
Apr 16 20:11:59.733352 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732630 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66-etc-sysconfig\") pod \"tuned-9r8n4\" (UID: \"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66\") " pod="openshift-cluster-node-tuning-operator/tuned-9r8n4"
Apr 16 20:11:59.733352 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732653 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/10f2b7ab-1884-4d1c-8207-ad7844c2b18f-system-cni-dir\") pod \"multus-additional-cni-plugins-zq5ft\" (UID: \"10f2b7ab-1884-4d1c-8207-ad7844c2b18f\") " pod="openshift-multus/multus-additional-cni-plugins-zq5ft"
Apr 16 20:11:59.733352 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732661 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-host-var-lib-kubelet\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt"
Apr 16 20:11:59.733352 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732687 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bfde86ea-03d1-4cf4-90b7-76b04a98def5-host\") pod \"node-ca-r8gdb\" (UID: \"bfde86ea-03d1-4cf4-90b7-76b04a98def5\") " pod="openshift-image-registry/node-ca-r8gdb"
Apr 16 20:11:59.733352 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732697 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-multus-cni-dir\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt"
Apr 16 20:11:59.733352 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732713 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-run-systemd\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp"
Apr 16 20:11:59.733352 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732739 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-var-lib-openvswitch\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp"
Apr 16 20:11:59.733352 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732756 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bfde86ea-03d1-4cf4-90b7-76b04a98def5-host\") pod \"node-ca-r8gdb\" (UID: \"bfde86ea-03d1-4cf4-90b7-76b04a98def5\") " pod="openshift-image-registry/node-ca-r8gdb"
Apr 16 20:11:59.733352 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732763 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2ecca5d6-192a-4507-b71b-c8d9e9099230-iptables-alerter-script\") pod \"iptables-alerter-4g8jb\" (UID: \"2ecca5d6-192a-4507-b71b-c8d9e9099230\") " pod="openshift-network-operator/iptables-alerter-4g8jb"
Apr 16 20:11:59.733352 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732765 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6b1cd9c4-abb0-4659-b9b8-0b263412063c-ovn-node-metrics-cert\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp"
Apr 16 20:11:59.733352 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732786 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-var-lib-openvswitch\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp"
Apr 16 20:11:59.733352 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732787 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-run-systemd\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp"
Apr 16 20:11:59.733352 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732795 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-52hv2\" (UniqueName: \"kubernetes.io/projected/6b1cd9c4-abb0-4659-b9b8-0b263412063c-kube-api-access-52hv2\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp"
Apr 16 20:11:59.733352 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732820 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66-run\") pod \"tuned-9r8n4\" (UID: \"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66\") " pod="openshift-cluster-node-tuning-operator/tuned-9r8n4"
Apr 16 20:11:59.733352 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732843 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66-lib-modules\") pod \"tuned-9r8n4\" (UID: \"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66\") " pod="openshift-cluster-node-tuning-operator/tuned-9r8n4"
Apr 16 20:11:59.734232 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732868 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8ee16202-241d-45ac-9219-3363704a708e-tmp-dir\") pod \"node-resolver-fjbmm\" (UID: \"8ee16202-241d-45ac-9219-3363704a708e\") " pod="openshift-dns/node-resolver-fjbmm"
Apr 16 20:11:59.734232 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732882 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66-run\") pod \"tuned-9r8n4\" (UID: \"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66\") " pod="openshift-cluster-node-tuning-operator/tuned-9r8n4"
Apr 16 20:11:59.734232 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732891 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-host-slash\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp"
Apr 16 20:11:59.734232
ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732936 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-host-run-ovn-kubernetes\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp" Apr 16 20:11:59.734232 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732961 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6b1cd9c4-abb0-4659-b9b8-0b263412063c-env-overrides\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp" Apr 16 20:11:59.734232 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.732987 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f067f531-c6e2-4ac4-a01e-8aa1872f6296-agent-certs\") pod \"konnectivity-agent-skfgt\" (UID: \"f067f531-c6e2-4ac4-a01e-8aa1872f6296\") " pod="kube-system/konnectivity-agent-skfgt" Apr 16 20:11:59.734232 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733001 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66-lib-modules\") pod \"tuned-9r8n4\" (UID: \"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66\") " pod="openshift-cluster-node-tuning-operator/tuned-9r8n4" Apr 16 20:11:59.734232 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733009 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-host-slash\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp" Apr 16 20:11:59.734232 
ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733011 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bfde86ea-03d1-4cf4-90b7-76b04a98def5-serviceca\") pod \"node-ca-r8gdb\" (UID: \"bfde86ea-03d1-4cf4-90b7-76b04a98def5\") " pod="openshift-image-registry/node-ca-r8gdb" Apr 16 20:11:59.734232 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733053 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp" Apr 16 20:11:59.734232 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733070 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-host-run-ovn-kubernetes\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp" Apr 16 20:11:59.734232 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733079 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-host-run-k8s-cni-cncf-io\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt" Apr 16 20:11:59.734232 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733108 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/10f2b7ab-1884-4d1c-8207-ad7844c2b18f-cnibin\") pod \"multus-additional-cni-plugins-zq5ft\" (UID: \"10f2b7ab-1884-4d1c-8207-ad7844c2b18f\") " 
pod="openshift-multus/multus-additional-cni-plugins-zq5ft" Apr 16 20:11:59.734232 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733132 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/10f2b7ab-1884-4d1c-8207-ad7844c2b18f-cni-binary-copy\") pod \"multus-additional-cni-plugins-zq5ft\" (UID: \"10f2b7ab-1884-4d1c-8207-ad7844c2b18f\") " pod="openshift-multus/multus-additional-cni-plugins-zq5ft" Apr 16 20:11:59.734232 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733136 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8ee16202-241d-45ac-9219-3363704a708e-tmp-dir\") pod \"node-resolver-fjbmm\" (UID: \"8ee16202-241d-45ac-9219-3363704a708e\") " pod="openshift-dns/node-resolver-fjbmm" Apr 16 20:11:59.734232 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733147 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp" Apr 16 20:11:59.734232 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733155 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-host-run-netns\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp" Apr 16 20:11:59.734232 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733083 2572 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 20:11:59.735059 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733191 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f067f531-c6e2-4ac4-a01e-8aa1872f6296-konnectivity-ca\") pod \"konnectivity-agent-skfgt\" (UID: \"f067f531-c6e2-4ac4-a01e-8aa1872f6296\") " pod="kube-system/konnectivity-agent-skfgt" Apr 16 20:11:59.735059 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733197 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-host-run-k8s-cni-cncf-io\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt" Apr 16 20:11:59.735059 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733217 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66-etc-systemd\") pod \"tuned-9r8n4\" (UID: \"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66\") " pod="openshift-cluster-node-tuning-operator/tuned-9r8n4" Apr 16 20:11:59.735059 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733226 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-host-run-netns\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp" Apr 16 20:11:59.735059 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733242 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66-etc-tuned\") pod 
\"tuned-9r8n4\" (UID: \"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66\") " pod="openshift-cluster-node-tuning-operator/tuned-9r8n4" Apr 16 20:11:59.735059 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733267 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/10f2b7ab-1884-4d1c-8207-ad7844c2b18f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zq5ft\" (UID: \"10f2b7ab-1884-4d1c-8207-ad7844c2b18f\") " pod="openshift-multus/multus-additional-cni-plugins-zq5ft" Apr 16 20:11:59.735059 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733288 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d285ba82-dded-4707-87cb-35b755280286-metrics-certs\") pod \"network-metrics-daemon-622d4\" (UID: \"d285ba82-dded-4707-87cb-35b755280286\") " pod="openshift-multus/network-metrics-daemon-622d4" Apr 16 20:11:59.735059 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733307 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-system-cni-dir\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt" Apr 16 20:11:59.735059 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733332 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/10f2b7ab-1884-4d1c-8207-ad7844c2b18f-os-release\") pod \"multus-additional-cni-plugins-zq5ft\" (UID: \"10f2b7ab-1884-4d1c-8207-ad7844c2b18f\") " pod="openshift-multus/multus-additional-cni-plugins-zq5ft" Apr 16 20:11:59.735059 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733353 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/10f2b7ab-1884-4d1c-8207-ad7844c2b18f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zq5ft\" (UID: \"10f2b7ab-1884-4d1c-8207-ad7844c2b18f\") " pod="openshift-multus/multus-additional-cni-plugins-zq5ft" Apr 16 20:11:59.735059 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733397 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-os-release\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt" Apr 16 20:11:59.735059 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733421 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-multus-socket-dir-parent\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt" Apr 16 20:11:59.735059 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733435 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bfde86ea-03d1-4cf4-90b7-76b04a98def5-serviceca\") pod \"node-ca-r8gdb\" (UID: \"bfde86ea-03d1-4cf4-90b7-76b04a98def5\") " pod="openshift-image-registry/node-ca-r8gdb" Apr 16 20:11:59.735059 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733451 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-etc-kubernetes\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt" Apr 16 20:11:59.735059 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733476 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/2ecca5d6-192a-4507-b71b-c8d9e9099230-host-slash\") pod \"iptables-alerter-4g8jb\" (UID: \"2ecca5d6-192a-4507-b71b-c8d9e9099230\") " pod="openshift-network-operator/iptables-alerter-4g8jb" Apr 16 20:11:59.735059 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733499 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66-etc-sysctl-conf\") pod \"tuned-9r8n4\" (UID: \"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66\") " pod="openshift-cluster-node-tuning-operator/tuned-9r8n4" Apr 16 20:11:59.735059 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733522 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8h6ss\" (UniqueName: \"kubernetes.io/projected/d285ba82-dded-4707-87cb-35b755280286-kube-api-access-8h6ss\") pod \"network-metrics-daemon-622d4\" (UID: \"d285ba82-dded-4707-87cb-35b755280286\") " pod="openshift-multus/network-metrics-daemon-622d4" Apr 16 20:11:59.735059 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:11:59.733530 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:59.735829 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733538 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-systemd-units\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp" Apr 16 20:11:59.735829 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733561 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-cni-binary-copy\") pod \"multus-hqvbt\" (UID: 
\"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt" Apr 16 20:11:59.735829 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:11:59.733614 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d285ba82-dded-4707-87cb-35b755280286-metrics-certs podName:d285ba82-dded-4707-87cb-35b755280286 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:00.233574495 +0000 UTC m=+3.074474877 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d285ba82-dded-4707-87cb-35b755280286-metrics-certs") pod "network-metrics-daemon-622d4" (UID: "d285ba82-dded-4707-87cb-35b755280286") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:59.735829 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733625 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6b1cd9c4-abb0-4659-b9b8-0b263412063c-env-overrides\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp" Apr 16 20:11:59.735829 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733650 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66-etc-modprobe-d\") pod \"tuned-9r8n4\" (UID: \"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66\") " pod="openshift-cluster-node-tuning-operator/tuned-9r8n4" Apr 16 20:11:59.735829 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733685 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66-etc-kubernetes\") pod \"tuned-9r8n4\" (UID: \"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66\") " pod="openshift-cluster-node-tuning-operator/tuned-9r8n4" Apr 16 20:11:59.735829 
ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733708 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b05f1f8c-380a-4929-8648-4854289d7f72-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mghnf\" (UID: \"b05f1f8c-380a-4929-8648-4854289d7f72\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mghnf" Apr 16 20:11:59.735829 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733723 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/10f2b7ab-1884-4d1c-8207-ad7844c2b18f-cni-binary-copy\") pod \"multus-additional-cni-plugins-zq5ft\" (UID: \"10f2b7ab-1884-4d1c-8207-ad7844c2b18f\") " pod="openshift-multus/multus-additional-cni-plugins-zq5ft" Apr 16 20:11:59.735829 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733737 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b05f1f8c-380a-4929-8648-4854289d7f72-device-dir\") pod \"aws-ebs-csi-driver-node-mghnf\" (UID: \"b05f1f8c-380a-4929-8648-4854289d7f72\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mghnf" Apr 16 20:11:59.735829 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733772 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kks8g\" (UniqueName: \"kubernetes.io/projected/b05f1f8c-380a-4929-8648-4854289d7f72-kube-api-access-kks8g\") pod \"aws-ebs-csi-driver-node-mghnf\" (UID: \"b05f1f8c-380a-4929-8648-4854289d7f72\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mghnf" Apr 16 20:11:59.735829 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733783 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b05f1f8c-380a-4929-8648-4854289d7f72-device-dir\") pod \"aws-ebs-csi-driver-node-mghnf\" (UID: 
\"b05f1f8c-380a-4929-8648-4854289d7f72\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mghnf" Apr 16 20:11:59.735829 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733795 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j2n97\" (UniqueName: \"kubernetes.io/projected/8ee16202-241d-45ac-9219-3363704a708e-kube-api-access-j2n97\") pod \"node-resolver-fjbmm\" (UID: \"8ee16202-241d-45ac-9219-3363704a708e\") " pod="openshift-dns/node-resolver-fjbmm" Apr 16 20:11:59.735829 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733817 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zqkfl\" (UniqueName: \"kubernetes.io/projected/bfde86ea-03d1-4cf4-90b7-76b04a98def5-kube-api-access-zqkfl\") pod \"node-ca-r8gdb\" (UID: \"bfde86ea-03d1-4cf4-90b7-76b04a98def5\") " pod="openshift-image-registry/node-ca-r8gdb" Apr 16 20:11:59.735829 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733842 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6b1cd9c4-abb0-4659-b9b8-0b263412063c-ovnkube-script-lib\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp" Apr 16 20:11:59.735829 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733863 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-multus-conf-dir\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt" Apr 16 20:11:59.735829 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733891 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-multus-daemon-config\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt" Apr 16 20:11:59.736321 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733911 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66-host\") pod \"tuned-9r8n4\" (UID: \"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66\") " pod="openshift-cluster-node-tuning-operator/tuned-9r8n4" Apr 16 20:11:59.736321 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733933 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/10f2b7ab-1884-4d1c-8207-ad7844c2b18f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zq5ft\" (UID: \"10f2b7ab-1884-4d1c-8207-ad7844c2b18f\") " pod="openshift-multus/multus-additional-cni-plugins-zq5ft" Apr 16 20:11:59.736321 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733954 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-run-ovn\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp" Apr 16 20:11:59.736321 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733976 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6b1cd9c4-abb0-4659-b9b8-0b263412063c-ovnkube-config\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp" Apr 16 20:11:59.736321 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.733999 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/b05f1f8c-380a-4929-8648-4854289d7f72-socket-dir\") pod \"aws-ebs-csi-driver-node-mghnf\" (UID: \"b05f1f8c-380a-4929-8648-4854289d7f72\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mghnf" Apr 16 20:11:59.736321 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.734022 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b05f1f8c-380a-4929-8648-4854289d7f72-sys-fs\") pod \"aws-ebs-csi-driver-node-mghnf\" (UID: \"b05f1f8c-380a-4929-8648-4854289d7f72\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mghnf" Apr 16 20:11:59.736321 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.734038 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-system-cni-dir\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt" Apr 16 20:11:59.736321 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.734088 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/10f2b7ab-1884-4d1c-8207-ad7844c2b18f-cnibin\") pod \"multus-additional-cni-plugins-zq5ft\" (UID: \"10f2b7ab-1884-4d1c-8207-ad7844c2b18f\") " pod="openshift-multus/multus-additional-cni-plugins-zq5ft" Apr 16 20:11:59.736321 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.734096 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-etc-kubernetes\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt" Apr 16 20:11:59.736321 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.734115 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/2ecca5d6-192a-4507-b71b-c8d9e9099230-host-slash\") pod \"iptables-alerter-4g8jb\" (UID: \"2ecca5d6-192a-4507-b71b-c8d9e9099230\") " pod="openshift-network-operator/iptables-alerter-4g8jb" Apr 16 20:11:59.736321 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.734184 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f067f531-c6e2-4ac4-a01e-8aa1872f6296-konnectivity-ca\") pod \"konnectivity-agent-skfgt\" (UID: \"f067f531-c6e2-4ac4-a01e-8aa1872f6296\") " pod="kube-system/konnectivity-agent-skfgt" Apr 16 20:11:59.736321 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.734248 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66-etc-systemd\") pod \"tuned-9r8n4\" (UID: \"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66\") " pod="openshift-cluster-node-tuning-operator/tuned-9r8n4" Apr 16 20:11:59.736321 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.734255 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-systemd-units\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp" Apr 16 20:11:59.736321 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.734046 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8ee16202-241d-45ac-9219-3363704a708e-hosts-file\") pod \"node-resolver-fjbmm\" (UID: \"8ee16202-241d-45ac-9219-3363704a708e\") " pod="openshift-dns/node-resolver-fjbmm" Apr 16 20:11:59.736321 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.734296 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-cnibin\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt"
Apr 16 20:11:59.736321 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.734311 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66-etc-sysctl-conf\") pod \"tuned-9r8n4\" (UID: \"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66\") " pod="openshift-cluster-node-tuning-operator/tuned-9r8n4"
Apr 16 20:11:59.736321 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.734324 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-host-run-multus-certs\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt"
Apr 16 20:11:59.736321 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.734363 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/10f2b7ab-1884-4d1c-8207-ad7844c2b18f-os-release\") pod \"multus-additional-cni-plugins-zq5ft\" (UID: \"10f2b7ab-1884-4d1c-8207-ad7844c2b18f\") " pod="openshift-multus/multus-additional-cni-plugins-zq5ft"
Apr 16 20:11:59.737117 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.734365 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-run-ovn\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp"
Apr 16 20:11:59.737117 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.734397 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-cnibin\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt"
Apr 16 20:11:59.737117 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.734465 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66-etc-kubernetes\") pod \"tuned-9r8n4\" (UID: \"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66\") " pod="openshift-cluster-node-tuning-operator/tuned-9r8n4"
Apr 16 20:11:59.737117 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.734468 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-multus-conf-dir\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt"
Apr 16 20:11:59.737117 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.734480 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-multus-socket-dir-parent\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt"
Apr 16 20:11:59.737117 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.734471 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66-etc-modprobe-d\") pod \"tuned-9r8n4\" (UID: \"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66\") " pod="openshift-cluster-node-tuning-operator/tuned-9r8n4"
Apr 16 20:11:59.737117 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.734502 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-os-release\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt"
Apr 16 20:11:59.737117 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.734681 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8ee16202-241d-45ac-9219-3363704a708e-hosts-file\") pod \"node-resolver-fjbmm\" (UID: \"8ee16202-241d-45ac-9219-3363704a708e\") " pod="openshift-dns/node-resolver-fjbmm"
Apr 16 20:11:59.737117 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.734406 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-host-run-multus-certs\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt"
Apr 16 20:11:59.737117 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.734723 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b05f1f8c-380a-4929-8648-4854289d7f72-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mghnf\" (UID: \"b05f1f8c-380a-4929-8648-4854289d7f72\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mghnf"
Apr 16 20:11:59.737117 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.734753 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b05f1f8c-380a-4929-8648-4854289d7f72-sys-fs\") pod \"aws-ebs-csi-driver-node-mghnf\" (UID: \"b05f1f8c-380a-4929-8648-4854289d7f72\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mghnf"
Apr 16 20:11:59.737117 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.734779 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66-etc-sysctl-d\") pod \"tuned-9r8n4\" (UID: \"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66\") " pod="openshift-cluster-node-tuning-operator/tuned-9r8n4"
Apr 16 20:11:59.737117 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.734805 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66-tmp\") pod \"tuned-9r8n4\" (UID: \"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66\") " pod="openshift-cluster-node-tuning-operator/tuned-9r8n4"
Apr 16 20:11:59.737117 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.734831 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grs8d\" (UniqueName: \"kubernetes.io/projected/10f2b7ab-1884-4d1c-8207-ad7844c2b18f-kube-api-access-grs8d\") pod \"multus-additional-cni-plugins-zq5ft\" (UID: \"10f2b7ab-1884-4d1c-8207-ad7844c2b18f\") " pod="openshift-multus/multus-additional-cni-plugins-zq5ft"
Apr 16 20:11:59.737117 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.734854 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-etc-openvswitch\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp"
Apr 16 20:11:59.737117 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.734876 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-host-kubelet\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp"
Apr 16 20:11:59.737117 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.734897 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-host-cni-netd\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp"
Apr 16 20:11:59.737117 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.734920 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhkv4\" (UniqueName: \"kubernetes.io/projected/2ecca5d6-192a-4507-b71b-c8d9e9099230-kube-api-access-jhkv4\") pod \"iptables-alerter-4g8jb\" (UID: \"2ecca5d6-192a-4507-b71b-c8d9e9099230\") " pod="openshift-network-operator/iptables-alerter-4g8jb"
Apr 16 20:11:59.737779 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.735017 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/10f2b7ab-1884-4d1c-8207-ad7844c2b18f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zq5ft\" (UID: \"10f2b7ab-1884-4d1c-8207-ad7844c2b18f\") " pod="openshift-multus/multus-additional-cni-plugins-zq5ft"
Apr 16 20:11:59.737779 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.735102 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-multus-daemon-config\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt"
Apr 16 20:11:59.737779 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.735247 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b05f1f8c-380a-4929-8648-4854289d7f72-socket-dir\") pod \"aws-ebs-csi-driver-node-mghnf\" (UID: \"b05f1f8c-380a-4929-8648-4854289d7f72\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mghnf"
Apr 16 20:11:59.737779 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.735407 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/10f2b7ab-1884-4d1c-8207-ad7844c2b18f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zq5ft\" (UID: \"10f2b7ab-1884-4d1c-8207-ad7844c2b18f\") " pod="openshift-multus/multus-additional-cni-plugins-zq5ft"
Apr 16 20:11:59.737779 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.735460 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-host-kubelet\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp"
Apr 16 20:11:59.737779 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.735499 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-host-cni-netd\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp"
Apr 16 20:11:59.737779 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.735586 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b1cd9c4-abb0-4659-b9b8-0b263412063c-etc-openvswitch\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp"
Apr 16 20:11:59.737779 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.735602 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6b1cd9c4-abb0-4659-b9b8-0b263412063c-ovnkube-script-lib\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp"
Apr 16 20:11:59.737779 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.735636 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/10f2b7ab-1884-4d1c-8207-ad7844c2b18f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zq5ft\" (UID: \"10f2b7ab-1884-4d1c-8207-ad7844c2b18f\") " pod="openshift-multus/multus-additional-cni-plugins-zq5ft"
Apr 16 20:11:59.737779 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.735681 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66-etc-sysctl-d\") pod \"tuned-9r8n4\" (UID: \"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66\") " pod="openshift-cluster-node-tuning-operator/tuned-9r8n4"
Apr 16 20:11:59.737779 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.735700 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66-host\") pod \"tuned-9r8n4\" (UID: \"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66\") " pod="openshift-cluster-node-tuning-operator/tuned-9r8n4"
Apr 16 20:11:59.737779 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.736011 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6b1cd9c4-abb0-4659-b9b8-0b263412063c-ovnkube-config\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp"
Apr 16 20:11:59.737779 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.736214 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-cni-binary-copy\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt"
Apr 16 20:11:59.737779 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.737566 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6b1cd9c4-abb0-4659-b9b8-0b263412063c-ovn-node-metrics-cert\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp"
Apr 16 20:11:59.737779 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.737695 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f067f531-c6e2-4ac4-a01e-8aa1872f6296-agent-certs\") pod \"konnectivity-agent-skfgt\" (UID: \"f067f531-c6e2-4ac4-a01e-8aa1872f6296\") " pod="kube-system/konnectivity-agent-skfgt"
Apr 16 20:11:59.737779 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.737733 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66-tmp\") pod \"tuned-9r8n4\" (UID: \"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66\") " pod="openshift-cluster-node-tuning-operator/tuned-9r8n4"
Apr 16 20:11:59.738393 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.737911 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66-etc-tuned\") pod \"tuned-9r8n4\" (UID: \"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66\") " pod="openshift-cluster-node-tuning-operator/tuned-9r8n4"
Apr 16 20:11:59.738393 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:11:59.738069 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 20:11:59.738393 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:11:59.738088 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 20:11:59.738393 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:11:59.738101 2572 projected.go:194] Error preparing data for projected volume kube-api-access-h9clf for pod openshift-network-diagnostics/network-check-target-2xqx8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:11:59.738393 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:11:59.738174 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ea99809-04f6-4ff1-adef-7bf9eb98c772-kube-api-access-h9clf podName:5ea99809-04f6-4ff1-adef-7bf9eb98c772 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:00.238157065 +0000 UTC m=+3.079057450 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-h9clf" (UniqueName: "kubernetes.io/projected/5ea99809-04f6-4ff1-adef-7bf9eb98c772-kube-api-access-h9clf") pod "network-check-target-2xqx8" (UID: "5ea99809-04f6-4ff1-adef-7bf9eb98c772") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:11:59.740572 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.740552 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4zqd\" (UniqueName: \"kubernetes.io/projected/f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66-kube-api-access-r4zqd\") pod \"tuned-9r8n4\" (UID: \"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66\") " pod="openshift-cluster-node-tuning-operator/tuned-9r8n4"
Apr 16 20:11:59.741248 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.741196 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m8ws\" (UniqueName: \"kubernetes.io/projected/46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3-kube-api-access-6m8ws\") pod \"multus-hqvbt\" (UID: \"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3\") " pod="openshift-multus/multus-hqvbt"
Apr 16 20:11:59.741248 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.741237 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-52hv2\" (UniqueName: \"kubernetes.io/projected/6b1cd9c4-abb0-4659-b9b8-0b263412063c-kube-api-access-52hv2\") pod \"ovnkube-node-mgjgp\" (UID: \"6b1cd9c4-abb0-4659-b9b8-0b263412063c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp"
Apr 16 20:11:59.743087 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.743044 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h6ss\" (UniqueName: \"kubernetes.io/projected/d285ba82-dded-4707-87cb-35b755280286-kube-api-access-8h6ss\") pod \"network-metrics-daemon-622d4\" (UID: \"d285ba82-dded-4707-87cb-35b755280286\") " pod="openshift-multus/network-metrics-daemon-622d4"
Apr 16 20:11:59.743520 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.743496 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2n97\" (UniqueName: \"kubernetes.io/projected/8ee16202-241d-45ac-9219-3363704a708e-kube-api-access-j2n97\") pod \"node-resolver-fjbmm\" (UID: \"8ee16202-241d-45ac-9219-3363704a708e\") " pod="openshift-dns/node-resolver-fjbmm"
Apr 16 20:11:59.744133 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.744108 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kks8g\" (UniqueName: \"kubernetes.io/projected/b05f1f8c-380a-4929-8648-4854289d7f72-kube-api-access-kks8g\") pod \"aws-ebs-csi-driver-node-mghnf\" (UID: \"b05f1f8c-380a-4929-8648-4854289d7f72\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mghnf"
Apr 16 20:11:59.744768 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.744728 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhkv4\" (UniqueName: \"kubernetes.io/projected/2ecca5d6-192a-4507-b71b-c8d9e9099230-kube-api-access-jhkv4\") pod \"iptables-alerter-4g8jb\" (UID: 
\"2ecca5d6-192a-4507-b71b-c8d9e9099230\") " pod="openshift-network-operator/iptables-alerter-4g8jb"
Apr 16 20:11:59.745095 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.745077 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-grs8d\" (UniqueName: \"kubernetes.io/projected/10f2b7ab-1884-4d1c-8207-ad7844c2b18f-kube-api-access-grs8d\") pod \"multus-additional-cni-plugins-zq5ft\" (UID: \"10f2b7ab-1884-4d1c-8207-ad7844c2b18f\") " pod="openshift-multus/multus-additional-cni-plugins-zq5ft"
Apr 16 20:11:59.745651 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.745630 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqkfl\" (UniqueName: \"kubernetes.io/projected/bfde86ea-03d1-4cf4-90b7-76b04a98def5-kube-api-access-zqkfl\") pod \"node-ca-r8gdb\" (UID: \"bfde86ea-03d1-4cf4-90b7-76b04a98def5\") " pod="openshift-image-registry/node-ca-r8gdb"
Apr 16 20:11:59.916061 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.915972 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9r8n4"
Apr 16 20:11:59.924012 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.923991 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zq5ft"
Apr 16 20:11:59.932639 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.932620 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp"
Apr 16 20:11:59.938330 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.938310 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mghnf"
Apr 16 20:11:59.943871 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.943851 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-fjbmm"
Apr 16 20:11:59.950445 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.950426 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-r8gdb"
Apr 16 20:11:59.956138 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.956121 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-hqvbt"
Apr 16 20:11:59.963642 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.963624 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4g8jb"
Apr 16 20:11:59.970207 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:11:59.970185 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-skfgt"
Apr 16 20:12:00.008196 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:00.008171 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 20:12:00.238687 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:00.238604 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h9clf\" (UniqueName: \"kubernetes.io/projected/5ea99809-04f6-4ff1-adef-7bf9eb98c772-kube-api-access-h9clf\") pod \"network-check-target-2xqx8\" (UID: \"5ea99809-04f6-4ff1-adef-7bf9eb98c772\") " pod="openshift-network-diagnostics/network-check-target-2xqx8"
Apr 16 20:12:00.238687 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:00.238646 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d285ba82-dded-4707-87cb-35b755280286-metrics-certs\") pod \"network-metrics-daemon-622d4\" (UID: \"d285ba82-dded-4707-87cb-35b755280286\") " pod="openshift-multus/network-metrics-daemon-622d4"
Apr 16 20:12:00.238872 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:00.238727 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:12:00.238872 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:00.238738 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 20:12:00.238872 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:00.238757 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 20:12:00.238872 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:00.238766 2572 projected.go:194] Error preparing data for projected volume kube-api-access-h9clf for pod openshift-network-diagnostics/network-check-target-2xqx8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:12:00.238872 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:00.238774 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d285ba82-dded-4707-87cb-35b755280286-metrics-certs podName:d285ba82-dded-4707-87cb-35b755280286 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:01.238760766 +0000 UTC m=+4.079661133 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d285ba82-dded-4707-87cb-35b755280286-metrics-certs") pod "network-metrics-daemon-622d4" (UID: "d285ba82-dded-4707-87cb-35b755280286") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:12:00.238872 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:00.238807 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ea99809-04f6-4ff1-adef-7bf9eb98c772-kube-api-access-h9clf podName:5ea99809-04f6-4ff1-adef-7bf9eb98c772 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:01.238795652 +0000 UTC m=+4.079696019 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-h9clf" (UniqueName: "kubernetes.io/projected/5ea99809-04f6-4ff1-adef-7bf9eb98c772-kube-api-access-h9clf") pod "network-check-target-2xqx8" (UID: "5ea99809-04f6-4ff1-adef-7bf9eb98c772") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:12:00.318392 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:12:00.318349 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb05f1f8c_380a_4929_8648_4854289d7f72.slice/crio-34a8b541409bd70b3d3d68907c300442f4bcdb9d30271332ba650a917cefe342 WatchSource:0}: Error finding container 34a8b541409bd70b3d3d68907c300442f4bcdb9d30271332ba650a917cefe342: Status 404 returned error can't find the container with id 34a8b541409bd70b3d3d68907c300442f4bcdb9d30271332ba650a917cefe342
Apr 16 20:12:00.324985 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:12:00.324959 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ecca5d6_192a_4507_b71b_c8d9e9099230.slice/crio-1bf810a869817d5e66c1b95c5f867f9fd84d257bedb3caf7784abaa2cfcb7818 WatchSource:0}: Error finding container 1bf810a869817d5e66c1b95c5f867f9fd84d257bedb3caf7784abaa2cfcb7818: Status 404 returned error can't find the container with id 1bf810a869817d5e66c1b95c5f867f9fd84d257bedb3caf7784abaa2cfcb7818
Apr 16 20:12:00.664545 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:00.664257 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 20:06:58 +0000 UTC" deadline="2027-12-08 14:06:38.476996195 +0000 UTC"
Apr 16 20:12:00.664545 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:00.664491 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14417h54m37.812513408s"
Apr 16 20:12:00.759766 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:00.759095 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-622d4"
Apr 16 20:12:00.759766 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:00.759232 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-622d4" podUID="d285ba82-dded-4707-87cb-35b755280286" Apr 16 20:12:00.774944 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:00.774907 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp" event={"ID":"6b1cd9c4-abb0-4659-b9b8-0b263412063c","Type":"ContainerStarted","Data":"446f3ae0a9bc8cc96ad6afc781459a1df6d38e87f92a5ccfe5c0154cdda32ad0"} Apr 16 20:12:00.780711 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:00.780674 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hqvbt" event={"ID":"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3","Type":"ContainerStarted","Data":"7182a01c2afe621531bf889302e6d134117979b54d55fd4bd23f4b256791a809"} Apr 16 20:12:00.784075 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:00.784023 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-r8gdb" event={"ID":"bfde86ea-03d1-4cf4-90b7-76b04a98def5","Type":"ContainerStarted","Data":"486ec9ed73b6d007f66c484d263df3b609b995cdea12c0e66959233f21f0fb85"} Apr 16 20:12:00.788317 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:00.788267 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4g8jb" event={"ID":"2ecca5d6-192a-4507-b71b-c8d9e9099230","Type":"ContainerStarted","Data":"1bf810a869817d5e66c1b95c5f867f9fd84d257bedb3caf7784abaa2cfcb7818"} Apr 16 20:12:00.795482 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:00.795444 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fjbmm" event={"ID":"8ee16202-241d-45ac-9219-3363704a708e","Type":"ContainerStarted","Data":"7e4513da4aeb92a7b99f0da9a55d85d6c5854e1b49df3ad88fccaee6ec4cf7f9"} Apr 16 20:12:00.810918 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:00.810890 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9r8n4" 
event={"ID":"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66","Type":"ContainerStarted","Data":"2bf633301238b934fe0b3b125ad72f84f647fa38fea1031209453c1ad945b0fb"} Apr 16 20:12:00.822471 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:00.822442 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-142.ec2.internal" event={"ID":"df511bf53f5da06c359fd97d0761730a","Type":"ContainerStarted","Data":"844104a97643343af7577125a5ed735c628520a8d6d3c02dfcb0c0ee725f746f"} Apr 16 20:12:00.833005 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:00.832904 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5ft" event={"ID":"10f2b7ab-1884-4d1c-8207-ad7844c2b18f","Type":"ContainerStarted","Data":"82f824501c5cddfa0f7bfc015dcbcb3b41610957490cd7ab9f6ddb15bda49476"} Apr 16 20:12:00.848103 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:00.848068 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-skfgt" event={"ID":"f067f531-c6e2-4ac4-a01e-8aa1872f6296","Type":"ContainerStarted","Data":"2e0755606d8c69f52b7de27df1d23bda19c3ed125654aa781ac44a7e66ee7d3b"} Apr 16 20:12:00.852211 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:00.851857 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mghnf" event={"ID":"b05f1f8c-380a-4929-8648-4854289d7f72","Type":"ContainerStarted","Data":"34a8b541409bd70b3d3d68907c300442f4bcdb9d30271332ba650a917cefe342"} Apr 16 20:12:01.248224 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:01.248181 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h9clf\" (UniqueName: \"kubernetes.io/projected/5ea99809-04f6-4ff1-adef-7bf9eb98c772-kube-api-access-h9clf\") pod \"network-check-target-2xqx8\" (UID: \"5ea99809-04f6-4ff1-adef-7bf9eb98c772\") " pod="openshift-network-diagnostics/network-check-target-2xqx8" Apr 16 
20:12:01.248422 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:01.248248 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d285ba82-dded-4707-87cb-35b755280286-metrics-certs\") pod \"network-metrics-daemon-622d4\" (UID: \"d285ba82-dded-4707-87cb-35b755280286\") " pod="openshift-multus/network-metrics-daemon-622d4" Apr 16 20:12:01.248422 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:01.248406 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:12:01.248544 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:01.248467 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d285ba82-dded-4707-87cb-35b755280286-metrics-certs podName:d285ba82-dded-4707-87cb-35b755280286 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:03.248448745 +0000 UTC m=+6.089349118 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d285ba82-dded-4707-87cb-35b755280286-metrics-certs") pod "network-metrics-daemon-622d4" (UID: "d285ba82-dded-4707-87cb-35b755280286") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:12:01.248863 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:01.248845 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 20:12:01.248863 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:01.248865 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 20:12:01.249014 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:01.248879 2572 projected.go:194] Error preparing data for projected volume kube-api-access-h9clf for pod openshift-network-diagnostics/network-check-target-2xqx8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:12:01.249014 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:01.248926 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ea99809-04f6-4ff1-adef-7bf9eb98c772-kube-api-access-h9clf podName:5ea99809-04f6-4ff1-adef-7bf9eb98c772 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:03.248909664 +0000 UTC m=+6.089810033 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-h9clf" (UniqueName: "kubernetes.io/projected/5ea99809-04f6-4ff1-adef-7bf9eb98c772-kube-api-access-h9clf") pod "network-check-target-2xqx8" (UID: "5ea99809-04f6-4ff1-adef-7bf9eb98c772") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:12:01.760876 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:01.760842 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2xqx8"
Apr 16 20:12:01.761312 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:01.760963 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2xqx8" podUID="5ea99809-04f6-4ff1-adef-7bf9eb98c772"
Apr 16 20:12:01.863363 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:01.863260 2572 generic.go:358] "Generic (PLEG): container finished" podID="80a7b203f53c53f317b0eec338157b8d" containerID="22bf5954902002dbc5207cd9c8625d9d1175012df9c1069f849ba93ceb5d18a4" exitCode=0
Apr 16 20:12:01.864361 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:01.864332 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-142.ec2.internal" event={"ID":"80a7b203f53c53f317b0eec338157b8d","Type":"ContainerDied","Data":"22bf5954902002dbc5207cd9c8625d9d1175012df9c1069f849ba93ceb5d18a4"}
Apr 16 20:12:01.877940 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:01.877809 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-142.ec2.internal" podStartSLOduration=3.877788106 podStartE2EDuration="3.877788106s" podCreationTimestamp="2026-04-16 20:11:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:12:00.837253312 +0000 UTC m=+3.678153704" watchObservedRunningTime="2026-04-16 20:12:01.877788106 +0000 UTC m=+4.718688497"
Apr 16 20:12:02.758909 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:02.758882 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-622d4"
Apr 16 20:12:02.759120 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:02.759017 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-622d4" podUID="d285ba82-dded-4707-87cb-35b755280286"
Apr 16 20:12:02.879552 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:02.879506 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-142.ec2.internal" event={"ID":"80a7b203f53c53f317b0eec338157b8d","Type":"ContainerStarted","Data":"2ce416eef2e650175b534256b6de925122ac91feee25752cb551f6dfbc38da5f"}
Apr 16 20:12:02.893931 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:02.893842 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-142.ec2.internal" podStartSLOduration=4.89382538 podStartE2EDuration="4.89382538s" podCreationTimestamp="2026-04-16 20:11:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:12:02.892816933 +0000 UTC m=+5.733717325" watchObservedRunningTime="2026-04-16 20:12:02.89382538 +0000 UTC m=+5.734725776"
Apr 16 20:12:03.264288 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:03.264229 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h9clf\" (UniqueName: \"kubernetes.io/projected/5ea99809-04f6-4ff1-adef-7bf9eb98c772-kube-api-access-h9clf\") pod \"network-check-target-2xqx8\" (UID: \"5ea99809-04f6-4ff1-adef-7bf9eb98c772\") " pod="openshift-network-diagnostics/network-check-target-2xqx8"
Apr 16 20:12:03.264504 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:03.264301 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d285ba82-dded-4707-87cb-35b755280286-metrics-certs\") pod \"network-metrics-daemon-622d4\" (UID: \"d285ba82-dded-4707-87cb-35b755280286\") " pod="openshift-multus/network-metrics-daemon-622d4"
Apr 16 20:12:03.264504 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:03.264444 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:12:03.264619 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:03.264513 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d285ba82-dded-4707-87cb-35b755280286-metrics-certs podName:d285ba82-dded-4707-87cb-35b755280286 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:07.264490974 +0000 UTC m=+10.105391349 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d285ba82-dded-4707-87cb-35b755280286-metrics-certs") pod "network-metrics-daemon-622d4" (UID: "d285ba82-dded-4707-87cb-35b755280286") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:12:03.264972 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:03.264948 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 20:12:03.265055 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:03.264976 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 20:12:03.265055 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:03.264990 2572 projected.go:194] Error preparing data for projected volume kube-api-access-h9clf for pod openshift-network-diagnostics/network-check-target-2xqx8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:12:03.265055 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:03.265034 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ea99809-04f6-4ff1-adef-7bf9eb98c772-kube-api-access-h9clf podName:5ea99809-04f6-4ff1-adef-7bf9eb98c772 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:07.265023347 +0000 UTC m=+10.105923715 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-h9clf" (UniqueName: "kubernetes.io/projected/5ea99809-04f6-4ff1-adef-7bf9eb98c772-kube-api-access-h9clf") pod "network-check-target-2xqx8" (UID: "5ea99809-04f6-4ff1-adef-7bf9eb98c772") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:12:03.759446 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:03.758834 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2xqx8"
Apr 16 20:12:03.759446 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:03.758957 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2xqx8" podUID="5ea99809-04f6-4ff1-adef-7bf9eb98c772"
Apr 16 20:12:04.759295 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:04.758796 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-622d4"
Apr 16 20:12:04.759295 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:04.758936 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-622d4" podUID="d285ba82-dded-4707-87cb-35b755280286"
Apr 16 20:12:05.760875 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:05.760838 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2xqx8"
Apr 16 20:12:05.761288 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:05.760964 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2xqx8" podUID="5ea99809-04f6-4ff1-adef-7bf9eb98c772"
Apr 16 20:12:06.759077 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:06.759049 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-622d4"
Apr 16 20:12:06.759250 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:06.759179 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-622d4" podUID="d285ba82-dded-4707-87cb-35b755280286"
Apr 16 20:12:07.298188 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:07.298149 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h9clf\" (UniqueName: \"kubernetes.io/projected/5ea99809-04f6-4ff1-adef-7bf9eb98c772-kube-api-access-h9clf\") pod \"network-check-target-2xqx8\" (UID: \"5ea99809-04f6-4ff1-adef-7bf9eb98c772\") " pod="openshift-network-diagnostics/network-check-target-2xqx8"
Apr 16 20:12:07.298607 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:07.298219 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d285ba82-dded-4707-87cb-35b755280286-metrics-certs\") pod \"network-metrics-daemon-622d4\" (UID: \"d285ba82-dded-4707-87cb-35b755280286\") " pod="openshift-multus/network-metrics-daemon-622d4"
Apr 16 20:12:07.298607 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:07.298363 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:12:07.298607 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:07.298448 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d285ba82-dded-4707-87cb-35b755280286-metrics-certs podName:d285ba82-dded-4707-87cb-35b755280286 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:15.298427501 +0000 UTC m=+18.139327877 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d285ba82-dded-4707-87cb-35b755280286-metrics-certs") pod "network-metrics-daemon-622d4" (UID: "d285ba82-dded-4707-87cb-35b755280286") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:12:07.298878 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:07.298860 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 20:12:07.298967 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:07.298884 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 20:12:07.298967 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:07.298896 2572 projected.go:194] Error preparing data for projected volume kube-api-access-h9clf for pod openshift-network-diagnostics/network-check-target-2xqx8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:12:07.298967 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:07.298945 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ea99809-04f6-4ff1-adef-7bf9eb98c772-kube-api-access-h9clf podName:5ea99809-04f6-4ff1-adef-7bf9eb98c772 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:15.298929571 +0000 UTC m=+18.139829942 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "kube-api-access-h9clf" (UniqueName: "kubernetes.io/projected/5ea99809-04f6-4ff1-adef-7bf9eb98c772-kube-api-access-h9clf") pod "network-check-target-2xqx8" (UID: "5ea99809-04f6-4ff1-adef-7bf9eb98c772") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:12:07.762979 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:07.762890 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2xqx8"
Apr 16 20:12:07.763139 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:07.763039 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2xqx8" podUID="5ea99809-04f6-4ff1-adef-7bf9eb98c772"
Apr 16 20:12:08.758570 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:08.758534 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-622d4"
Apr 16 20:12:08.759124 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:08.758679 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-622d4" podUID="d285ba82-dded-4707-87cb-35b755280286"
Apr 16 20:12:09.758923 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:09.758899 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2xqx8"
Apr 16 20:12:09.759279 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:09.759037 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2xqx8" podUID="5ea99809-04f6-4ff1-adef-7bf9eb98c772"
Apr 16 20:12:10.758454 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:10.758417 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-622d4"
Apr 16 20:12:10.758646 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:10.758557 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-622d4" podUID="d285ba82-dded-4707-87cb-35b755280286"
Apr 16 20:12:11.758610 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:11.758576 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2xqx8"
Apr 16 20:12:11.759046 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:11.758708 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2xqx8" podUID="5ea99809-04f6-4ff1-adef-7bf9eb98c772"
Apr 16 20:12:12.758443 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:12.758408 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-622d4"
Apr 16 20:12:12.758611 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:12.758548 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-622d4" podUID="d285ba82-dded-4707-87cb-35b755280286"
Apr 16 20:12:13.758348 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:13.758313 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2xqx8"
Apr 16 20:12:13.758796 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:13.758445 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2xqx8" podUID="5ea99809-04f6-4ff1-adef-7bf9eb98c772"
Apr 16 20:12:14.758699 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:14.758668 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-622d4"
Apr 16 20:12:14.759157 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:14.758806 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-622d4" podUID="d285ba82-dded-4707-87cb-35b755280286"
Apr 16 20:12:15.354343 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:15.354304 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h9clf\" (UniqueName: \"kubernetes.io/projected/5ea99809-04f6-4ff1-adef-7bf9eb98c772-kube-api-access-h9clf\") pod \"network-check-target-2xqx8\" (UID: \"5ea99809-04f6-4ff1-adef-7bf9eb98c772\") " pod="openshift-network-diagnostics/network-check-target-2xqx8"
Apr 16 20:12:15.354558 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:15.354365 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d285ba82-dded-4707-87cb-35b755280286-metrics-certs\") pod \"network-metrics-daemon-622d4\" (UID: \"d285ba82-dded-4707-87cb-35b755280286\") " pod="openshift-multus/network-metrics-daemon-622d4"
Apr 16 20:12:15.354558 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:15.354482 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:12:15.354558 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:15.354503 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 20:12:15.354558 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:15.354529 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 20:12:15.354558 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:15.354542 2572 projected.go:194] Error preparing data for projected volume kube-api-access-h9clf for pod openshift-network-diagnostics/network-check-target-2xqx8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:12:15.354767 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:15.354542 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d285ba82-dded-4707-87cb-35b755280286-metrics-certs podName:d285ba82-dded-4707-87cb-35b755280286 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:31.354525325 +0000 UTC m=+34.195425693 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d285ba82-dded-4707-87cb-35b755280286-metrics-certs") pod "network-metrics-daemon-622d4" (UID: "d285ba82-dded-4707-87cb-35b755280286") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:12:15.354767 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:15.354605 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ea99809-04f6-4ff1-adef-7bf9eb98c772-kube-api-access-h9clf podName:5ea99809-04f6-4ff1-adef-7bf9eb98c772 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:31.354588118 +0000 UTC m=+34.195488497 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-h9clf" (UniqueName: "kubernetes.io/projected/5ea99809-04f6-4ff1-adef-7bf9eb98c772-kube-api-access-h9clf") pod "network-check-target-2xqx8" (UID: "5ea99809-04f6-4ff1-adef-7bf9eb98c772") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:12:15.758367 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:15.758287 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2xqx8"
Apr 16 20:12:15.758547 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:15.758436 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2xqx8" podUID="5ea99809-04f6-4ff1-adef-7bf9eb98c772"
Apr 16 20:12:16.758268 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:16.758223 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-622d4"
Apr 16 20:12:16.758664 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:16.758372 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-622d4" podUID="d285ba82-dded-4707-87cb-35b755280286"
Apr 16 20:12:17.759143 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:17.759106 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2xqx8"
Apr 16 20:12:17.759625 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:17.759195 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2xqx8" podUID="5ea99809-04f6-4ff1-adef-7bf9eb98c772"
Apr 16 20:12:18.759162 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:18.758857 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-622d4"
Apr 16 20:12:18.759964 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:18.759248 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-622d4" podUID="d285ba82-dded-4707-87cb-35b755280286"
Apr 16 20:12:18.908995 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:18.908903 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mghnf" event={"ID":"b05f1f8c-380a-4929-8648-4854289d7f72","Type":"ContainerStarted","Data":"f805844f49ebf74121d8584c29a7fc9c06a5f4b5c4fd11f4222ef6416e776e62"}
Apr 16 20:12:18.911643 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:18.911618 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp" event={"ID":"6b1cd9c4-abb0-4659-b9b8-0b263412063c","Type":"ContainerStarted","Data":"e0f1c1f0f42990baead002b5dc8fcc62b387a3c5bbd81b9eef012364aa35bfa9"}
Apr 16 20:12:18.911775 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:18.911656 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp" event={"ID":"6b1cd9c4-abb0-4659-b9b8-0b263412063c","Type":"ContainerStarted","Data":"1e1730c0e1a9e334040227d7ab4e1d2290314a44d8656540dc9232cc2d308d66"}
Apr 16 20:12:18.911775 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:18.911671 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp" event={"ID":"6b1cd9c4-abb0-4659-b9b8-0b263412063c","Type":"ContainerStarted","Data":"515e4f83409487f4176446b40d9d4f00fa9807f0367ef5547cc96ed1ad90f3d2"}
Apr 16 20:12:18.911775 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:18.911684 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp" event={"ID":"6b1cd9c4-abb0-4659-b9b8-0b263412063c","Type":"ContainerStarted","Data":"91bec05c8a03230cbe6b073666e6d32e14be0d4e5d209f082bfca97a2daa5f12"}
Apr 16 20:12:18.911775 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:18.911696 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp" event={"ID":"6b1cd9c4-abb0-4659-b9b8-0b263412063c","Type":"ContainerStarted","Data":"837772ed4a62b1ca782e5ae808dbb7a660ade817f225c671481bfbfab11dd215"}
Apr 16 20:12:18.911775 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:18.911709 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp" event={"ID":"6b1cd9c4-abb0-4659-b9b8-0b263412063c","Type":"ContainerStarted","Data":"d6371722c5f84162067713fadf2afd671f040333885211ea00eeade4e71aee6a"}
Apr 16 20:12:18.913009 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:18.912986 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hqvbt" event={"ID":"46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3","Type":"ContainerStarted","Data":"64f591006ef19eb8c431384f54148b4269ba1157870c99404fac6b6c7d83933f"}
Apr 16 20:12:18.914266 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:18.914242 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-r8gdb" event={"ID":"bfde86ea-03d1-4cf4-90b7-76b04a98def5","Type":"ContainerStarted","Data":"f24c3428afd575f6b04c9dd66528c0e1d5f68b6fa12bbe67b881f2f6ff92d678"}
Apr 16 20:12:18.915577 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:18.915557 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fjbmm" event={"ID":"8ee16202-241d-45ac-9219-3363704a708e","Type":"ContainerStarted","Data":"9d89246e56902b78167e8d4d5c89635f4fbc914d2e9cc84648cc2095062e9eaf"}
Apr 16 20:12:18.916931 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:18.916907 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9r8n4" event={"ID":"f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66","Type":"ContainerStarted","Data":"48c02a0a43acccbb24846f5cc22a9ffb736cc9f885852e53da46f5dc17915ade"}
Apr 16 20:12:18.918274 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:18.918253 2572 generic.go:358] "Generic (PLEG): container finished"
podID="10f2b7ab-1884-4d1c-8207-ad7844c2b18f" containerID="34f97be06be1c07ea92dcbcc556ca0237444092eacea456a4abb81fb9d73ab1f" exitCode=0
Apr 16 20:12:18.918369 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:18.918307 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5ft" event={"ID":"10f2b7ab-1884-4d1c-8207-ad7844c2b18f","Type":"ContainerDied","Data":"34f97be06be1c07ea92dcbcc556ca0237444092eacea456a4abb81fb9d73ab1f"}
Apr 16 20:12:18.919612 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:18.919586 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-skfgt" event={"ID":"f067f531-c6e2-4ac4-a01e-8aa1872f6296","Type":"ContainerStarted","Data":"491c3164540d7ec54d28ac9aa3df28d807ee90a2151b3f7b70907ef776f436af"}
Apr 16 20:12:18.933210 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:18.933168 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-hqvbt" podStartSLOduration=4.184403677 podStartE2EDuration="21.933152893s" podCreationTimestamp="2026-04-16 20:11:57 +0000 UTC" firstStartedPulling="2026-04-16 20:12:00.332557101 +0000 UTC m=+3.173457483" lastFinishedPulling="2026-04-16 20:12:18.081306318 +0000 UTC m=+20.922206699" observedRunningTime="2026-04-16 20:12:18.932709201 +0000 UTC m=+21.773609591" watchObservedRunningTime="2026-04-16 20:12:18.933152893 +0000 UTC m=+21.774053286"
Apr 16 20:12:18.972607 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:18.972553 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-fjbmm" podStartSLOduration=4.221055113 podStartE2EDuration="21.972534998s" podCreationTimestamp="2026-04-16 20:11:57 +0000 UTC" firstStartedPulling="2026-04-16 20:12:00.325951755 +0000 UTC m=+3.166852123" lastFinishedPulling="2026-04-16 20:12:18.07743164 +0000 UTC m=+20.918332008" observedRunningTime="2026-04-16 20:12:18.972472835 +0000 UTC m=+21.813373225" watchObservedRunningTime="2026-04-16 20:12:18.972534998 +0000 UTC m=+21.813435389"
Apr 16 20:12:19.010651 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:19.010597 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-9r8n4" podStartSLOduration=4.302522773 podStartE2EDuration="22.010579767s" podCreationTimestamp="2026-04-16 20:11:57 +0000 UTC" firstStartedPulling="2026-04-16 20:12:00.318639702 +0000 UTC m=+3.159540084" lastFinishedPulling="2026-04-16 20:12:18.026696694 +0000 UTC m=+20.867597078" observedRunningTime="2026-04-16 20:12:19.010278784 +0000 UTC m=+21.851179174" watchObservedRunningTime="2026-04-16 20:12:19.010579767 +0000 UTC m=+21.851480159"
Apr 16 20:12:19.010830 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:19.010775 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-r8gdb" podStartSLOduration=4.319621938 podStartE2EDuration="22.010767763s" podCreationTimestamp="2026-04-16 20:11:57 +0000 UTC" firstStartedPulling="2026-04-16 20:12:00.335443909 +0000 UTC m=+3.176344277" lastFinishedPulling="2026-04-16 20:12:18.026589734 +0000 UTC m=+20.867490102" observedRunningTime="2026-04-16 20:12:18.990191804 +0000 UTC m=+21.831092194" watchObservedRunningTime="2026-04-16 20:12:19.010767763 +0000 UTC m=+21.851668157"
Apr 16 20:12:19.029660 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:19.029611 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-skfgt" podStartSLOduration=4.338724197 podStartE2EDuration="22.029593926s" podCreationTimestamp="2026-04-16 20:11:57 +0000 UTC" firstStartedPulling="2026-04-16 20:12:00.335630501 +0000 UTC m=+3.176530873" lastFinishedPulling="2026-04-16 20:12:18.02650022 +0000 UTC m=+20.867400602" observedRunningTime="2026-04-16 20:12:19.028523676 +0000 UTC m=+21.869424066" watchObservedRunningTime="2026-04-16 20:12:19.029593926 +0000 UTC m=+21.870494319"
Apr 16 20:12:19.245636 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:19.245603 2572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 20:12:19.690583 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:19.690426 2572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T20:12:19.245629359Z","UUID":"5d1dacec-c40d-4e60-bbd5-c414ab677874","Handler":null,"Name":"","Endpoint":""}
Apr 16 20:12:19.693427 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:19.693400 2572 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 20:12:19.693427 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:19.693433 2572 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 20:12:19.760416 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:19.760363 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2xqx8"
Apr 16 20:12:19.760848 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:19.760498 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2xqx8" podUID="5ea99809-04f6-4ff1-adef-7bf9eb98c772"
Apr 16 20:12:19.923991 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:19.923949 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4g8jb" event={"ID":"2ecca5d6-192a-4507-b71b-c8d9e9099230","Type":"ContainerStarted","Data":"662f5739069447d7fa57461cb482f587bb313eb4cd2b5871084113647090d25a"}
Apr 16 20:12:19.926578 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:19.926079 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mghnf" event={"ID":"b05f1f8c-380a-4929-8648-4854289d7f72","Type":"ContainerStarted","Data":"1d403169e5ab3da32df69556afb6f17fa4993d8b03a51a19814dfe8b1ed64e1f"}
Apr 16 20:12:19.941120 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:19.941033 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-4g8jb" podStartSLOduration=5.189859481 podStartE2EDuration="22.941017519s" podCreationTimestamp="2026-04-16 20:11:57 +0000 UTC" firstStartedPulling="2026-04-16 20:12:00.327709105 +0000 UTC m=+3.168609484" lastFinishedPulling="2026-04-16 20:12:18.078867148 +0000 UTC m=+20.919767522" observedRunningTime="2026-04-16 20:12:19.940560581 +0000 UTC m=+22.781460974" watchObservedRunningTime="2026-04-16 20:12:19.941017519 +0000 UTC m=+22.781917912"
Apr 16 20:12:20.759173 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:20.759140 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-622d4" Apr 16 20:12:20.759363 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:20.759270 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-622d4" podUID="d285ba82-dded-4707-87cb-35b755280286" Apr 16 20:12:20.930780 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:20.930744 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mghnf" event={"ID":"b05f1f8c-380a-4929-8648-4854289d7f72","Type":"ContainerStarted","Data":"9bd002a59ff74879e823f4ebc2098f9b7a7ce35f04fd30f61f1a8ffbbd9db308"} Apr 16 20:12:20.934499 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:20.934456 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp" event={"ID":"6b1cd9c4-abb0-4659-b9b8-0b263412063c","Type":"ContainerStarted","Data":"6366f58c0833238a57c41d1f8f4f1f910f4d76d0668910d1d59e39d968de8273"} Apr 16 20:12:20.948289 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:20.948237 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mghnf" podStartSLOduration=4.196425331 podStartE2EDuration="23.948218163s" podCreationTimestamp="2026-04-16 20:11:57 +0000 UTC" firstStartedPulling="2026-04-16 20:12:00.323764068 +0000 UTC m=+3.164664440" lastFinishedPulling="2026-04-16 20:12:20.075556897 +0000 UTC m=+22.916457272" observedRunningTime="2026-04-16 20:12:20.948181619 +0000 UTC m=+23.789082018" watchObservedRunningTime="2026-04-16 20:12:20.948218163 +0000 UTC m=+23.789118553" Apr 16 20:12:21.758460 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:21.758420 2572 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2xqx8" Apr 16 20:12:21.758646 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:21.758577 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2xqx8" podUID="5ea99809-04f6-4ff1-adef-7bf9eb98c772" Apr 16 20:12:22.755814 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:22.755782 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-skfgt" Apr 16 20:12:22.756984 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:22.756961 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-skfgt" Apr 16 20:12:22.759089 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:22.759069 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-622d4" Apr 16 20:12:22.759183 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:22.759162 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-622d4" podUID="d285ba82-dded-4707-87cb-35b755280286" Apr 16 20:12:22.938684 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:22.938650 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-skfgt" Apr 16 20:12:22.939138 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:22.939116 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-skfgt" Apr 16 20:12:23.758554 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:23.758352 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2xqx8" Apr 16 20:12:23.759525 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:23.758617 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2xqx8" podUID="5ea99809-04f6-4ff1-adef-7bf9eb98c772" Apr 16 20:12:23.943548 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:23.943512 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp" event={"ID":"6b1cd9c4-abb0-4659-b9b8-0b263412063c","Type":"ContainerStarted","Data":"8dab29b4135b7d30c236a95436827e97066be2e6ed24f8de9bed2bbcf71dfb5d"} Apr 16 20:12:23.943822 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:23.943803 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp" Apr 16 20:12:23.943920 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:23.943858 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp" Apr 16 20:12:23.945195 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:23.945169 2572 generic.go:358] "Generic (PLEG): container finished" podID="10f2b7ab-1884-4d1c-8207-ad7844c2b18f" containerID="ff186bd789dc3527764755245f8e58461f0218c97be07454eeb4b6e719776048" exitCode=0 Apr 16 20:12:23.945290 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:23.945240 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5ft" event={"ID":"10f2b7ab-1884-4d1c-8207-ad7844c2b18f","Type":"ContainerDied","Data":"ff186bd789dc3527764755245f8e58461f0218c97be07454eeb4b6e719776048"} Apr 16 20:12:23.958479 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:23.958460 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp" Apr 16 20:12:23.973362 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:23.973324 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp" podStartSLOduration=8.945787905 podStartE2EDuration="26.973311038s" podCreationTimestamp="2026-04-16 
20:11:57 +0000 UTC" firstStartedPulling="2026-04-16 20:12:00.328067125 +0000 UTC m=+3.168967499" lastFinishedPulling="2026-04-16 20:12:18.355590264 +0000 UTC m=+21.196490632" observedRunningTime="2026-04-16 20:12:23.972063446 +0000 UTC m=+26.812963837" watchObservedRunningTime="2026-04-16 20:12:23.973311038 +0000 UTC m=+26.814211428" Apr 16 20:12:24.758602 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:24.758411 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-622d4" Apr 16 20:12:24.758917 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:24.758730 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-622d4" podUID="d285ba82-dded-4707-87cb-35b755280286" Apr 16 20:12:24.949080 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:24.949047 2572 generic.go:358] "Generic (PLEG): container finished" podID="10f2b7ab-1884-4d1c-8207-ad7844c2b18f" containerID="6586e7a6530880454da8d43d3aac416a6f4f831e1aeb6f639db2522111ff80ff" exitCode=0 Apr 16 20:12:24.949236 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:24.949146 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5ft" event={"ID":"10f2b7ab-1884-4d1c-8207-ad7844c2b18f","Type":"ContainerDied","Data":"6586e7a6530880454da8d43d3aac416a6f4f831e1aeb6f639db2522111ff80ff"} Apr 16 20:12:24.950667 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:24.949893 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp" Apr 16 20:12:24.965252 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:24.965230 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp" Apr 16 20:12:25.128244 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:25.128178 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-622d4"] Apr 16 20:12:25.128442 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:25.128302 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-622d4" Apr 16 20:12:25.128442 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:25.128424 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-622d4" podUID="d285ba82-dded-4707-87cb-35b755280286" Apr 16 20:12:25.128930 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:25.128906 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-2xqx8"] Apr 16 20:12:25.129026 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:25.129011 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2xqx8" Apr 16 20:12:25.129325 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:25.129150 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2xqx8" podUID="5ea99809-04f6-4ff1-adef-7bf9eb98c772" Apr 16 20:12:25.953487 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:25.953456 2572 generic.go:358] "Generic (PLEG): container finished" podID="10f2b7ab-1884-4d1c-8207-ad7844c2b18f" containerID="4991365a76ce521f6bdafa51df530222b27b1bc6677ee1faae3aa0b0acbe9a11" exitCode=0 Apr 16 20:12:25.953988 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:25.953542 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5ft" event={"ID":"10f2b7ab-1884-4d1c-8207-ad7844c2b18f","Type":"ContainerDied","Data":"4991365a76ce521f6bdafa51df530222b27b1bc6677ee1faae3aa0b0acbe9a11"} Apr 16 20:12:26.758648 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:26.758617 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2xqx8" Apr 16 20:12:26.758648 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:26.758638 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-622d4" Apr 16 20:12:26.758885 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:26.758727 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2xqx8" podUID="5ea99809-04f6-4ff1-adef-7bf9eb98c772" Apr 16 20:12:26.758885 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:26.758870 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-622d4" podUID="d285ba82-dded-4707-87cb-35b755280286" Apr 16 20:12:28.758920 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:28.758886 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-622d4" Apr 16 20:12:28.759317 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:28.758886 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2xqx8" Apr 16 20:12:28.759317 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:28.759032 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-622d4" podUID="d285ba82-dded-4707-87cb-35b755280286" Apr 16 20:12:28.759317 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:28.759061 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2xqx8" podUID="5ea99809-04f6-4ff1-adef-7bf9eb98c772" Apr 16 20:12:30.759144 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:30.759117 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2xqx8" Apr 16 20:12:30.759144 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:30.759135 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-622d4" Apr 16 20:12:30.759569 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:30.759230 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2xqx8" podUID="5ea99809-04f6-4ff1-adef-7bf9eb98c772" Apr 16 20:12:30.759569 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:30.759401 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-622d4" podUID="d285ba82-dded-4707-87cb-35b755280286" Apr 16 20:12:30.939335 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:30.939304 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-142.ec2.internal" event="NodeReady" Apr 16 20:12:30.939529 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:30.939493 2572 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 20:12:30.987502 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:30.987471 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-7f5fs"] Apr 16 20:12:31.022009 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:31.021940 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-2pcp4"] Apr 16 20:12:31.022150 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:31.022091 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-7f5fs" Apr 16 20:12:31.024647 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:31.024619 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 20:12:31.024783 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:31.024682 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dstbq\"" Apr 16 20:12:31.024783 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:31.024624 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 20:12:31.044842 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:31.044786 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7f5fs"] Apr 16 20:12:31.044842 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:31.044815 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2pcp4"] Apr 16 20:12:31.045015 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:31.044908 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2pcp4" Apr 16 20:12:31.047771 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:31.047552 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 20:12:31.047771 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:31.047625 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 20:12:31.047771 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:31.047664 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-jqqmg\"" Apr 16 20:12:31.047771 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:31.047744 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 20:12:31.174212 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:31.174174 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26bhg\" (UniqueName: \"kubernetes.io/projected/91eb91d1-3690-4158-98a2-3eecf9955cda-kube-api-access-26bhg\") pod \"dns-default-7f5fs\" (UID: \"91eb91d1-3690-4158-98a2-3eecf9955cda\") " pod="openshift-dns/dns-default-7f5fs" Apr 16 20:12:31.174212 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:31.174215 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/91eb91d1-3690-4158-98a2-3eecf9955cda-config-volume\") pod \"dns-default-7f5fs\" (UID: \"91eb91d1-3690-4158-98a2-3eecf9955cda\") " pod="openshift-dns/dns-default-7f5fs" Apr 16 20:12:31.174491 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:31.174291 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/91eb91d1-3690-4158-98a2-3eecf9955cda-metrics-tls\") pod \"dns-default-7f5fs\" (UID: \"91eb91d1-3690-4158-98a2-3eecf9955cda\") " pod="openshift-dns/dns-default-7f5fs" Apr 16 20:12:31.174491 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:31.174318 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/91eb91d1-3690-4158-98a2-3eecf9955cda-tmp-dir\") pod \"dns-default-7f5fs\" (UID: \"91eb91d1-3690-4158-98a2-3eecf9955cda\") " pod="openshift-dns/dns-default-7f5fs" Apr 16 20:12:31.174491 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:31.174345 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efcf8c22-a25f-4709-a840-c85cec57a1b9-cert\") pod \"ingress-canary-2pcp4\" (UID: \"efcf8c22-a25f-4709-a840-c85cec57a1b9\") " pod="openshift-ingress-canary/ingress-canary-2pcp4" Apr 16 20:12:31.174491 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:31.174368 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bqm8\" (UniqueName: \"kubernetes.io/projected/efcf8c22-a25f-4709-a840-c85cec57a1b9-kube-api-access-7bqm8\") pod \"ingress-canary-2pcp4\" (UID: \"efcf8c22-a25f-4709-a840-c85cec57a1b9\") " pod="openshift-ingress-canary/ingress-canary-2pcp4" Apr 16 20:12:31.275140 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:31.275062 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/91eb91d1-3690-4158-98a2-3eecf9955cda-metrics-tls\") pod \"dns-default-7f5fs\" (UID: \"91eb91d1-3690-4158-98a2-3eecf9955cda\") " pod="openshift-dns/dns-default-7f5fs" Apr 16 20:12:31.275140 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:31.275132 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/91eb91d1-3690-4158-98a2-3eecf9955cda-tmp-dir\") pod \"dns-default-7f5fs\" (UID: \"91eb91d1-3690-4158-98a2-3eecf9955cda\") " pod="openshift-dns/dns-default-7f5fs" Apr 16 20:12:31.275416 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:31.275159 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efcf8c22-a25f-4709-a840-c85cec57a1b9-cert\") pod \"ingress-canary-2pcp4\" (UID: \"efcf8c22-a25f-4709-a840-c85cec57a1b9\") " pod="openshift-ingress-canary/ingress-canary-2pcp4" Apr 16 20:12:31.275416 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:31.275175 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7bqm8\" (UniqueName: \"kubernetes.io/projected/efcf8c22-a25f-4709-a840-c85cec57a1b9-kube-api-access-7bqm8\") pod \"ingress-canary-2pcp4\" (UID: \"efcf8c22-a25f-4709-a840-c85cec57a1b9\") " pod="openshift-ingress-canary/ingress-canary-2pcp4" Apr 16 20:12:31.275416 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:31.275207 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-26bhg\" (UniqueName: \"kubernetes.io/projected/91eb91d1-3690-4158-98a2-3eecf9955cda-kube-api-access-26bhg\") pod \"dns-default-7f5fs\" (UID: \"91eb91d1-3690-4158-98a2-3eecf9955cda\") " pod="openshift-dns/dns-default-7f5fs" Apr 16 20:12:31.275416 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:31.275233 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/91eb91d1-3690-4158-98a2-3eecf9955cda-config-volume\") pod \"dns-default-7f5fs\" (UID: \"91eb91d1-3690-4158-98a2-3eecf9955cda\") " pod="openshift-dns/dns-default-7f5fs" Apr 16 20:12:31.275631 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:31.275592 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/91eb91d1-3690-4158-98a2-3eecf9955cda-tmp-dir\") pod \"dns-default-7f5fs\" (UID: \"91eb91d1-3690-4158-98a2-3eecf9955cda\") " pod="openshift-dns/dns-default-7f5fs" Apr 16 20:12:31.275631 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:31.275206 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:12:31.275728 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:31.275673 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91eb91d1-3690-4158-98a2-3eecf9955cda-metrics-tls podName:91eb91d1-3690-4158-98a2-3eecf9955cda nodeName:}" failed. No retries permitted until 2026-04-16 20:12:31.77565522 +0000 UTC m=+34.616555587 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/91eb91d1-3690-4158-98a2-3eecf9955cda-metrics-tls") pod "dns-default-7f5fs" (UID: "91eb91d1-3690-4158-98a2-3eecf9955cda") : secret "dns-default-metrics-tls" not found Apr 16 20:12:31.275790 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:31.275243 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:12:31.275790 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:31.275783 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efcf8c22-a25f-4709-a840-c85cec57a1b9-cert podName:efcf8c22-a25f-4709-a840-c85cec57a1b9 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:31.775771103 +0000 UTC m=+34.616671474 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/efcf8c22-a25f-4709-a840-c85cec57a1b9-cert") pod "ingress-canary-2pcp4" (UID: "efcf8c22-a25f-4709-a840-c85cec57a1b9") : secret "canary-serving-cert" not found Apr 16 20:12:31.283203 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:31.283179 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/91eb91d1-3690-4158-98a2-3eecf9955cda-config-volume\") pod \"dns-default-7f5fs\" (UID: \"91eb91d1-3690-4158-98a2-3eecf9955cda\") " pod="openshift-dns/dns-default-7f5fs" Apr 16 20:12:31.287681 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:31.287657 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-26bhg\" (UniqueName: \"kubernetes.io/projected/91eb91d1-3690-4158-98a2-3eecf9955cda-kube-api-access-26bhg\") pod \"dns-default-7f5fs\" (UID: \"91eb91d1-3690-4158-98a2-3eecf9955cda\") " pod="openshift-dns/dns-default-7f5fs" Apr 16 20:12:31.287865 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:31.287842 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bqm8\" (UniqueName: \"kubernetes.io/projected/efcf8c22-a25f-4709-a840-c85cec57a1b9-kube-api-access-7bqm8\") pod \"ingress-canary-2pcp4\" (UID: \"efcf8c22-a25f-4709-a840-c85cec57a1b9\") " pod="openshift-ingress-canary/ingress-canary-2pcp4" Apr 16 20:12:31.376198 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:31.376170 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h9clf\" (UniqueName: \"kubernetes.io/projected/5ea99809-04f6-4ff1-adef-7bf9eb98c772-kube-api-access-h9clf\") pod \"network-check-target-2xqx8\" (UID: \"5ea99809-04f6-4ff1-adef-7bf9eb98c772\") " pod="openshift-network-diagnostics/network-check-target-2xqx8" Apr 16 20:12:31.376403 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:31.376244 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d285ba82-dded-4707-87cb-35b755280286-metrics-certs\") pod \"network-metrics-daemon-622d4\" (UID: \"d285ba82-dded-4707-87cb-35b755280286\") " pod="openshift-multus/network-metrics-daemon-622d4" Apr 16 20:12:31.376403 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:31.376336 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:12:31.376403 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:31.376365 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:12:31.376403 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:31.376399 2572 projected.go:194] Error preparing data for projected volume kube-api-access-h9clf for pod openshift-network-diagnostics/network-check-target-2xqx8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:12:31.376622 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:31.376338 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:12:31.376622 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:31.376446 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ea99809-04f6-4ff1-adef-7bf9eb98c772-kube-api-access-h9clf podName:5ea99809-04f6-4ff1-adef-7bf9eb98c772 nodeName:}" failed. No retries permitted until 2026-04-16 20:13:03.376432341 +0000 UTC m=+66.217332708 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-h9clf" (UniqueName: "kubernetes.io/projected/5ea99809-04f6-4ff1-adef-7bf9eb98c772-kube-api-access-h9clf") pod "network-check-target-2xqx8" (UID: "5ea99809-04f6-4ff1-adef-7bf9eb98c772") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:12:31.376622 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:31.376500 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d285ba82-dded-4707-87cb-35b755280286-metrics-certs podName:d285ba82-dded-4707-87cb-35b755280286 nodeName:}" failed. No retries permitted until 2026-04-16 20:13:03.376484381 +0000 UTC m=+66.217384752 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d285ba82-dded-4707-87cb-35b755280286-metrics-certs") pod "network-metrics-daemon-622d4" (UID: "d285ba82-dded-4707-87cb-35b755280286") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:12:31.780587 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:31.780555 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/91eb91d1-3690-4158-98a2-3eecf9955cda-metrics-tls\") pod \"dns-default-7f5fs\" (UID: \"91eb91d1-3690-4158-98a2-3eecf9955cda\") " pod="openshift-dns/dns-default-7f5fs"
Apr 16 20:12:31.780587 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:31.780587 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efcf8c22-a25f-4709-a840-c85cec57a1b9-cert\") pod \"ingress-canary-2pcp4\" (UID: \"efcf8c22-a25f-4709-a840-c85cec57a1b9\") " pod="openshift-ingress-canary/ingress-canary-2pcp4"
Apr 16 20:12:31.781110 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:31.780689 2572 secret.go:189] Couldn't get secret
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:12:31.781110 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:31.780695 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:12:31.781110 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:31.780737 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efcf8c22-a25f-4709-a840-c85cec57a1b9-cert podName:efcf8c22-a25f-4709-a840-c85cec57a1b9 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:32.780723265 +0000 UTC m=+35.621623632 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/efcf8c22-a25f-4709-a840-c85cec57a1b9-cert") pod "ingress-canary-2pcp4" (UID: "efcf8c22-a25f-4709-a840-c85cec57a1b9") : secret "canary-serving-cert" not found
Apr 16 20:12:31.781110 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:31.780764 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91eb91d1-3690-4158-98a2-3eecf9955cda-metrics-tls podName:91eb91d1-3690-4158-98a2-3eecf9955cda nodeName:}" failed. No retries permitted until 2026-04-16 20:12:32.780751055 +0000 UTC m=+35.621651422 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/91eb91d1-3690-4158-98a2-3eecf9955cda-metrics-tls") pod "dns-default-7f5fs" (UID: "91eb91d1-3690-4158-98a2-3eecf9955cda") : secret "dns-default-metrics-tls" not found
Apr 16 20:12:32.758619 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:32.758586 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2xqx8"
Apr 16 20:12:32.758798 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:32.758587 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-622d4"
Apr 16 20:12:32.761356 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:32.761339 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 20:12:32.761542 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:32.761530 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-q2prt\""
Apr 16 20:12:32.762516 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:32.762499 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 20:12:32.762516 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:32.762508 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-8rq6t\""
Apr 16 20:12:32.762632 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:32.762599 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 20:12:32.787529 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:32.787504 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/91eb91d1-3690-4158-98a2-3eecf9955cda-metrics-tls\") pod \"dns-default-7f5fs\" (UID: \"91eb91d1-3690-4158-98a2-3eecf9955cda\") " pod="openshift-dns/dns-default-7f5fs"
Apr 16 20:12:32.787914 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:32.787534 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efcf8c22-a25f-4709-a840-c85cec57a1b9-cert\") pod \"ingress-canary-2pcp4\" (UID: \"efcf8c22-a25f-4709-a840-c85cec57a1b9\") " pod="openshift-ingress-canary/ingress-canary-2pcp4"
Apr 16 20:12:32.787914 ip-10-0-137-142 kubenswrapper[2572]:
E0416 20:12:32.787611 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:12:32.787914 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:32.787644 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:12:32.787914 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:32.787682 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efcf8c22-a25f-4709-a840-c85cec57a1b9-cert podName:efcf8c22-a25f-4709-a840-c85cec57a1b9 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:34.787668064 +0000 UTC m=+37.628568432 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/efcf8c22-a25f-4709-a840-c85cec57a1b9-cert") pod "ingress-canary-2pcp4" (UID: "efcf8c22-a25f-4709-a840-c85cec57a1b9") : secret "canary-serving-cert" not found
Apr 16 20:12:32.787914 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:32.787708 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91eb91d1-3690-4158-98a2-3eecf9955cda-metrics-tls podName:91eb91d1-3690-4158-98a2-3eecf9955cda nodeName:}" failed. No retries permitted until 2026-04-16 20:12:34.787687214 +0000 UTC m=+37.628587588 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/91eb91d1-3690-4158-98a2-3eecf9955cda-metrics-tls") pod "dns-default-7f5fs" (UID: "91eb91d1-3690-4158-98a2-3eecf9955cda") : secret "dns-default-metrics-tls" not found
Apr 16 20:12:32.970016 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:32.969974 2572 generic.go:358] "Generic (PLEG): container finished" podID="10f2b7ab-1884-4d1c-8207-ad7844c2b18f" containerID="b5d3de12429ae8487f91c6b52e006f0c7c45e6989d143c9c122516974bb3ef0c" exitCode=0
Apr 16 20:12:32.970168 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:32.970031 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5ft" event={"ID":"10f2b7ab-1884-4d1c-8207-ad7844c2b18f","Type":"ContainerDied","Data":"b5d3de12429ae8487f91c6b52e006f0c7c45e6989d143c9c122516974bb3ef0c"}
Apr 16 20:12:33.974585 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:33.974555 2572 generic.go:358] "Generic (PLEG): container finished" podID="10f2b7ab-1884-4d1c-8207-ad7844c2b18f" containerID="8059c1615245ce82ea2983d9bc38153eb7c2c25991c017ecfa94c59b8d40f6f7" exitCode=0
Apr 16 20:12:33.974585 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:33.974591 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5ft" event={"ID":"10f2b7ab-1884-4d1c-8207-ad7844c2b18f","Type":"ContainerDied","Data":"8059c1615245ce82ea2983d9bc38153eb7c2c25991c017ecfa94c59b8d40f6f7"}
Apr 16 20:12:34.803202 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:34.803168 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/91eb91d1-3690-4158-98a2-3eecf9955cda-metrics-tls\") pod \"dns-default-7f5fs\" (UID: \"91eb91d1-3690-4158-98a2-3eecf9955cda\") " pod="openshift-dns/dns-default-7f5fs"
Apr 16 20:12:34.803202 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:34.803203 2572 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efcf8c22-a25f-4709-a840-c85cec57a1b9-cert\") pod \"ingress-canary-2pcp4\" (UID: \"efcf8c22-a25f-4709-a840-c85cec57a1b9\") " pod="openshift-ingress-canary/ingress-canary-2pcp4"
Apr 16 20:12:34.803451 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:34.803301 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:12:34.803451 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:34.803318 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:12:34.803451 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:34.803349 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efcf8c22-a25f-4709-a840-c85cec57a1b9-cert podName:efcf8c22-a25f-4709-a840-c85cec57a1b9 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:38.803336304 +0000 UTC m=+41.644236672 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/efcf8c22-a25f-4709-a840-c85cec57a1b9-cert") pod "ingress-canary-2pcp4" (UID: "efcf8c22-a25f-4709-a840-c85cec57a1b9") : secret "canary-serving-cert" not found
Apr 16 20:12:34.803451 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:34.803398 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91eb91d1-3690-4158-98a2-3eecf9955cda-metrics-tls podName:91eb91d1-3690-4158-98a2-3eecf9955cda nodeName:}" failed. No retries permitted until 2026-04-16 20:12:38.803359404 +0000 UTC m=+41.644259777 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/91eb91d1-3690-4158-98a2-3eecf9955cda-metrics-tls") pod "dns-default-7f5fs" (UID: "91eb91d1-3690-4158-98a2-3eecf9955cda") : secret "dns-default-metrics-tls" not found
Apr 16 20:12:34.980613 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:34.980582 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5ft" event={"ID":"10f2b7ab-1884-4d1c-8207-ad7844c2b18f","Type":"ContainerStarted","Data":"dad6f2ec9ddda1ba93b7092304ddbf684304931e654a126a2ecaceb9cb8a126e"}
Apr 16 20:12:35.004598 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:35.004556 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-zq5ft" podStartSLOduration=6.463202086 podStartE2EDuration="38.004542219s" podCreationTimestamp="2026-04-16 20:11:57 +0000 UTC" firstStartedPulling="2026-04-16 20:12:00.336546097 +0000 UTC m=+3.177446467" lastFinishedPulling="2026-04-16 20:12:31.877886219 +0000 UTC m=+34.718786600" observedRunningTime="2026-04-16 20:12:35.003440977 +0000 UTC m=+37.844341368" watchObservedRunningTime="2026-04-16 20:12:35.004542219 +0000 UTC m=+37.845442607"
Apr 16 20:12:38.830780 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:38.830588 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/91eb91d1-3690-4158-98a2-3eecf9955cda-metrics-tls\") pod \"dns-default-7f5fs\" (UID: \"91eb91d1-3690-4158-98a2-3eecf9955cda\") " pod="openshift-dns/dns-default-7f5fs"
Apr 16 20:12:38.830780 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:38.830787 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efcf8c22-a25f-4709-a840-c85cec57a1b9-cert\") pod \"ingress-canary-2pcp4\" (UID: \"efcf8c22-a25f-4709-a840-c85cec57a1b9\") "
pod="openshift-ingress-canary/ingress-canary-2pcp4"
Apr 16 20:12:38.831303 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:38.830735 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:12:38.831303 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:38.830890 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:12:38.831303 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:38.830914 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91eb91d1-3690-4158-98a2-3eecf9955cda-metrics-tls podName:91eb91d1-3690-4158-98a2-3eecf9955cda nodeName:}" failed. No retries permitted until 2026-04-16 20:12:46.830891422 +0000 UTC m=+49.671791801 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/91eb91d1-3690-4158-98a2-3eecf9955cda-metrics-tls") pod "dns-default-7f5fs" (UID: "91eb91d1-3690-4158-98a2-3eecf9955cda") : secret "dns-default-metrics-tls" not found
Apr 16 20:12:38.831303 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:38.830934 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efcf8c22-a25f-4709-a840-c85cec57a1b9-cert podName:efcf8c22-a25f-4709-a840-c85cec57a1b9 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:46.830923184 +0000 UTC m=+49.671823555 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/efcf8c22-a25f-4709-a840-c85cec57a1b9-cert") pod "ingress-canary-2pcp4" (UID: "efcf8c22-a25f-4709-a840-c85cec57a1b9") : secret "canary-serving-cert" not found
Apr 16 20:12:46.886413 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:46.886358 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/91eb91d1-3690-4158-98a2-3eecf9955cda-metrics-tls\") pod \"dns-default-7f5fs\" (UID: \"91eb91d1-3690-4158-98a2-3eecf9955cda\") " pod="openshift-dns/dns-default-7f5fs"
Apr 16 20:12:46.886413 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:46.886414 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efcf8c22-a25f-4709-a840-c85cec57a1b9-cert\") pod \"ingress-canary-2pcp4\" (UID: \"efcf8c22-a25f-4709-a840-c85cec57a1b9\") " pod="openshift-ingress-canary/ingress-canary-2pcp4"
Apr 16 20:12:46.886818 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:46.886509 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:12:46.886818 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:46.886509 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:12:46.886818 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:46.886571 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efcf8c22-a25f-4709-a840-c85cec57a1b9-cert podName:efcf8c22-a25f-4709-a840-c85cec57a1b9 nodeName:}" failed. No retries permitted until 2026-04-16 20:13:02.886555857 +0000 UTC m=+65.727456225 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/efcf8c22-a25f-4709-a840-c85cec57a1b9-cert") pod "ingress-canary-2pcp4" (UID: "efcf8c22-a25f-4709-a840-c85cec57a1b9") : secret "canary-serving-cert" not found
Apr 16 20:12:46.886818 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:12:46.886583 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91eb91d1-3690-4158-98a2-3eecf9955cda-metrics-tls podName:91eb91d1-3690-4158-98a2-3eecf9955cda nodeName:}" failed. No retries permitted until 2026-04-16 20:13:02.886577928 +0000 UTC m=+65.727478295 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/91eb91d1-3690-4158-98a2-3eecf9955cda-metrics-tls") pod "dns-default-7f5fs" (UID: "91eb91d1-3690-4158-98a2-3eecf9955cda") : secret "dns-default-metrics-tls" not found
Apr 16 20:12:56.967388 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:12:56.967349 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mgjgp"
Apr 16 20:13:02.895577 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:13:02.895540 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/91eb91d1-3690-4158-98a2-3eecf9955cda-metrics-tls\") pod \"dns-default-7f5fs\" (UID: \"91eb91d1-3690-4158-98a2-3eecf9955cda\") " pod="openshift-dns/dns-default-7f5fs"
Apr 16 20:13:02.895577 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:13:02.895580 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efcf8c22-a25f-4709-a840-c85cec57a1b9-cert\") pod \"ingress-canary-2pcp4\" (UID: \"efcf8c22-a25f-4709-a840-c85cec57a1b9\") " pod="openshift-ingress-canary/ingress-canary-2pcp4"
Apr 16 20:13:02.896131 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:13:02.895688 2572 secret.go:189] Couldn't get secret
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:13:02.896131 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:13:02.895705 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:13:02.896131 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:13:02.895753 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efcf8c22-a25f-4709-a840-c85cec57a1b9-cert podName:efcf8c22-a25f-4709-a840-c85cec57a1b9 nodeName:}" failed. No retries permitted until 2026-04-16 20:13:34.895739475 +0000 UTC m=+97.736639843 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/efcf8c22-a25f-4709-a840-c85cec57a1b9-cert") pod "ingress-canary-2pcp4" (UID: "efcf8c22-a25f-4709-a840-c85cec57a1b9") : secret "canary-serving-cert" not found
Apr 16 20:13:02.896131 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:13:02.895764 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91eb91d1-3690-4158-98a2-3eecf9955cda-metrics-tls podName:91eb91d1-3690-4158-98a2-3eecf9955cda nodeName:}" failed. No retries permitted until 2026-04-16 20:13:34.895758731 +0000 UTC m=+97.736659099 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/91eb91d1-3690-4158-98a2-3eecf9955cda-metrics-tls") pod "dns-default-7f5fs" (UID: "91eb91d1-3690-4158-98a2-3eecf9955cda") : secret "dns-default-metrics-tls" not found
Apr 16 20:13:03.398690 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:13:03.398651 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h9clf\" (UniqueName: \"kubernetes.io/projected/5ea99809-04f6-4ff1-adef-7bf9eb98c772-kube-api-access-h9clf\") pod \"network-check-target-2xqx8\" (UID: \"5ea99809-04f6-4ff1-adef-7bf9eb98c772\") " pod="openshift-network-diagnostics/network-check-target-2xqx8"
Apr 16 20:13:03.398879 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:13:03.398725 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d285ba82-dded-4707-87cb-35b755280286-metrics-certs\") pod \"network-metrics-daemon-622d4\" (UID: \"d285ba82-dded-4707-87cb-35b755280286\") " pod="openshift-multus/network-metrics-daemon-622d4"
Apr 16 20:13:03.401246 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:13:03.401228 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 20:13:03.401306 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:13:03.401261 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 20:13:03.409313 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:13:03.409292 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 20:13:03.409430 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:13:03.409362 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d285ba82-dded-4707-87cb-35b755280286-metrics-certs podName:d285ba82-dded-4707-87cb-35b755280286
nodeName:}" failed. No retries permitted until 2026-04-16 20:14:07.40934162 +0000 UTC m=+130.250241999 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d285ba82-dded-4707-87cb-35b755280286-metrics-certs") pod "network-metrics-daemon-622d4" (UID: "d285ba82-dded-4707-87cb-35b755280286") : secret "metrics-daemon-secret" not found
Apr 16 20:13:03.411837 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:13:03.411822 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 20:13:03.423437 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:13:03.423413 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9clf\" (UniqueName: \"kubernetes.io/projected/5ea99809-04f6-4ff1-adef-7bf9eb98c772-kube-api-access-h9clf\") pod \"network-check-target-2xqx8\" (UID: \"5ea99809-04f6-4ff1-adef-7bf9eb98c772\") " pod="openshift-network-diagnostics/network-check-target-2xqx8"
Apr 16 20:13:03.671830 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:13:03.671755 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-8rq6t\""
Apr 16 20:13:03.678975 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:13:03.678954 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2xqx8"
Apr 16 20:13:03.870124 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:13:03.870086 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-2xqx8"]
Apr 16 20:13:03.875087 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:13:03.875058 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ea99809_04f6_4ff1_adef_7bf9eb98c772.slice/crio-cf8412962cdd9ce2d264e9922ae55b7fb43cd833845b18f0a873b1b7254e6afe WatchSource:0}: Error finding container cf8412962cdd9ce2d264e9922ae55b7fb43cd833845b18f0a873b1b7254e6afe: Status 404 returned error can't find the container with id cf8412962cdd9ce2d264e9922ae55b7fb43cd833845b18f0a873b1b7254e6afe
Apr 16 20:13:04.034749 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:13:04.034712 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-2xqx8" event={"ID":"5ea99809-04f6-4ff1-adef-7bf9eb98c772","Type":"ContainerStarted","Data":"cf8412962cdd9ce2d264e9922ae55b7fb43cd833845b18f0a873b1b7254e6afe"}
Apr 16 20:13:07.042371 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:13:07.042337 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-2xqx8" event={"ID":"5ea99809-04f6-4ff1-adef-7bf9eb98c772","Type":"ContainerStarted","Data":"8a3d6554fa1a3bb19ecd50d85d529ce9bc11d8e01afab7b7802f38acf1108121"}
Apr 16 20:13:07.042803 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:13:07.042491 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-2xqx8"
Apr 16 20:13:07.059457 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:13:07.059408 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-2xqx8"
podStartSLOduration=67.383594311 podStartE2EDuration="1m10.05939616s" podCreationTimestamp="2026-04-16 20:11:57 +0000 UTC" firstStartedPulling="2026-04-16 20:13:03.876905539 +0000 UTC m=+66.717805907" lastFinishedPulling="2026-04-16 20:13:06.552707387 +0000 UTC m=+69.393607756" observedRunningTime="2026-04-16 20:13:07.058603222 +0000 UTC m=+69.899503612" watchObservedRunningTime="2026-04-16 20:13:07.05939616 +0000 UTC m=+69.900296540"
Apr 16 20:13:34.906654 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:13:34.906618 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/91eb91d1-3690-4158-98a2-3eecf9955cda-metrics-tls\") pod \"dns-default-7f5fs\" (UID: \"91eb91d1-3690-4158-98a2-3eecf9955cda\") " pod="openshift-dns/dns-default-7f5fs"
Apr 16 20:13:34.906654 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:13:34.906661 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efcf8c22-a25f-4709-a840-c85cec57a1b9-cert\") pod \"ingress-canary-2pcp4\" (UID: \"efcf8c22-a25f-4709-a840-c85cec57a1b9\") " pod="openshift-ingress-canary/ingress-canary-2pcp4"
Apr 16 20:13:34.907125 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:13:34.906759 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:13:34.907125 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:13:34.906763 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:13:34.907125 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:13:34.906820 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efcf8c22-a25f-4709-a840-c85cec57a1b9-cert podName:efcf8c22-a25f-4709-a840-c85cec57a1b9 nodeName:}" failed.
No retries permitted until 2026-04-16 20:14:38.906806963 +0000 UTC m=+161.747707335 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/efcf8c22-a25f-4709-a840-c85cec57a1b9-cert") pod "ingress-canary-2pcp4" (UID: "efcf8c22-a25f-4709-a840-c85cec57a1b9") : secret "canary-serving-cert" not found
Apr 16 20:13:34.907125 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:13:34.906832 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91eb91d1-3690-4158-98a2-3eecf9955cda-metrics-tls podName:91eb91d1-3690-4158-98a2-3eecf9955cda nodeName:}" failed. No retries permitted until 2026-04-16 20:14:38.906826783 +0000 UTC m=+161.747727154 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/91eb91d1-3690-4158-98a2-3eecf9955cda-metrics-tls") pod "dns-default-7f5fs" (UID: "91eb91d1-3690-4158-98a2-3eecf9955cda") : secret "dns-default-metrics-tls" not found
Apr 16 20:13:38.046696 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:13:38.046665 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-2xqx8"
Apr 16 20:14:07.429347 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:07.429288 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d285ba82-dded-4707-87cb-35b755280286-metrics-certs\") pod \"network-metrics-daemon-622d4\" (UID: \"d285ba82-dded-4707-87cb-35b755280286\") " pod="openshift-multus/network-metrics-daemon-622d4"
Apr 16 20:14:07.429853 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:14:07.429457 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 20:14:07.429853 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:14:07.429534 2572 nestedpendingoperations.go:348] Operation for
"{volumeName:kubernetes.io/secret/d285ba82-dded-4707-87cb-35b755280286-metrics-certs podName:d285ba82-dded-4707-87cb-35b755280286 nodeName:}" failed. No retries permitted until 2026-04-16 20:16:09.429516679 +0000 UTC m=+252.270417047 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d285ba82-dded-4707-87cb-35b755280286-metrics-certs") pod "network-metrics-daemon-622d4" (UID: "d285ba82-dded-4707-87cb-35b755280286") : secret "metrics-daemon-secret" not found
Apr 16 20:14:12.422270 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.422231 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4t8vr"]
Apr 16 20:14:12.424185 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.424169 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4t8vr"
Apr 16 20:14:12.427594 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.427571 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 16 20:14:12.427826 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.427809 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 16 20:14:12.428637 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.428617 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-hpjbd\""
Apr 16 20:14:12.428762 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.428665 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 16 20:14:12.429673 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.429649
2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-j4pl4"]
Apr 16 20:14:12.431428 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.431414 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-j4pl4"
Apr 16 20:14:12.433693 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.433675 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 16 20:14:12.433786 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.433712 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-ctxmz\""
Apr 16 20:14:12.434236 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.434221 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 16 20:14:12.434442 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.434426 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 16 20:14:12.435089 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.435069 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 16 20:14:12.440577 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.440558 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4t8vr"]
Apr 16 20:14:12.460468 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.460438 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-j4pl4"]
Apr 16 20:14:12.464398 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.462303 2572 reflector.go:430] "Caches populated"
type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 16 20:14:12.527997 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.527968 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t2b77"] Apr 16 20:14:12.530350 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.530332 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t2b77" Apr 16 20:14:12.567723 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.567694 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d45752f8-dcbb-41c1-9b95-e5c8562cd79a-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-4t8vr\" (UID: \"d45752f8-dcbb-41c1-9b95-e5c8562cd79a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4t8vr" Apr 16 20:14:12.567853 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.567728 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80a9a559-faef-43f2-ae15-5d0a784691b5-config\") pod \"console-operator-9d4b6777b-j4pl4\" (UID: \"80a9a559-faef-43f2-ae15-5d0a784691b5\") " pod="openshift-console-operator/console-operator-9d4b6777b-j4pl4" Apr 16 20:14:12.567853 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.567748 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/80a9a559-faef-43f2-ae15-5d0a784691b5-trusted-ca\") pod \"console-operator-9d4b6777b-j4pl4\" (UID: \"80a9a559-faef-43f2-ae15-5d0a784691b5\") " pod="openshift-console-operator/console-operator-9d4b6777b-j4pl4" Apr 16 20:14:12.567853 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.567830 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk5zv\" (UniqueName: \"kubernetes.io/projected/d45752f8-dcbb-41c1-9b95-e5c8562cd79a-kube-api-access-xk5zv\") pod \"cluster-samples-operator-6dc5bdb6b4-4t8vr\" (UID: \"d45752f8-dcbb-41c1-9b95-e5c8562cd79a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4t8vr" Apr 16 20:14:12.567853 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.567850 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcmjv\" (UniqueName: \"kubernetes.io/projected/80a9a559-faef-43f2-ae15-5d0a784691b5-kube-api-access-wcmjv\") pod \"console-operator-9d4b6777b-j4pl4\" (UID: \"80a9a559-faef-43f2-ae15-5d0a784691b5\") " pod="openshift-console-operator/console-operator-9d4b6777b-j4pl4" Apr 16 20:14:12.567990 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.567867 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80a9a559-faef-43f2-ae15-5d0a784691b5-serving-cert\") pod \"console-operator-9d4b6777b-j4pl4\" (UID: \"80a9a559-faef-43f2-ae15-5d0a784691b5\") " pod="openshift-console-operator/console-operator-9d4b6777b-j4pl4" Apr 16 20:14:12.586064 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.586035 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 16 20:14:12.593786 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.593763 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-mxwph\"" Apr 16 20:14:12.604186 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.604165 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 16 
20:14:12.604396 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.604361 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 16 20:14:12.604457 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.604439 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 16 20:14:12.608565 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.608546 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t2b77"] Apr 16 20:14:12.668256 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.668226 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae3fadf5-e6f2-4567-ac33-4eb8cbd2ef9e-serving-cert\") pod \"service-ca-operator-d6fc45fc5-t2b77\" (UID: \"ae3fadf5-e6f2-4567-ac33-4eb8cbd2ef9e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t2b77" Apr 16 20:14:12.668411 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.668263 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae3fadf5-e6f2-4567-ac33-4eb8cbd2ef9e-config\") pod \"service-ca-operator-d6fc45fc5-t2b77\" (UID: \"ae3fadf5-e6f2-4567-ac33-4eb8cbd2ef9e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t2b77" Apr 16 20:14:12.668411 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.668292 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xk5zv\" (UniqueName: \"kubernetes.io/projected/d45752f8-dcbb-41c1-9b95-e5c8562cd79a-kube-api-access-xk5zv\") pod \"cluster-samples-operator-6dc5bdb6b4-4t8vr\" (UID: \"d45752f8-dcbb-41c1-9b95-e5c8562cd79a\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4t8vr" Apr 16 20:14:12.668411 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.668318 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wcmjv\" (UniqueName: \"kubernetes.io/projected/80a9a559-faef-43f2-ae15-5d0a784691b5-kube-api-access-wcmjv\") pod \"console-operator-9d4b6777b-j4pl4\" (UID: \"80a9a559-faef-43f2-ae15-5d0a784691b5\") " pod="openshift-console-operator/console-operator-9d4b6777b-j4pl4" Apr 16 20:14:12.668411 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.668346 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80a9a559-faef-43f2-ae15-5d0a784691b5-serving-cert\") pod \"console-operator-9d4b6777b-j4pl4\" (UID: \"80a9a559-faef-43f2-ae15-5d0a784691b5\") " pod="openshift-console-operator/console-operator-9d4b6777b-j4pl4" Apr 16 20:14:12.668579 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.668553 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d45752f8-dcbb-41c1-9b95-e5c8562cd79a-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-4t8vr\" (UID: \"d45752f8-dcbb-41c1-9b95-e5c8562cd79a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4t8vr" Apr 16 20:14:12.668629 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.668587 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80a9a559-faef-43f2-ae15-5d0a784691b5-config\") pod \"console-operator-9d4b6777b-j4pl4\" (UID: \"80a9a559-faef-43f2-ae15-5d0a784691b5\") " pod="openshift-console-operator/console-operator-9d4b6777b-j4pl4" Apr 16 20:14:12.668629 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.668613 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/80a9a559-faef-43f2-ae15-5d0a784691b5-trusted-ca\") pod \"console-operator-9d4b6777b-j4pl4\" (UID: \"80a9a559-faef-43f2-ae15-5d0a784691b5\") " pod="openshift-console-operator/console-operator-9d4b6777b-j4pl4" Apr 16 20:14:12.668724 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.668677 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m49pp\" (UniqueName: \"kubernetes.io/projected/ae3fadf5-e6f2-4567-ac33-4eb8cbd2ef9e-kube-api-access-m49pp\") pod \"service-ca-operator-d6fc45fc5-t2b77\" (UID: \"ae3fadf5-e6f2-4567-ac33-4eb8cbd2ef9e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t2b77" Apr 16 20:14:12.668724 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:14:12.668692 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 20:14:12.668821 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:14:12.668757 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d45752f8-dcbb-41c1-9b95-e5c8562cd79a-samples-operator-tls podName:d45752f8-dcbb-41c1-9b95-e5c8562cd79a nodeName:}" failed. No retries permitted until 2026-04-16 20:14:13.16873726 +0000 UTC m=+136.009637648 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d45752f8-dcbb-41c1-9b95-e5c8562cd79a-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-4t8vr" (UID: "d45752f8-dcbb-41c1-9b95-e5c8562cd79a") : secret "samples-operator-tls" not found Apr 16 20:14:12.669234 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.669213 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80a9a559-faef-43f2-ae15-5d0a784691b5-config\") pod \"console-operator-9d4b6777b-j4pl4\" (UID: \"80a9a559-faef-43f2-ae15-5d0a784691b5\") " pod="openshift-console-operator/console-operator-9d4b6777b-j4pl4" Apr 16 20:14:12.669405 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.669363 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/80a9a559-faef-43f2-ae15-5d0a784691b5-trusted-ca\") pod \"console-operator-9d4b6777b-j4pl4\" (UID: \"80a9a559-faef-43f2-ae15-5d0a784691b5\") " pod="openshift-console-operator/console-operator-9d4b6777b-j4pl4" Apr 16 20:14:12.670886 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.670860 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80a9a559-faef-43f2-ae15-5d0a784691b5-serving-cert\") pod \"console-operator-9d4b6777b-j4pl4\" (UID: \"80a9a559-faef-43f2-ae15-5d0a784691b5\") " pod="openshift-console-operator/console-operator-9d4b6777b-j4pl4" Apr 16 20:14:12.676724 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.676666 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk5zv\" (UniqueName: \"kubernetes.io/projected/d45752f8-dcbb-41c1-9b95-e5c8562cd79a-kube-api-access-xk5zv\") pod \"cluster-samples-operator-6dc5bdb6b4-4t8vr\" (UID: \"d45752f8-dcbb-41c1-9b95-e5c8562cd79a\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4t8vr" Apr 16 20:14:12.678223 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.678208 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcmjv\" (UniqueName: \"kubernetes.io/projected/80a9a559-faef-43f2-ae15-5d0a784691b5-kube-api-access-wcmjv\") pod \"console-operator-9d4b6777b-j4pl4\" (UID: \"80a9a559-faef-43f2-ae15-5d0a784691b5\") " pod="openshift-console-operator/console-operator-9d4b6777b-j4pl4" Apr 16 20:14:12.740393 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.740346 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-j4pl4" Apr 16 20:14:12.769686 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.769658 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m49pp\" (UniqueName: \"kubernetes.io/projected/ae3fadf5-e6f2-4567-ac33-4eb8cbd2ef9e-kube-api-access-m49pp\") pod \"service-ca-operator-d6fc45fc5-t2b77\" (UID: \"ae3fadf5-e6f2-4567-ac33-4eb8cbd2ef9e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t2b77" Apr 16 20:14:12.769812 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.769698 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae3fadf5-e6f2-4567-ac33-4eb8cbd2ef9e-serving-cert\") pod \"service-ca-operator-d6fc45fc5-t2b77\" (UID: \"ae3fadf5-e6f2-4567-ac33-4eb8cbd2ef9e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t2b77" Apr 16 20:14:12.769812 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.769720 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae3fadf5-e6f2-4567-ac33-4eb8cbd2ef9e-config\") pod \"service-ca-operator-d6fc45fc5-t2b77\" (UID: \"ae3fadf5-e6f2-4567-ac33-4eb8cbd2ef9e\") " 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t2b77" Apr 16 20:14:12.770218 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.770201 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae3fadf5-e6f2-4567-ac33-4eb8cbd2ef9e-config\") pod \"service-ca-operator-d6fc45fc5-t2b77\" (UID: \"ae3fadf5-e6f2-4567-ac33-4eb8cbd2ef9e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t2b77" Apr 16 20:14:12.772083 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.772062 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae3fadf5-e6f2-4567-ac33-4eb8cbd2ef9e-serving-cert\") pod \"service-ca-operator-d6fc45fc5-t2b77\" (UID: \"ae3fadf5-e6f2-4567-ac33-4eb8cbd2ef9e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t2b77" Apr 16 20:14:12.779386 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.779353 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m49pp\" (UniqueName: \"kubernetes.io/projected/ae3fadf5-e6f2-4567-ac33-4eb8cbd2ef9e-kube-api-access-m49pp\") pod \"service-ca-operator-d6fc45fc5-t2b77\" (UID: \"ae3fadf5-e6f2-4567-ac33-4eb8cbd2ef9e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t2b77" Apr 16 20:14:12.839840 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.839813 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t2b77" Apr 16 20:14:12.852095 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.852070 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-j4pl4"] Apr 16 20:14:12.855020 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:14:12.854995 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80a9a559_faef_43f2_ae15_5d0a784691b5.slice/crio-458db1fc93e730513fd6ec3402ce2e6d233fcf339b068497b5b0f804cdca3569 WatchSource:0}: Error finding container 458db1fc93e730513fd6ec3402ce2e6d233fcf339b068497b5b0f804cdca3569: Status 404 returned error can't find the container with id 458db1fc93e730513fd6ec3402ce2e6d233fcf339b068497b5b0f804cdca3569 Apr 16 20:14:12.961277 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:12.961250 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t2b77"] Apr 16 20:14:12.964292 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:14:12.964265 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae3fadf5_e6f2_4567_ac33_4eb8cbd2ef9e.slice/crio-cb16f6682388e30b1cc22d33314bc7a19fcacdff3b002314f0c30d20ae3d6b3c WatchSource:0}: Error finding container cb16f6682388e30b1cc22d33314bc7a19fcacdff3b002314f0c30d20ae3d6b3c: Status 404 returned error can't find the container with id cb16f6682388e30b1cc22d33314bc7a19fcacdff3b002314f0c30d20ae3d6b3c Apr 16 20:14:13.166176 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:13.166139 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t2b77" event={"ID":"ae3fadf5-e6f2-4567-ac33-4eb8cbd2ef9e","Type":"ContainerStarted","Data":"cb16f6682388e30b1cc22d33314bc7a19fcacdff3b002314f0c30d20ae3d6b3c"} Apr 16 
20:14:13.166929 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:13.166904 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-j4pl4" event={"ID":"80a9a559-faef-43f2-ae15-5d0a784691b5","Type":"ContainerStarted","Data":"458db1fc93e730513fd6ec3402ce2e6d233fcf339b068497b5b0f804cdca3569"} Apr 16 20:14:13.174292 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:13.174275 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d45752f8-dcbb-41c1-9b95-e5c8562cd79a-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-4t8vr\" (UID: \"d45752f8-dcbb-41c1-9b95-e5c8562cd79a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4t8vr" Apr 16 20:14:13.174412 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:14:13.174394 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 20:14:13.174456 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:14:13.174451 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d45752f8-dcbb-41c1-9b95-e5c8562cd79a-samples-operator-tls podName:d45752f8-dcbb-41c1-9b95-e5c8562cd79a nodeName:}" failed. No retries permitted until 2026-04-16 20:14:14.174438483 +0000 UTC m=+137.015338852 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d45752f8-dcbb-41c1-9b95-e5c8562cd79a-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-4t8vr" (UID: "d45752f8-dcbb-41c1-9b95-e5c8562cd79a") : secret "samples-operator-tls" not found Apr 16 20:14:14.181629 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:14.181593 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d45752f8-dcbb-41c1-9b95-e5c8562cd79a-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-4t8vr\" (UID: \"d45752f8-dcbb-41c1-9b95-e5c8562cd79a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4t8vr" Apr 16 20:14:14.182074 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:14:14.181724 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 20:14:14.182074 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:14:14.181801 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d45752f8-dcbb-41c1-9b95-e5c8562cd79a-samples-operator-tls podName:d45752f8-dcbb-41c1-9b95-e5c8562cd79a nodeName:}" failed. No retries permitted until 2026-04-16 20:14:16.181781668 +0000 UTC m=+139.022682035 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d45752f8-dcbb-41c1-9b95-e5c8562cd79a-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-4t8vr" (UID: "d45752f8-dcbb-41c1-9b95-e5c8562cd79a") : secret "samples-operator-tls" not found Apr 16 20:14:16.172725 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:16.172694 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4pl4_80a9a559-faef-43f2-ae15-5d0a784691b5/console-operator/0.log" Apr 16 20:14:16.173152 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:16.172734 2572 generic.go:358] "Generic (PLEG): container finished" podID="80a9a559-faef-43f2-ae15-5d0a784691b5" containerID="a8cd145b33bfde1d9b51cc1036b5afbd092b80376309962cf62bd59087a1e54b" exitCode=255 Apr 16 20:14:16.173152 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:16.172812 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-j4pl4" event={"ID":"80a9a559-faef-43f2-ae15-5d0a784691b5","Type":"ContainerDied","Data":"a8cd145b33bfde1d9b51cc1036b5afbd092b80376309962cf62bd59087a1e54b"} Apr 16 20:14:16.173152 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:16.173059 2572 scope.go:117] "RemoveContainer" containerID="a8cd145b33bfde1d9b51cc1036b5afbd092b80376309962cf62bd59087a1e54b" Apr 16 20:14:16.174150 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:16.174121 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t2b77" event={"ID":"ae3fadf5-e6f2-4567-ac33-4eb8cbd2ef9e","Type":"ContainerStarted","Data":"1aeb307a2048ac34d975df222c47d9169a442fb7c360752973e554aea134e227"} Apr 16 20:14:16.196747 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:16.196687 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/d45752f8-dcbb-41c1-9b95-e5c8562cd79a-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-4t8vr\" (UID: \"d45752f8-dcbb-41c1-9b95-e5c8562cd79a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4t8vr" Apr 16 20:14:16.196872 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:14:16.196852 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 20:14:16.196944 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:14:16.196934 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d45752f8-dcbb-41c1-9b95-e5c8562cd79a-samples-operator-tls podName:d45752f8-dcbb-41c1-9b95-e5c8562cd79a nodeName:}" failed. No retries permitted until 2026-04-16 20:14:20.196917719 +0000 UTC m=+143.037818090 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d45752f8-dcbb-41c1-9b95-e5c8562cd79a-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-4t8vr" (UID: "d45752f8-dcbb-41c1-9b95-e5c8562cd79a") : secret "samples-operator-tls" not found Apr 16 20:14:16.229015 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:16.228977 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t2b77" podStartSLOduration=2.067090422 podStartE2EDuration="4.228962163s" podCreationTimestamp="2026-04-16 20:14:12 +0000 UTC" firstStartedPulling="2026-04-16 20:14:12.966031258 +0000 UTC m=+135.806931626" lastFinishedPulling="2026-04-16 20:14:15.127902997 +0000 UTC m=+137.968803367" observedRunningTime="2026-04-16 20:14:16.228533098 +0000 UTC m=+139.069433488" watchObservedRunningTime="2026-04-16 20:14:16.228962163 +0000 UTC m=+139.069862552" Apr 16 20:14:16.286532 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:16.286505 2572 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-k89vg"] Apr 16 20:14:16.288291 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:16.288272 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-k89vg" Apr 16 20:14:16.290765 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:16.290748 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 16 20:14:16.290921 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:16.290907 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 16 20:14:16.291100 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:16.291083 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-9ctrn\"" Apr 16 20:14:16.301146 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:16.301118 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-k89vg"] Apr 16 20:14:16.398648 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:16.398618 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ft2l\" (UniqueName: \"kubernetes.io/projected/4f6d31f0-9d73-4517-8bee-56833892973e-kube-api-access-7ft2l\") pod \"migrator-74bb7799d9-k89vg\" (UID: \"4f6d31f0-9d73-4517-8bee-56833892973e\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-k89vg" Apr 16 20:14:16.500023 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:16.499980 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7ft2l\" (UniqueName: \"kubernetes.io/projected/4f6d31f0-9d73-4517-8bee-56833892973e-kube-api-access-7ft2l\") pod 
\"migrator-74bb7799d9-k89vg\" (UID: \"4f6d31f0-9d73-4517-8bee-56833892973e\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-k89vg" Apr 16 20:14:16.508068 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:16.508046 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ft2l\" (UniqueName: \"kubernetes.io/projected/4f6d31f0-9d73-4517-8bee-56833892973e-kube-api-access-7ft2l\") pod \"migrator-74bb7799d9-k89vg\" (UID: \"4f6d31f0-9d73-4517-8bee-56833892973e\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-k89vg" Apr 16 20:14:16.621769 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:16.621730 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-k89vg" Apr 16 20:14:16.736862 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:16.736833 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-k89vg"] Apr 16 20:14:16.739547 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:14:16.739519 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f6d31f0_9d73_4517_8bee_56833892973e.slice/crio-4b9d91f16f0030440ff09902cd7f424816f7e1c024471259991deee6845b378a WatchSource:0}: Error finding container 4b9d91f16f0030440ff09902cd7f424816f7e1c024471259991deee6845b378a: Status 404 returned error can't find the container with id 4b9d91f16f0030440ff09902cd7f424816f7e1c024471259991deee6845b378a Apr 16 20:14:17.177870 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:17.177831 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-k89vg" event={"ID":"4f6d31f0-9d73-4517-8bee-56833892973e","Type":"ContainerStarted","Data":"4b9d91f16f0030440ff09902cd7f424816f7e1c024471259991deee6845b378a"} Apr 16 20:14:17.179262 ip-10-0-137-142 
kubenswrapper[2572]: I0416 20:14:17.179236 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4pl4_80a9a559-faef-43f2-ae15-5d0a784691b5/console-operator/1.log" Apr 16 20:14:17.179770 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:17.179743 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4pl4_80a9a559-faef-43f2-ae15-5d0a784691b5/console-operator/0.log" Apr 16 20:14:17.179897 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:17.179780 2572 generic.go:358] "Generic (PLEG): container finished" podID="80a9a559-faef-43f2-ae15-5d0a784691b5" containerID="3e91ff99e3497a5eab7e7872f391b37992e0edb0e9324f46b762d6c709a9b111" exitCode=255 Apr 16 20:14:17.179897 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:17.179811 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-j4pl4" event={"ID":"80a9a559-faef-43f2-ae15-5d0a784691b5","Type":"ContainerDied","Data":"3e91ff99e3497a5eab7e7872f391b37992e0edb0e9324f46b762d6c709a9b111"} Apr 16 20:14:17.179897 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:17.179859 2572 scope.go:117] "RemoveContainer" containerID="a8cd145b33bfde1d9b51cc1036b5afbd092b80376309962cf62bd59087a1e54b" Apr 16 20:14:17.180395 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:17.180361 2572 scope.go:117] "RemoveContainer" containerID="3e91ff99e3497a5eab7e7872f391b37992e0edb0e9324f46b762d6c709a9b111" Apr 16 20:14:17.180601 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:14:17.180568 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-j4pl4_openshift-console-operator(80a9a559-faef-43f2-ae15-5d0a784691b5)\"" pod="openshift-console-operator/console-operator-9d4b6777b-j4pl4" 
podUID="80a9a559-faef-43f2-ae15-5d0a784691b5" Apr 16 20:14:18.183691 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:18.183602 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-k89vg" event={"ID":"4f6d31f0-9d73-4517-8bee-56833892973e","Type":"ContainerStarted","Data":"8b802c0c673b053f329b4c8f2d702d0b4150742e73e6f87f91d8e8627eb4785b"} Apr 16 20:14:18.183691 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:18.183641 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-k89vg" event={"ID":"4f6d31f0-9d73-4517-8bee-56833892973e","Type":"ContainerStarted","Data":"3cfd118adf18acd6d972c93dc416d3853a1ed19915a1e21ea8e16bb4b05044a5"} Apr 16 20:14:18.184929 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:18.184910 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4pl4_80a9a559-faef-43f2-ae15-5d0a784691b5/console-operator/1.log" Apr 16 20:14:18.185201 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:18.185188 2572 scope.go:117] "RemoveContainer" containerID="3e91ff99e3497a5eab7e7872f391b37992e0edb0e9324f46b762d6c709a9b111" Apr 16 20:14:18.185343 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:14:18.185326 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-j4pl4_openshift-console-operator(80a9a559-faef-43f2-ae15-5d0a784691b5)\"" pod="openshift-console-operator/console-operator-9d4b6777b-j4pl4" podUID="80a9a559-faef-43f2-ae15-5d0a784691b5" Apr 16 20:14:18.200323 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:18.200284 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-k89vg" podStartSLOduration=1.170610762 
podStartE2EDuration="2.200270314s" podCreationTimestamp="2026-04-16 20:14:16 +0000 UTC" firstStartedPulling="2026-04-16 20:14:16.741320629 +0000 UTC m=+139.582220997" lastFinishedPulling="2026-04-16 20:14:17.770980179 +0000 UTC m=+140.611880549" observedRunningTime="2026-04-16 20:14:18.199742536 +0000 UTC m=+141.040642926" watchObservedRunningTime="2026-04-16 20:14:18.200270314 +0000 UTC m=+141.041170705" Apr 16 20:14:19.224765 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:19.224738 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-fjbmm_8ee16202-241d-45ac-9219-3363704a708e/dns-node-resolver/0.log" Apr 16 20:14:20.229394 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:20.229337 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d45752f8-dcbb-41c1-9b95-e5c8562cd79a-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-4t8vr\" (UID: \"d45752f8-dcbb-41c1-9b95-e5c8562cd79a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4t8vr" Apr 16 20:14:20.229759 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:14:20.229485 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 20:14:20.229759 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:14:20.229561 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d45752f8-dcbb-41c1-9b95-e5c8562cd79a-samples-operator-tls podName:d45752f8-dcbb-41c1-9b95-e5c8562cd79a nodeName:}" failed. No retries permitted until 2026-04-16 20:14:28.229545581 +0000 UTC m=+151.070445952 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d45752f8-dcbb-41c1-9b95-e5c8562cd79a-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-4t8vr" (UID: "d45752f8-dcbb-41c1-9b95-e5c8562cd79a") : secret "samples-operator-tls" not found Apr 16 20:14:20.423523 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:20.423491 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-r8gdb_bfde86ea-03d1-4cf4-90b7-76b04a98def5/node-ca/0.log" Apr 16 20:14:21.424128 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:21.424093 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-k89vg_4f6d31f0-9d73-4517-8bee-56833892973e/migrator/0.log" Apr 16 20:14:21.637298 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:21.637265 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-k89vg_4f6d31f0-9d73-4517-8bee-56833892973e/graceful-termination/0.log" Apr 16 20:14:22.740488 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:22.740458 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-j4pl4" Apr 16 20:14:22.740488 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:22.740488 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-j4pl4" Apr 16 20:14:22.740888 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:22.740829 2572 scope.go:117] "RemoveContainer" containerID="3e91ff99e3497a5eab7e7872f391b37992e0edb0e9324f46b762d6c709a9b111" Apr 16 20:14:22.740993 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:14:22.740976 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=console-operator pod=console-operator-9d4b6777b-j4pl4_openshift-console-operator(80a9a559-faef-43f2-ae15-5d0a784691b5)\"" pod="openshift-console-operator/console-operator-9d4b6777b-j4pl4" podUID="80a9a559-faef-43f2-ae15-5d0a784691b5" Apr 16 20:14:28.289696 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:28.289658 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d45752f8-dcbb-41c1-9b95-e5c8562cd79a-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-4t8vr\" (UID: \"d45752f8-dcbb-41c1-9b95-e5c8562cd79a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4t8vr" Apr 16 20:14:28.292224 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:28.292202 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d45752f8-dcbb-41c1-9b95-e5c8562cd79a-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-4t8vr\" (UID: \"d45752f8-dcbb-41c1-9b95-e5c8562cd79a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4t8vr" Apr 16 20:14:28.333320 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:28.333285 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4t8vr" Apr 16 20:14:28.450581 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:28.450549 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4t8vr"] Apr 16 20:14:29.206257 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:29.206213 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4t8vr" event={"ID":"d45752f8-dcbb-41c1-9b95-e5c8562cd79a","Type":"ContainerStarted","Data":"0fd7eda66a0ac1610a2140d72cf32cf1635972ccabedfd8937b1aa45700c5910"} Apr 16 20:14:30.210925 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:30.210891 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4t8vr" event={"ID":"d45752f8-dcbb-41c1-9b95-e5c8562cd79a","Type":"ContainerStarted","Data":"a9942cc7d45b1110129ebcdf07fc68e22a9fdcb1b0a8c7a395eb34ebe0698ab2"} Apr 16 20:14:30.211362 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:30.210929 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4t8vr" event={"ID":"d45752f8-dcbb-41c1-9b95-e5c8562cd79a","Type":"ContainerStarted","Data":"abc722ac3915157e821281f3b88e2f32e23e7ad39875789e3b4d401ea8ede59c"} Apr 16 20:14:34.032921 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:14:34.032879 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-7f5fs" podUID="91eb91d1-3690-4158-98a2-3eecf9955cda" Apr 16 20:14:34.055217 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:14:34.055177 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process 
volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-2pcp4" podUID="efcf8c22-a25f-4709-a840-c85cec57a1b9" Apr 16 20:14:34.219954 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:34.219925 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7f5fs" Apr 16 20:14:34.219954 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:34.219961 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2pcp4" Apr 16 20:14:34.758827 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:34.758800 2572 scope.go:117] "RemoveContainer" containerID="3e91ff99e3497a5eab7e7872f391b37992e0edb0e9324f46b762d6c709a9b111" Apr 16 20:14:35.223513 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:35.223486 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4pl4_80a9a559-faef-43f2-ae15-5d0a784691b5/console-operator/2.log" Apr 16 20:14:35.223886 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:35.223843 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4pl4_80a9a559-faef-43f2-ae15-5d0a784691b5/console-operator/1.log" Apr 16 20:14:35.223886 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:35.223874 2572 generic.go:358] "Generic (PLEG): container finished" podID="80a9a559-faef-43f2-ae15-5d0a784691b5" containerID="4556f4e0d733068baa40bd6dad23be734274a6c5ed5e53b0c8f36cdda531a10b" exitCode=255 Apr 16 20:14:35.223972 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:35.223934 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-j4pl4" event={"ID":"80a9a559-faef-43f2-ae15-5d0a784691b5","Type":"ContainerDied","Data":"4556f4e0d733068baa40bd6dad23be734274a6c5ed5e53b0c8f36cdda531a10b"} Apr 16 20:14:35.223972 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:35.223967 2572 
scope.go:117] "RemoveContainer" containerID="3e91ff99e3497a5eab7e7872f391b37992e0edb0e9324f46b762d6c709a9b111" Apr 16 20:14:35.224285 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:35.224266 2572 scope.go:117] "RemoveContainer" containerID="4556f4e0d733068baa40bd6dad23be734274a6c5ed5e53b0c8f36cdda531a10b" Apr 16 20:14:35.224501 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:14:35.224477 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-j4pl4_openshift-console-operator(80a9a559-faef-43f2-ae15-5d0a784691b5)\"" pod="openshift-console-operator/console-operator-9d4b6777b-j4pl4" podUID="80a9a559-faef-43f2-ae15-5d0a784691b5" Apr 16 20:14:35.242556 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:35.242503 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4t8vr" podStartSLOduration=21.666155395 podStartE2EDuration="23.242487126s" podCreationTimestamp="2026-04-16 20:14:12 +0000 UTC" firstStartedPulling="2026-04-16 20:14:28.491703939 +0000 UTC m=+151.332604308" lastFinishedPulling="2026-04-16 20:14:30.06803567 +0000 UTC m=+152.908936039" observedRunningTime="2026-04-16 20:14:30.243048019 +0000 UTC m=+153.083948402" watchObservedRunningTime="2026-04-16 20:14:35.242487126 +0000 UTC m=+158.083387515" Apr 16 20:14:35.772718 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:14:35.772660 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-622d4" podUID="d285ba82-dded-4707-87cb-35b755280286" Apr 16 20:14:36.228418 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:36.228371 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4pl4_80a9a559-faef-43f2-ae15-5d0a784691b5/console-operator/2.log" Apr 16 20:14:38.306896 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.306857 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-5rcs6"] Apr 16 20:14:38.309406 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.309369 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-5rcs6" Apr 16 20:14:38.320578 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:14:38.320547 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"insights-runtime-extractor-tls\" is forbidden: User \"system:node:ip-10-0-137-142.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-insights\": no relationship found between node 'ip-10-0-137-142.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" type="*v1.Secret" Apr 16 20:14:38.320686 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:14:38.320603 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"kube-rbac-proxy\" is forbidden: User \"system:node:ip-10-0-137-142.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-insights\": no relationship found between node 'ip-10-0-137-142.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" type="*v1.ConfigMap" Apr 16 20:14:38.322023 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:14:38.322001 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ip-10-0-137-142.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-insights\": no 
relationship found between node 'ip-10-0-137-142.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" type="*v1.ConfigMap" Apr 16 20:14:38.326409 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.326344 2572 status_manager.go:895] "Failed to get status for pod" podUID="c31958fa-caf8-4c77-a312-e2d0f8238e6f" pod="openshift-insights/insights-runtime-extractor-5rcs6" err="pods \"insights-runtime-extractor-5rcs6\" is forbidden: User \"system:node:ip-10-0-137-142.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-insights\": no relationship found between node 'ip-10-0-137-142.ec2.internal' and this object" Apr 16 20:14:38.326763 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:14:38.326740 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"insights-runtime-extractor-sa-dockercfg-zvwln\" is forbidden: User \"system:node:ip-10-0-137-142.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-insights\": no relationship found between node 'ip-10-0-137-142.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-zvwln\"" type="*v1.Secret" Apr 16 20:14:38.326821 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:14:38.326795 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:ip-10-0-137-142.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-insights\": no relationship found between node 'ip-10-0-137-142.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" type="*v1.ConfigMap" Apr 16 20:14:38.354617 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.354583 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-insights/insights-runtime-extractor-5rcs6"] Apr 16 20:14:38.462812 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.462771 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c31958fa-caf8-4c77-a312-e2d0f8238e6f-crio-socket\") pod \"insights-runtime-extractor-5rcs6\" (UID: \"c31958fa-caf8-4c77-a312-e2d0f8238e6f\") " pod="openshift-insights/insights-runtime-extractor-5rcs6" Apr 16 20:14:38.463006 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.462819 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c31958fa-caf8-4c77-a312-e2d0f8238e6f-data-volume\") pod \"insights-runtime-extractor-5rcs6\" (UID: \"c31958fa-caf8-4c77-a312-e2d0f8238e6f\") " pod="openshift-insights/insights-runtime-extractor-5rcs6" Apr 16 20:14:38.463006 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.462848 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c31958fa-caf8-4c77-a312-e2d0f8238e6f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5rcs6\" (UID: \"c31958fa-caf8-4c77-a312-e2d0f8238e6f\") " pod="openshift-insights/insights-runtime-extractor-5rcs6" Apr 16 20:14:38.463006 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.462889 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcf8j\" (UniqueName: \"kubernetes.io/projected/c31958fa-caf8-4c77-a312-e2d0f8238e6f-kube-api-access-kcf8j\") pod \"insights-runtime-extractor-5rcs6\" (UID: \"c31958fa-caf8-4c77-a312-e2d0f8238e6f\") " pod="openshift-insights/insights-runtime-extractor-5rcs6" Apr 16 20:14:38.463006 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.462969 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c31958fa-caf8-4c77-a312-e2d0f8238e6f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5rcs6\" (UID: \"c31958fa-caf8-4c77-a312-e2d0f8238e6f\") " pod="openshift-insights/insights-runtime-extractor-5rcs6" Apr 16 20:14:38.491486 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.491452 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7f6d57ffbc-fxbrb"] Apr 16 20:14:38.493414 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.493399 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7f6d57ffbc-fxbrb" Apr 16 20:14:38.497733 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.497715 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 20:14:38.498689 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.498667 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-kmdps\"" Apr 16 20:14:38.498796 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.498693 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 20:14:38.498796 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.498751 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 20:14:38.502751 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.502731 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 20:14:38.522067 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.522037 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-7f6d57ffbc-fxbrb"] Apr 16 20:14:38.563996 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.563919 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c31958fa-caf8-4c77-a312-e2d0f8238e6f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5rcs6\" (UID: \"c31958fa-caf8-4c77-a312-e2d0f8238e6f\") " pod="openshift-insights/insights-runtime-extractor-5rcs6" Apr 16 20:14:38.563996 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.563987 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c31958fa-caf8-4c77-a312-e2d0f8238e6f-crio-socket\") pod \"insights-runtime-extractor-5rcs6\" (UID: \"c31958fa-caf8-4c77-a312-e2d0f8238e6f\") " pod="openshift-insights/insights-runtime-extractor-5rcs6" Apr 16 20:14:38.564318 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.564012 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c31958fa-caf8-4c77-a312-e2d0f8238e6f-data-volume\") pod \"insights-runtime-extractor-5rcs6\" (UID: \"c31958fa-caf8-4c77-a312-e2d0f8238e6f\") " pod="openshift-insights/insights-runtime-extractor-5rcs6" Apr 16 20:14:38.564318 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.564029 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c31958fa-caf8-4c77-a312-e2d0f8238e6f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5rcs6\" (UID: \"c31958fa-caf8-4c77-a312-e2d0f8238e6f\") " pod="openshift-insights/insights-runtime-extractor-5rcs6" Apr 16 20:14:38.564318 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.564060 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kcf8j\" (UniqueName: 
\"kubernetes.io/projected/c31958fa-caf8-4c77-a312-e2d0f8238e6f-kube-api-access-kcf8j\") pod \"insights-runtime-extractor-5rcs6\" (UID: \"c31958fa-caf8-4c77-a312-e2d0f8238e6f\") " pod="openshift-insights/insights-runtime-extractor-5rcs6" Apr 16 20:14:38.564318 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.564089 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c31958fa-caf8-4c77-a312-e2d0f8238e6f-crio-socket\") pod \"insights-runtime-extractor-5rcs6\" (UID: \"c31958fa-caf8-4c77-a312-e2d0f8238e6f\") " pod="openshift-insights/insights-runtime-extractor-5rcs6" Apr 16 20:14:38.564522 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.564359 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c31958fa-caf8-4c77-a312-e2d0f8238e6f-data-volume\") pod \"insights-runtime-extractor-5rcs6\" (UID: \"c31958fa-caf8-4c77-a312-e2d0f8238e6f\") " pod="openshift-insights/insights-runtime-extractor-5rcs6" Apr 16 20:14:38.664762 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.664721 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/91700673-6298-4117-96a8-e8b9068e453b-image-registry-private-configuration\") pod \"image-registry-7f6d57ffbc-fxbrb\" (UID: \"91700673-6298-4117-96a8-e8b9068e453b\") " pod="openshift-image-registry/image-registry-7f6d57ffbc-fxbrb" Apr 16 20:14:38.664762 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.664764 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/91700673-6298-4117-96a8-e8b9068e453b-ca-trust-extracted\") pod \"image-registry-7f6d57ffbc-fxbrb\" (UID: \"91700673-6298-4117-96a8-e8b9068e453b\") " 
pod="openshift-image-registry/image-registry-7f6d57ffbc-fxbrb" Apr 16 20:14:38.664963 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.664785 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktt2b\" (UniqueName: \"kubernetes.io/projected/91700673-6298-4117-96a8-e8b9068e453b-kube-api-access-ktt2b\") pod \"image-registry-7f6d57ffbc-fxbrb\" (UID: \"91700673-6298-4117-96a8-e8b9068e453b\") " pod="openshift-image-registry/image-registry-7f6d57ffbc-fxbrb" Apr 16 20:14:38.664963 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.664869 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/91700673-6298-4117-96a8-e8b9068e453b-registry-tls\") pod \"image-registry-7f6d57ffbc-fxbrb\" (UID: \"91700673-6298-4117-96a8-e8b9068e453b\") " pod="openshift-image-registry/image-registry-7f6d57ffbc-fxbrb" Apr 16 20:14:38.664963 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.664917 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/91700673-6298-4117-96a8-e8b9068e453b-registry-certificates\") pod \"image-registry-7f6d57ffbc-fxbrb\" (UID: \"91700673-6298-4117-96a8-e8b9068e453b\") " pod="openshift-image-registry/image-registry-7f6d57ffbc-fxbrb" Apr 16 20:14:38.664963 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.664948 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/91700673-6298-4117-96a8-e8b9068e453b-trusted-ca\") pod \"image-registry-7f6d57ffbc-fxbrb\" (UID: \"91700673-6298-4117-96a8-e8b9068e453b\") " pod="openshift-image-registry/image-registry-7f6d57ffbc-fxbrb" Apr 16 20:14:38.665085 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.664969 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/91700673-6298-4117-96a8-e8b9068e453b-bound-sa-token\") pod \"image-registry-7f6d57ffbc-fxbrb\" (UID: \"91700673-6298-4117-96a8-e8b9068e453b\") " pod="openshift-image-registry/image-registry-7f6d57ffbc-fxbrb" Apr 16 20:14:38.665085 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.664989 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/91700673-6298-4117-96a8-e8b9068e453b-installation-pull-secrets\") pod \"image-registry-7f6d57ffbc-fxbrb\" (UID: \"91700673-6298-4117-96a8-e8b9068e453b\") " pod="openshift-image-registry/image-registry-7f6d57ffbc-fxbrb" Apr 16 20:14:38.765338 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.765298 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/91700673-6298-4117-96a8-e8b9068e453b-image-registry-private-configuration\") pod \"image-registry-7f6d57ffbc-fxbrb\" (UID: \"91700673-6298-4117-96a8-e8b9068e453b\") " pod="openshift-image-registry/image-registry-7f6d57ffbc-fxbrb" Apr 16 20:14:38.765338 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.765334 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/91700673-6298-4117-96a8-e8b9068e453b-ca-trust-extracted\") pod \"image-registry-7f6d57ffbc-fxbrb\" (UID: \"91700673-6298-4117-96a8-e8b9068e453b\") " pod="openshift-image-registry/image-registry-7f6d57ffbc-fxbrb" Apr 16 20:14:38.765599 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.765352 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ktt2b\" (UniqueName: \"kubernetes.io/projected/91700673-6298-4117-96a8-e8b9068e453b-kube-api-access-ktt2b\") 
pod \"image-registry-7f6d57ffbc-fxbrb\" (UID: \"91700673-6298-4117-96a8-e8b9068e453b\") " pod="openshift-image-registry/image-registry-7f6d57ffbc-fxbrb" Apr 16 20:14:38.765599 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.765406 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/91700673-6298-4117-96a8-e8b9068e453b-registry-tls\") pod \"image-registry-7f6d57ffbc-fxbrb\" (UID: \"91700673-6298-4117-96a8-e8b9068e453b\") " pod="openshift-image-registry/image-registry-7f6d57ffbc-fxbrb" Apr 16 20:14:38.765599 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.765430 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/91700673-6298-4117-96a8-e8b9068e453b-registry-certificates\") pod \"image-registry-7f6d57ffbc-fxbrb\" (UID: \"91700673-6298-4117-96a8-e8b9068e453b\") " pod="openshift-image-registry/image-registry-7f6d57ffbc-fxbrb" Apr 16 20:14:38.765599 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.765448 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/91700673-6298-4117-96a8-e8b9068e453b-trusted-ca\") pod \"image-registry-7f6d57ffbc-fxbrb\" (UID: \"91700673-6298-4117-96a8-e8b9068e453b\") " pod="openshift-image-registry/image-registry-7f6d57ffbc-fxbrb" Apr 16 20:14:38.765599 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.765466 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/91700673-6298-4117-96a8-e8b9068e453b-bound-sa-token\") pod \"image-registry-7f6d57ffbc-fxbrb\" (UID: \"91700673-6298-4117-96a8-e8b9068e453b\") " pod="openshift-image-registry/image-registry-7f6d57ffbc-fxbrb" Apr 16 20:14:38.765599 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.765485 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/91700673-6298-4117-96a8-e8b9068e453b-installation-pull-secrets\") pod \"image-registry-7f6d57ffbc-fxbrb\" (UID: \"91700673-6298-4117-96a8-e8b9068e453b\") " pod="openshift-image-registry/image-registry-7f6d57ffbc-fxbrb" Apr 16 20:14:38.765934 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.765798 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/91700673-6298-4117-96a8-e8b9068e453b-ca-trust-extracted\") pod \"image-registry-7f6d57ffbc-fxbrb\" (UID: \"91700673-6298-4117-96a8-e8b9068e453b\") " pod="openshift-image-registry/image-registry-7f6d57ffbc-fxbrb" Apr 16 20:14:38.766489 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.766463 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/91700673-6298-4117-96a8-e8b9068e453b-registry-certificates\") pod \"image-registry-7f6d57ffbc-fxbrb\" (UID: \"91700673-6298-4117-96a8-e8b9068e453b\") " pod="openshift-image-registry/image-registry-7f6d57ffbc-fxbrb" Apr 16 20:14:38.766596 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.766578 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/91700673-6298-4117-96a8-e8b9068e453b-trusted-ca\") pod \"image-registry-7f6d57ffbc-fxbrb\" (UID: \"91700673-6298-4117-96a8-e8b9068e453b\") " pod="openshift-image-registry/image-registry-7f6d57ffbc-fxbrb" Apr 16 20:14:38.767918 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.767888 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/91700673-6298-4117-96a8-e8b9068e453b-registry-tls\") pod \"image-registry-7f6d57ffbc-fxbrb\" (UID: \"91700673-6298-4117-96a8-e8b9068e453b\") " 
pod="openshift-image-registry/image-registry-7f6d57ffbc-fxbrb"
Apr 16 20:14:38.768023 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.767999 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/91700673-6298-4117-96a8-e8b9068e453b-installation-pull-secrets\") pod \"image-registry-7f6d57ffbc-fxbrb\" (UID: \"91700673-6298-4117-96a8-e8b9068e453b\") " pod="openshift-image-registry/image-registry-7f6d57ffbc-fxbrb"
Apr 16 20:14:38.768063 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.768042 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/91700673-6298-4117-96a8-e8b9068e453b-image-registry-private-configuration\") pod \"image-registry-7f6d57ffbc-fxbrb\" (UID: \"91700673-6298-4117-96a8-e8b9068e453b\") " pod="openshift-image-registry/image-registry-7f6d57ffbc-fxbrb"
Apr 16 20:14:38.781010 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.780985 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/91700673-6298-4117-96a8-e8b9068e453b-bound-sa-token\") pod \"image-registry-7f6d57ffbc-fxbrb\" (UID: \"91700673-6298-4117-96a8-e8b9068e453b\") " pod="openshift-image-registry/image-registry-7f6d57ffbc-fxbrb"
Apr 16 20:14:38.781236 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.781218 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktt2b\" (UniqueName: \"kubernetes.io/projected/91700673-6298-4117-96a8-e8b9068e453b-kube-api-access-ktt2b\") pod \"image-registry-7f6d57ffbc-fxbrb\" (UID: \"91700673-6298-4117-96a8-e8b9068e453b\") " pod="openshift-image-registry/image-registry-7f6d57ffbc-fxbrb"
Apr 16 20:14:38.802747 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.802714 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7f6d57ffbc-fxbrb"
Apr 16 20:14:38.941701 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.941666 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7f6d57ffbc-fxbrb"]
Apr 16 20:14:38.944995 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:14:38.944962 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91700673_6298_4117_96a8_e8b9068e453b.slice/crio-b117be44fc86a2b127e72d47cdf8952fd439b22b0a7e112bbef6172f3797086a WatchSource:0}: Error finding container b117be44fc86a2b127e72d47cdf8952fd439b22b0a7e112bbef6172f3797086a: Status 404 returned error can't find the container with id b117be44fc86a2b127e72d47cdf8952fd439b22b0a7e112bbef6172f3797086a
Apr 16 20:14:38.967440 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.967414 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/91eb91d1-3690-4158-98a2-3eecf9955cda-metrics-tls\") pod \"dns-default-7f5fs\" (UID: \"91eb91d1-3690-4158-98a2-3eecf9955cda\") " pod="openshift-dns/dns-default-7f5fs"
Apr 16 20:14:38.967528 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.967455 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efcf8c22-a25f-4709-a840-c85cec57a1b9-cert\") pod \"ingress-canary-2pcp4\" (UID: \"efcf8c22-a25f-4709-a840-c85cec57a1b9\") " pod="openshift-ingress-canary/ingress-canary-2pcp4"
Apr 16 20:14:38.969687 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.969660 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/91eb91d1-3690-4158-98a2-3eecf9955cda-metrics-tls\") pod \"dns-default-7f5fs\" (UID: \"91eb91d1-3690-4158-98a2-3eecf9955cda\") " pod="openshift-dns/dns-default-7f5fs"
Apr 16 20:14:38.969770 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:38.969756 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efcf8c22-a25f-4709-a840-c85cec57a1b9-cert\") pod \"ingress-canary-2pcp4\" (UID: \"efcf8c22-a25f-4709-a840-c85cec57a1b9\") " pod="openshift-ingress-canary/ingress-canary-2pcp4"
Apr 16 20:14:39.023258 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:39.023231 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-jqqmg\""
Apr 16 20:14:39.023465 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:39.023453 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dstbq\""
Apr 16 20:14:39.031213 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:39.031190 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7f5fs"
Apr 16 20:14:39.031213 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:39.031209 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2pcp4"
Apr 16 20:14:39.136615 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:39.136586 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-zvwln\""
Apr 16 20:14:39.159972 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:39.159940 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7f5fs"]
Apr 16 20:14:39.162806 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:14:39.162780 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91eb91d1_3690_4158_98a2_3eecf9955cda.slice/crio-a22d6205ae595ad68ff635037adfc5ab32eaba8acff6126d96622c2081eba329 WatchSource:0}: Error finding container a22d6205ae595ad68ff635037adfc5ab32eaba8acff6126d96622c2081eba329: Status 404 returned error can't find the container with id a22d6205ae595ad68ff635037adfc5ab32eaba8acff6126d96622c2081eba329
Apr 16 20:14:39.171342 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:39.171317 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2pcp4"]
Apr 16 20:14:39.175887 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:14:39.175852 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefcf8c22_a25f_4709_a840_c85cec57a1b9.slice/crio-d063a9c3c3f9d736866033a5ebb8ab2eae5f584b5990c8093093ed3dda750d73 WatchSource:0}: Error finding container d063a9c3c3f9d736866033a5ebb8ab2eae5f584b5990c8093093ed3dda750d73: Status 404 returned error can't find the container with id d063a9c3c3f9d736866033a5ebb8ab2eae5f584b5990c8093093ed3dda750d73
Apr 16 20:14:39.236941 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:39.236903 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2pcp4" event={"ID":"efcf8c22-a25f-4709-a840-c85cec57a1b9","Type":"ContainerStarted","Data":"d063a9c3c3f9d736866033a5ebb8ab2eae5f584b5990c8093093ed3dda750d73"}
Apr 16 20:14:39.238125 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:39.238096 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7f6d57ffbc-fxbrb" event={"ID":"91700673-6298-4117-96a8-e8b9068e453b","Type":"ContainerStarted","Data":"5b637140c4e52931b61384194b8ad86db30fdd1c781b5dc995de8bbd75377fc2"}
Apr 16 20:14:39.238258 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:39.238130 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7f6d57ffbc-fxbrb" event={"ID":"91700673-6298-4117-96a8-e8b9068e453b","Type":"ContainerStarted","Data":"b117be44fc86a2b127e72d47cdf8952fd439b22b0a7e112bbef6172f3797086a"}
Apr 16 20:14:39.238258 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:39.238181 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7f6d57ffbc-fxbrb"
Apr 16 20:14:39.239183 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:39.239161 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7f5fs" event={"ID":"91eb91d1-3690-4158-98a2-3eecf9955cda","Type":"ContainerStarted","Data":"a22d6205ae595ad68ff635037adfc5ab32eaba8acff6126d96622c2081eba329"}
Apr 16 20:14:39.259599 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:39.259537 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7f6d57ffbc-fxbrb" podStartSLOduration=1.259518254 podStartE2EDuration="1.259518254s" podCreationTimestamp="2026-04-16 20:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:14:39.258889621 +0000 UTC m=+162.099790010" watchObservedRunningTime="2026-04-16 20:14:39.259518254 +0000 UTC m=+162.100418644"
Apr 16 20:14:39.330774 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:39.330747 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 20:14:39.336774 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:39.336752 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c31958fa-caf8-4c77-a312-e2d0f8238e6f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5rcs6\" (UID: \"c31958fa-caf8-4c77-a312-e2d0f8238e6f\") " pod="openshift-insights/insights-runtime-extractor-5rcs6"
Apr 16 20:14:39.564803 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:14:39.564767 2572 configmap.go:193] Couldn't get configMap openshift-insights/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition
Apr 16 20:14:39.565000 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:14:39.564878 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c31958fa-caf8-4c77-a312-e2d0f8238e6f-kube-rbac-proxy-cm podName:c31958fa-caf8-4c77-a312-e2d0f8238e6f nodeName:}" failed. No retries permitted until 2026-04-16 20:14:40.064853727 +0000 UTC m=+162.905754115 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-rbac-proxy-cm" (UniqueName: "kubernetes.io/configmap/c31958fa-caf8-4c77-a312-e2d0f8238e6f-kube-rbac-proxy-cm") pod "insights-runtime-extractor-5rcs6" (UID: "c31958fa-caf8-4c77-a312-e2d0f8238e6f") : failed to sync configmap cache: timed out waiting for the condition
Apr 16 20:14:39.571662 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:14:39.571637 2572 projected.go:289] Couldn't get configMap openshift-insights/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Apr 16 20:14:39.632179 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:39.632145 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 20:14:39.632406 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:14:39.632389 2572 projected.go:194] Error preparing data for projected volume kube-api-access-kcf8j for pod openshift-insights/insights-runtime-extractor-5rcs6: failed to sync configmap cache: timed out waiting for the condition
Apr 16 20:14:39.632492 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:14:39.632480 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c31958fa-caf8-4c77-a312-e2d0f8238e6f-kube-api-access-kcf8j podName:c31958fa-caf8-4c77-a312-e2d0f8238e6f nodeName:}" failed. No retries permitted until 2026-04-16 20:14:40.13245691 +0000 UTC m=+162.973357299 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-kcf8j" (UniqueName: "kubernetes.io/projected/c31958fa-caf8-4c77-a312-e2d0f8238e6f-kube-api-access-kcf8j") pod "insights-runtime-extractor-5rcs6" (UID: "c31958fa-caf8-4c77-a312-e2d0f8238e6f") : failed to sync configmap cache: timed out waiting for the condition
Apr 16 20:14:39.769738 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:39.769710 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 20:14:39.918922 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:39.918846 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 20:14:40.076494 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:40.076461 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c31958fa-caf8-4c77-a312-e2d0f8238e6f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5rcs6\" (UID: \"c31958fa-caf8-4c77-a312-e2d0f8238e6f\") " pod="openshift-insights/insights-runtime-extractor-5rcs6"
Apr 16 20:14:40.077081 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:40.077057 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c31958fa-caf8-4c77-a312-e2d0f8238e6f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5rcs6\" (UID: \"c31958fa-caf8-4c77-a312-e2d0f8238e6f\") " pod="openshift-insights/insights-runtime-extractor-5rcs6"
Apr 16 20:14:40.177194 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:40.177109 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kcf8j\" (UniqueName: \"kubernetes.io/projected/c31958fa-caf8-4c77-a312-e2d0f8238e6f-kube-api-access-kcf8j\") pod \"insights-runtime-extractor-5rcs6\" (UID: \"c31958fa-caf8-4c77-a312-e2d0f8238e6f\") " pod="openshift-insights/insights-runtime-extractor-5rcs6"
Apr 16 20:14:40.180018 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:40.179967 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcf8j\" (UniqueName: \"kubernetes.io/projected/c31958fa-caf8-4c77-a312-e2d0f8238e6f-kube-api-access-kcf8j\") pod \"insights-runtime-extractor-5rcs6\" (UID: \"c31958fa-caf8-4c77-a312-e2d0f8238e6f\") " pod="openshift-insights/insights-runtime-extractor-5rcs6"
Apr 16 20:14:40.418300 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:40.418257 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-5rcs6"
Apr 16 20:14:41.102429 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:41.102404 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-5rcs6"]
Apr 16 20:14:41.105823 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:14:41.105794 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc31958fa_caf8_4c77_a312_e2d0f8238e6f.slice/crio-2ab61e8bb1c9b0e372277e51c1a5f1e06f796beaa99e35e4ae4fe5b1d83c5f2a WatchSource:0}: Error finding container 2ab61e8bb1c9b0e372277e51c1a5f1e06f796beaa99e35e4ae4fe5b1d83c5f2a: Status 404 returned error can't find the container with id 2ab61e8bb1c9b0e372277e51c1a5f1e06f796beaa99e35e4ae4fe5b1d83c5f2a
Apr 16 20:14:41.247603 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:41.247524 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7f5fs" event={"ID":"91eb91d1-3690-4158-98a2-3eecf9955cda","Type":"ContainerStarted","Data":"b920778cab0e0f9b88023df7ff5efae3e7bf2a3bf1699e0b5f2d086913ae2801"}
Apr 16 20:14:41.247603 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:41.247568 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7f5fs" event={"ID":"91eb91d1-3690-4158-98a2-3eecf9955cda","Type":"ContainerStarted","Data":"1dfbfc4318f96935bc7152ce90543977acec8801f8f371efb66366dd20ec9606"}
Apr 16 20:14:41.247852 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:41.247681 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-7f5fs"
Apr 16 20:14:41.249249 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:41.249217 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2pcp4" event={"ID":"efcf8c22-a25f-4709-a840-c85cec57a1b9","Type":"ContainerStarted","Data":"7d660989a509cf3063dd0aa50142ddf81b9f618e782f80870b731a215df7521e"}
Apr 16 20:14:41.250451 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:41.250429 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5rcs6" event={"ID":"c31958fa-caf8-4c77-a312-e2d0f8238e6f","Type":"ContainerStarted","Data":"129e4d38d7f571c669a36ee542a7fc2e5d9dbb5b3d96aef64f9a9ce04260f2a6"}
Apr 16 20:14:41.250451 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:41.250454 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5rcs6" event={"ID":"c31958fa-caf8-4c77-a312-e2d0f8238e6f","Type":"ContainerStarted","Data":"2ab61e8bb1c9b0e372277e51c1a5f1e06f796beaa99e35e4ae4fe5b1d83c5f2a"}
Apr 16 20:14:41.267788 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:41.267744 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-7f5fs" podStartSLOduration=129.467651148 podStartE2EDuration="2m11.267732514s" podCreationTimestamp="2026-04-16 20:12:30 +0000 UTC" firstStartedPulling="2026-04-16 20:14:39.16448563 +0000 UTC m=+162.005386001" lastFinishedPulling="2026-04-16 20:14:40.964566998 +0000 UTC m=+163.805467367" observedRunningTime="2026-04-16 20:14:41.26673604 +0000 UTC m=+164.107636468" watchObservedRunningTime="2026-04-16 20:14:41.267732514 +0000 UTC m=+164.108632904"
Apr 16 20:14:41.286006 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:41.285957 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-2pcp4" podStartSLOduration=129.494279942 podStartE2EDuration="2m11.285946262s" podCreationTimestamp="2026-04-16 20:12:30 +0000 UTC" firstStartedPulling="2026-04-16 20:14:39.177625919 +0000 UTC m=+162.018526287" lastFinishedPulling="2026-04-16 20:14:40.969292044 +0000 UTC m=+163.810192607" observedRunningTime="2026-04-16 20:14:41.2857357 +0000 UTC m=+164.126636105" watchObservedRunningTime="2026-04-16 20:14:41.285946262 +0000 UTC m=+164.126846651"
Apr 16 20:14:42.256036 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:42.255998 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5rcs6" event={"ID":"c31958fa-caf8-4c77-a312-e2d0f8238e6f","Type":"ContainerStarted","Data":"63324045da91c2e611a336279a9f69a047d10a2085415538e74ff5650358c772"}
Apr 16 20:14:42.741436 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:42.741405 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-j4pl4"
Apr 16 20:14:42.741625 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:42.741514 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-j4pl4"
Apr 16 20:14:42.741847 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:42.741830 2572 scope.go:117] "RemoveContainer" containerID="4556f4e0d733068baa40bd6dad23be734274a6c5ed5e53b0c8f36cdda531a10b"
Apr 16 20:14:42.742097 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:14:42.742068 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-j4pl4_openshift-console-operator(80a9a559-faef-43f2-ae15-5d0a784691b5)\"" pod="openshift-console-operator/console-operator-9d4b6777b-j4pl4" podUID="80a9a559-faef-43f2-ae15-5d0a784691b5"
Apr 16 20:14:43.259961 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:43.259929 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5rcs6" event={"ID":"c31958fa-caf8-4c77-a312-e2d0f8238e6f","Type":"ContainerStarted","Data":"b4bd99c9e39bb67223484f3f59dba9fc7fadaa35ee57b0d3181245b54076808e"}
Apr 16 20:14:43.260433 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:43.260148 2572 scope.go:117] "RemoveContainer" containerID="4556f4e0d733068baa40bd6dad23be734274a6c5ed5e53b0c8f36cdda531a10b"
Apr 16 20:14:43.260433 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:14:43.260320 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-j4pl4_openshift-console-operator(80a9a559-faef-43f2-ae15-5d0a784691b5)\"" pod="openshift-console-operator/console-operator-9d4b6777b-j4pl4" podUID="80a9a559-faef-43f2-ae15-5d0a784691b5"
Apr 16 20:14:43.282935 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:43.282890 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-5rcs6" podStartSLOduration=3.355579555 podStartE2EDuration="5.282876749s" podCreationTimestamp="2026-04-16 20:14:38 +0000 UTC" firstStartedPulling="2026-04-16 20:14:41.180864437 +0000 UTC m=+164.021764811" lastFinishedPulling="2026-04-16 20:14:43.108161633 +0000 UTC m=+165.949062005" observedRunningTime="2026-04-16 20:14:43.282532322 +0000 UTC m=+166.123432709" watchObservedRunningTime="2026-04-16 20:14:43.282876749 +0000 UTC m=+166.123777160"
Apr 16 20:14:50.758797 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:50.758701 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-622d4"
Apr 16 20:14:51.258835 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:51.258806 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-7f5fs"
Apr 16 20:14:55.177976 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:55.177940 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-6l62g"]
Apr 16 20:14:55.181129 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:55.181105 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-6l62g"
Apr 16 20:14:55.184023 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:55.183999 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 20:14:55.184023 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:55.184000 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-q4rxk\""
Apr 16 20:14:55.184212 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:55.184106 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 20:14:55.184271 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:55.184231 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 20:14:55.186679 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:55.186646 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d756c161-ee06-43f3-8ef8-ba201a79c470-root\") pod \"node-exporter-6l62g\" (UID: \"d756c161-ee06-43f3-8ef8-ba201a79c470\") " pod="openshift-monitoring/node-exporter-6l62g"
Apr 16 20:14:55.186770 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:55.186690 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d756c161-ee06-43f3-8ef8-ba201a79c470-node-exporter-tls\") pod \"node-exporter-6l62g\" (UID: \"d756c161-ee06-43f3-8ef8-ba201a79c470\") " pod="openshift-monitoring/node-exporter-6l62g"
Apr 16 20:14:55.186770 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:55.186720 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d756c161-ee06-43f3-8ef8-ba201a79c470-node-exporter-accelerators-collector-config\") pod \"node-exporter-6l62g\" (UID: \"d756c161-ee06-43f3-8ef8-ba201a79c470\") " pod="openshift-monitoring/node-exporter-6l62g"
Apr 16 20:14:55.186881 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:55.186792 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d756c161-ee06-43f3-8ef8-ba201a79c470-node-exporter-textfile\") pod \"node-exporter-6l62g\" (UID: \"d756c161-ee06-43f3-8ef8-ba201a79c470\") " pod="openshift-monitoring/node-exporter-6l62g"
Apr 16 20:14:55.186881 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:55.186845 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2zbj\" (UniqueName: \"kubernetes.io/projected/d756c161-ee06-43f3-8ef8-ba201a79c470-kube-api-access-t2zbj\") pod \"node-exporter-6l62g\" (UID: \"d756c161-ee06-43f3-8ef8-ba201a79c470\") " pod="openshift-monitoring/node-exporter-6l62g"
Apr 16 20:14:55.186881 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:55.186871 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d756c161-ee06-43f3-8ef8-ba201a79c470-node-exporter-wtmp\") pod \"node-exporter-6l62g\" (UID: \"d756c161-ee06-43f3-8ef8-ba201a79c470\") " pod="openshift-monitoring/node-exporter-6l62g"
Apr 16 20:14:55.187009 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:55.186900 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d756c161-ee06-43f3-8ef8-ba201a79c470-metrics-client-ca\") pod \"node-exporter-6l62g\" (UID: \"d756c161-ee06-43f3-8ef8-ba201a79c470\") " pod="openshift-monitoring/node-exporter-6l62g"
Apr 16 20:14:55.187009 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:55.186934 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d756c161-ee06-43f3-8ef8-ba201a79c470-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6l62g\" (UID: \"d756c161-ee06-43f3-8ef8-ba201a79c470\") " pod="openshift-monitoring/node-exporter-6l62g"
Apr 16 20:14:55.187009 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:55.186990 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d756c161-ee06-43f3-8ef8-ba201a79c470-sys\") pod \"node-exporter-6l62g\" (UID: \"d756c161-ee06-43f3-8ef8-ba201a79c470\") " pod="openshift-monitoring/node-exporter-6l62g"
Apr 16 20:14:55.188034 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:55.187989 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 20:14:55.190091 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:55.190076 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 20:14:55.194270 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:55.194256 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 20:14:55.287221 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:55.287187 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d756c161-ee06-43f3-8ef8-ba201a79c470-node-exporter-wtmp\") pod \"node-exporter-6l62g\" (UID: \"d756c161-ee06-43f3-8ef8-ba201a79c470\") " pod="openshift-monitoring/node-exporter-6l62g"
Apr 16 20:14:55.287221 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:55.287220 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d756c161-ee06-43f3-8ef8-ba201a79c470-metrics-client-ca\") pod \"node-exporter-6l62g\" (UID: \"d756c161-ee06-43f3-8ef8-ba201a79c470\") " pod="openshift-monitoring/node-exporter-6l62g"
Apr 16 20:14:55.287493 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:55.287238 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d756c161-ee06-43f3-8ef8-ba201a79c470-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6l62g\" (UID: \"d756c161-ee06-43f3-8ef8-ba201a79c470\") " pod="openshift-monitoring/node-exporter-6l62g"
Apr 16 20:14:55.287493 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:55.287278 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d756c161-ee06-43f3-8ef8-ba201a79c470-sys\") pod \"node-exporter-6l62g\" (UID: \"d756c161-ee06-43f3-8ef8-ba201a79c470\") " pod="openshift-monitoring/node-exporter-6l62g"
Apr 16 20:14:55.287493 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:55.287315 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d756c161-ee06-43f3-8ef8-ba201a79c470-root\") pod \"node-exporter-6l62g\" (UID: \"d756c161-ee06-43f3-8ef8-ba201a79c470\") " pod="openshift-monitoring/node-exporter-6l62g"
Apr 16 20:14:55.287493 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:55.287335 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d756c161-ee06-43f3-8ef8-ba201a79c470-node-exporter-tls\") pod \"node-exporter-6l62g\" (UID: \"d756c161-ee06-43f3-8ef8-ba201a79c470\") " pod="openshift-monitoring/node-exporter-6l62g"
Apr 16 20:14:55.287493 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:55.287361 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d756c161-ee06-43f3-8ef8-ba201a79c470-node-exporter-accelerators-collector-config\") pod \"node-exporter-6l62g\" (UID: \"d756c161-ee06-43f3-8ef8-ba201a79c470\") " pod="openshift-monitoring/node-exporter-6l62g"
Apr 16 20:14:55.287493 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:55.287421 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d756c161-ee06-43f3-8ef8-ba201a79c470-node-exporter-wtmp\") pod \"node-exporter-6l62g\" (UID: \"d756c161-ee06-43f3-8ef8-ba201a79c470\") " pod="openshift-monitoring/node-exporter-6l62g"
Apr 16 20:14:55.287493 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:55.287454 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d756c161-ee06-43f3-8ef8-ba201a79c470-sys\") pod \"node-exporter-6l62g\" (UID: \"d756c161-ee06-43f3-8ef8-ba201a79c470\") " pod="openshift-monitoring/node-exporter-6l62g"
Apr 16 20:14:55.287493 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:55.287485 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d756c161-ee06-43f3-8ef8-ba201a79c470-root\") pod \"node-exporter-6l62g\" (UID: \"d756c161-ee06-43f3-8ef8-ba201a79c470\") " pod="openshift-monitoring/node-exporter-6l62g"
Apr 16 20:14:55.287887 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:14:55.287507 2572 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 16 20:14:55.287887 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:55.287521 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d756c161-ee06-43f3-8ef8-ba201a79c470-node-exporter-textfile\") pod \"node-exporter-6l62g\" (UID: \"d756c161-ee06-43f3-8ef8-ba201a79c470\") " pod="openshift-monitoring/node-exporter-6l62g"
Apr 16 20:14:55.287887 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:14:55.287573 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d756c161-ee06-43f3-8ef8-ba201a79c470-node-exporter-tls podName:d756c161-ee06-43f3-8ef8-ba201a79c470 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:55.787549669 +0000 UTC m=+178.628450037 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/d756c161-ee06-43f3-8ef8-ba201a79c470-node-exporter-tls") pod "node-exporter-6l62g" (UID: "d756c161-ee06-43f3-8ef8-ba201a79c470") : secret "node-exporter-tls" not found
Apr 16 20:14:55.287887 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:55.287608 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t2zbj\" (UniqueName: \"kubernetes.io/projected/d756c161-ee06-43f3-8ef8-ba201a79c470-kube-api-access-t2zbj\") pod \"node-exporter-6l62g\" (UID: \"d756c161-ee06-43f3-8ef8-ba201a79c470\") " pod="openshift-monitoring/node-exporter-6l62g"
Apr 16 20:14:55.287887 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:55.287817 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d756c161-ee06-43f3-8ef8-ba201a79c470-node-exporter-textfile\") pod \"node-exporter-6l62g\" (UID: \"d756c161-ee06-43f3-8ef8-ba201a79c470\") " pod="openshift-monitoring/node-exporter-6l62g"
Apr 16 20:14:55.288069 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:55.287973 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d756c161-ee06-43f3-8ef8-ba201a79c470-metrics-client-ca\") pod \"node-exporter-6l62g\" (UID: \"d756c161-ee06-43f3-8ef8-ba201a79c470\") " pod="openshift-monitoring/node-exporter-6l62g"
Apr 16 20:14:55.288493 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:55.288468 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d756c161-ee06-43f3-8ef8-ba201a79c470-node-exporter-accelerators-collector-config\") pod \"node-exporter-6l62g\" (UID: \"d756c161-ee06-43f3-8ef8-ba201a79c470\") " pod="openshift-monitoring/node-exporter-6l62g"
Apr 16 20:14:55.290643 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:55.290615 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d756c161-ee06-43f3-8ef8-ba201a79c470-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6l62g\" (UID: \"d756c161-ee06-43f3-8ef8-ba201a79c470\") " pod="openshift-monitoring/node-exporter-6l62g"
Apr 16 20:14:55.308841 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:55.308805 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2zbj\" (UniqueName: \"kubernetes.io/projected/d756c161-ee06-43f3-8ef8-ba201a79c470-kube-api-access-t2zbj\") pod \"node-exporter-6l62g\" (UID: \"d756c161-ee06-43f3-8ef8-ba201a79c470\") " pod="openshift-monitoring/node-exporter-6l62g"
Apr 16 20:14:55.791363 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:55.791320 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d756c161-ee06-43f3-8ef8-ba201a79c470-node-exporter-tls\") pod \"node-exporter-6l62g\" (UID: \"d756c161-ee06-43f3-8ef8-ba201a79c470\") " pod="openshift-monitoring/node-exporter-6l62g"
Apr 16 20:14:55.793891 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:55.793852 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d756c161-ee06-43f3-8ef8-ba201a79c470-node-exporter-tls\") pod \"node-exporter-6l62g\" (UID: \"d756c161-ee06-43f3-8ef8-ba201a79c470\") " pod="openshift-monitoring/node-exporter-6l62g"
Apr 16 20:14:56.090541 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.090449 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-6l62g"
Apr 16 20:14:56.099281 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:14:56.099250 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd756c161_ee06_43f3_8ef8_ba201a79c470.slice/crio-bc7b41b884979e2d77714ef4a1405004c1c507c365c1085025f9b0926233eb80 WatchSource:0}: Error finding container bc7b41b884979e2d77714ef4a1405004c1c507c365c1085025f9b0926233eb80: Status 404 returned error can't find the container with id bc7b41b884979e2d77714ef4a1405004c1c507c365c1085025f9b0926233eb80
Apr 16 20:14:56.171666 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.171634 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 20:14:56.176174 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.176158 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:14:56.178670 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.178645 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 16 20:14:56.179030 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.178647 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 16 20:14:56.179030 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.178649 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 16 20:14:56.179030 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.178655 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 16 20:14:56.179030 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.178682 2572 reflector.go:430]
"Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-6d8cl\"" Apr 16 20:14:56.179030 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.178888 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 20:14:56.179030 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.178936 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 20:14:56.179030 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.178943 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 20:14:56.179030 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.178692 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 20:14:56.179030 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.178701 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 20:14:56.187451 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.187431 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 20:14:56.195658 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.195635 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/be3664f6-5510-4aef-8526-652175a879a9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:56.195745 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.195673 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/be3664f6-5510-4aef-8526-652175a879a9-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:56.195745 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.195724 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/be3664f6-5510-4aef-8526-652175a879a9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:56.195814 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.195791 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/be3664f6-5510-4aef-8526-652175a879a9-web-config\") pod \"alertmanager-main-0\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:56.195814 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.195808 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be3664f6-5510-4aef-8526-652175a879a9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:56.195878 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.195825 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2nbv\" (UniqueName: \"kubernetes.io/projected/be3664f6-5510-4aef-8526-652175a879a9-kube-api-access-d2nbv\") pod \"alertmanager-main-0\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:56.195878 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.195842 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/be3664f6-5510-4aef-8526-652175a879a9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:56.195878 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.195859 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/be3664f6-5510-4aef-8526-652175a879a9-config-out\") pod \"alertmanager-main-0\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:56.195878 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.195875 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/be3664f6-5510-4aef-8526-652175a879a9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:56.196007 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.195889 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/be3664f6-5510-4aef-8526-652175a879a9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:56.196007 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.195959 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/be3664f6-5510-4aef-8526-652175a879a9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:56.196007 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.195986 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/be3664f6-5510-4aef-8526-652175a879a9-config-volume\") pod \"alertmanager-main-0\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:56.196170 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.196010 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/be3664f6-5510-4aef-8526-652175a879a9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:56.294596 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.294555 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6l62g" event={"ID":"d756c161-ee06-43f3-8ef8-ba201a79c470","Type":"ContainerStarted","Data":"bc7b41b884979e2d77714ef4a1405004c1c507c365c1085025f9b0926233eb80"} Apr 16 20:14:56.296983 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.296958 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/be3664f6-5510-4aef-8526-652175a879a9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:56.297044 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.297002 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/be3664f6-5510-4aef-8526-652175a879a9-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:56.297044 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.297035 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/be3664f6-5510-4aef-8526-652175a879a9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:56.297146 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.297099 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/be3664f6-5510-4aef-8526-652175a879a9-web-config\") pod \"alertmanager-main-0\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:56.297146 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.297124 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be3664f6-5510-4aef-8526-652175a879a9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:56.297238 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.297173 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d2nbv\" (UniqueName: \"kubernetes.io/projected/be3664f6-5510-4aef-8526-652175a879a9-kube-api-access-d2nbv\") pod \"alertmanager-main-0\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:56.297238 ip-10-0-137-142 
kubenswrapper[2572]: I0416 20:14:56.297203 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/be3664f6-5510-4aef-8526-652175a879a9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:56.297330 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.297237 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/be3664f6-5510-4aef-8526-652175a879a9-config-out\") pod \"alertmanager-main-0\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:56.297330 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.297266 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/be3664f6-5510-4aef-8526-652175a879a9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:56.297330 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.297291 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/be3664f6-5510-4aef-8526-652175a879a9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:56.297495 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.297335 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/be3664f6-5510-4aef-8526-652175a879a9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:56.297495 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.297394 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/be3664f6-5510-4aef-8526-652175a879a9-config-volume\") pod \"alertmanager-main-0\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:56.297495 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.297424 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/be3664f6-5510-4aef-8526-652175a879a9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:56.297645 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.297574 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/be3664f6-5510-4aef-8526-652175a879a9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:56.298115 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.298090 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be3664f6-5510-4aef-8526-652175a879a9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:56.298744 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.298717 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/be3664f6-5510-4aef-8526-652175a879a9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:56.300785 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.300754 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/be3664f6-5510-4aef-8526-652175a879a9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:56.301044 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.300951 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/be3664f6-5510-4aef-8526-652175a879a9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:56.301314 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.301273 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/be3664f6-5510-4aef-8526-652175a879a9-config-out\") pod \"alertmanager-main-0\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:56.301460 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.301440 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/be3664f6-5510-4aef-8526-652175a879a9-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:56.301535 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.301469 2572 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/be3664f6-5510-4aef-8526-652175a879a9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:56.302086 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.302063 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/be3664f6-5510-4aef-8526-652175a879a9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:56.302086 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.302083 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/be3664f6-5510-4aef-8526-652175a879a9-config-volume\") pod \"alertmanager-main-0\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:56.302249 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.302101 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/be3664f6-5510-4aef-8526-652175a879a9-web-config\") pod \"alertmanager-main-0\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:56.302596 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.302577 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/be3664f6-5510-4aef-8526-652175a879a9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:56.305466 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.305445 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-d2nbv\" (UniqueName: \"kubernetes.io/projected/be3664f6-5510-4aef-8526-652175a879a9-kube-api-access-d2nbv\") pod \"alertmanager-main-0\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:56.485876 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.485843 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:56.620330 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:56.620295 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 20:14:56.626982 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:14:56.626944 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe3664f6_5510_4aef_8526_652175a879a9.slice/crio-6efd6787ffeab02e1c39520ff3d7b8636fc3e53d307adae0f92c204e2b451913 WatchSource:0}: Error finding container 6efd6787ffeab02e1c39520ff3d7b8636fc3e53d307adae0f92c204e2b451913: Status 404 returned error can't find the container with id 6efd6787ffeab02e1c39520ff3d7b8636fc3e53d307adae0f92c204e2b451913 Apr 16 20:14:57.299620 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:57.299578 2572 generic.go:358] "Generic (PLEG): container finished" podID="d756c161-ee06-43f3-8ef8-ba201a79c470" containerID="27bc38c9c1a98ee3b654c5e941c1398ecbba29d6dc1b5cc4034048a64922cb43" exitCode=0 Apr 16 20:14:57.300083 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:57.299624 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6l62g" event={"ID":"d756c161-ee06-43f3-8ef8-ba201a79c470","Type":"ContainerDied","Data":"27bc38c9c1a98ee3b654c5e941c1398ecbba29d6dc1b5cc4034048a64922cb43"} Apr 16 20:14:57.301288 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:57.301262 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"be3664f6-5510-4aef-8526-652175a879a9","Type":"ContainerStarted","Data":"6efd6787ffeab02e1c39520ff3d7b8636fc3e53d307adae0f92c204e2b451913"} Apr 16 20:14:58.306026 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:58.305990 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6l62g" event={"ID":"d756c161-ee06-43f3-8ef8-ba201a79c470","Type":"ContainerStarted","Data":"b29f0d0a1fd8607894bb64ad879140ec712f51289dd46bb6c3fc27256d1d391a"} Apr 16 20:14:58.306549 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:58.306127 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6l62g" event={"ID":"d756c161-ee06-43f3-8ef8-ba201a79c470","Type":"ContainerStarted","Data":"0bd24dd77d5859dee660b98641ea4ca176cc806dc1a57bedb258a5ddec014ec9"} Apr 16 20:14:58.307512 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:58.307484 2572 generic.go:358] "Generic (PLEG): container finished" podID="be3664f6-5510-4aef-8526-652175a879a9" containerID="5fbe38743fd96c25b1e873eb214931677e1255aa148ab60b2fad019569e318bc" exitCode=0 Apr 16 20:14:58.307612 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:58.307566 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"be3664f6-5510-4aef-8526-652175a879a9","Type":"ContainerDied","Data":"5fbe38743fd96c25b1e873eb214931677e1255aa148ab60b2fad019569e318bc"} Apr 16 20:14:58.367431 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:58.367386 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-6l62g" podStartSLOduration=2.572777363 podStartE2EDuration="3.367359302s" podCreationTimestamp="2026-04-16 20:14:55 +0000 UTC" firstStartedPulling="2026-04-16 20:14:56.101461474 +0000 UTC m=+178.942361842" lastFinishedPulling="2026-04-16 20:14:56.896043413 +0000 UTC m=+179.736943781" observedRunningTime="2026-04-16 
20:14:58.328204571 +0000 UTC m=+181.169104971" watchObservedRunningTime="2026-04-16 20:14:58.367359302 +0000 UTC m=+181.208259691" Apr 16 20:14:58.758738 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:58.758711 2572 scope.go:117] "RemoveContainer" containerID="4556f4e0d733068baa40bd6dad23be734274a6c5ed5e53b0c8f36cdda531a10b" Apr 16 20:14:58.807951 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:58.807918 2572 patch_prober.go:28] interesting pod/image-registry-7f6d57ffbc-fxbrb container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 20:14:58.808072 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:58.807977 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-7f6d57ffbc-fxbrb" podUID="91700673-6298-4117-96a8-e8b9068e453b" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:14:59.312868 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:59.312840 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4pl4_80a9a559-faef-43f2-ae15-5d0a784691b5/console-operator/2.log" Apr 16 20:14:59.313476 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:59.313441 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-j4pl4" event={"ID":"80a9a559-faef-43f2-ae15-5d0a784691b5","Type":"ContainerStarted","Data":"b5a64e5d7690f4d94ba7c4532e151027d8a730ba2ca75717ecfb0799665b6cf4"} Apr 16 20:14:59.314046 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:59.314023 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-j4pl4" Apr 16 20:14:59.338842 ip-10-0-137-142 
kubenswrapper[2572]: I0416 20:14:59.338787 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-j4pl4" podStartSLOduration=45.070016906 podStartE2EDuration="47.338768473s" podCreationTimestamp="2026-04-16 20:14:12 +0000 UTC" firstStartedPulling="2026-04-16 20:14:12.857130221 +0000 UTC m=+135.698030588" lastFinishedPulling="2026-04-16 20:14:15.125881772 +0000 UTC m=+137.966782155" observedRunningTime="2026-04-16 20:14:59.336958869 +0000 UTC m=+182.177859280" watchObservedRunningTime="2026-04-16 20:14:59.338768473 +0000 UTC m=+182.179668864" Apr 16 20:14:59.929955 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:14:59.929932 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-j4pl4" Apr 16 20:15:00.246488 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:00.246461 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7f6d57ffbc-fxbrb" Apr 16 20:15:00.318954 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:00.318919 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"be3664f6-5510-4aef-8526-652175a879a9","Type":"ContainerStarted","Data":"377ab0adc3efe5adf7de1aeda3874aed90db051a5ede5aacaa57f254076c57a8"} Apr 16 20:15:00.318954 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:00.318958 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"be3664f6-5510-4aef-8526-652175a879a9","Type":"ContainerStarted","Data":"3ca91f3b5ef3d81e387aa4ce44f099dfa464e37bca0018946573611030324ea8"} Apr 16 20:15:00.319334 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:00.318973 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"be3664f6-5510-4aef-8526-652175a879a9","Type":"ContainerStarted","Data":"a60c64e244603e86d343412d59a1964a20e2b4429bd502ffdaa09878de6f20f3"} Apr 16 20:15:00.319334 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:00.318986 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"be3664f6-5510-4aef-8526-652175a879a9","Type":"ContainerStarted","Data":"08295d5230d83e6ca5fd2ebaf1284a885c8438a7a9ef75b3fd027d6946ef9295"} Apr 16 20:15:00.319334 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:00.318995 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"be3664f6-5510-4aef-8526-652175a879a9","Type":"ContainerStarted","Data":"05f9bce97d57aeaae4904f0bb098671c0dc1ba54a937cc9827b62e364fdad872"} Apr 16 20:15:01.326507 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.326475 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"be3664f6-5510-4aef-8526-652175a879a9","Type":"ContainerStarted","Data":"ec7810754ed2cb6b80fd616cf84272c07bfa9f0f6012178f4b828a120f1f7b36"} Apr 16 20:15:01.353636 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.353589 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.082850059 podStartE2EDuration="5.353575974s" podCreationTimestamp="2026-04-16 20:14:56 +0000 UTC" firstStartedPulling="2026-04-16 20:14:56.629952215 +0000 UTC m=+179.470852592" lastFinishedPulling="2026-04-16 20:15:00.900678122 +0000 UTC m=+183.741578507" observedRunningTime="2026-04-16 20:15:01.352512086 +0000 UTC m=+184.193412475" watchObservedRunningTime="2026-04-16 20:15:01.353575974 +0000 UTC m=+184.194476364" Apr 16 20:15:01.474102 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.474065 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 
20:15:01.476620 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.476603 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.479060 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.479032 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 16 20:15:01.479193 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.479172 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 16 20:15:01.479275 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.479225 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-z5d9c\""
Apr 16 20:15:01.479346 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.479328 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 16 20:15:01.479452 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.479436 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 16 20:15:01.479729 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.479701 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 16 20:15:01.479830 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.479793 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 16 20:15:01.479986 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.479852 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 16 20:15:01.479986 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.479916 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-bregfb1o3ebal\""
Apr 16 20:15:01.479986 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.479924 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 16 20:15:01.479986 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.479946 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 16 20:15:01.480426 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.480396 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 16 20:15:01.480504 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.480397 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 16 20:15:01.480504 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.480402 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 16 20:15:01.482656 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.482637 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 16 20:15:01.494961 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.494941 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 20:15:01.543476 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.543442 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnt7h\" (UniqueName: \"kubernetes.io/projected/f49013a7-9f20-4940-a6e9-ae5840b19956-kube-api-access-vnt7h\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.543476 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.543477 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f49013a7-9f20-4940-a6e9-ae5840b19956-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.543668 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.543501 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f49013a7-9f20-4940-a6e9-ae5840b19956-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.543668 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.543525 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.543668 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.543587 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.543668 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.543632 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-web-config\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.543668 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.543662 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.543817 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.543700 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.543817 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.543741 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.543817 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.543791 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f49013a7-9f20-4940-a6e9-ae5840b19956-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.543903 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.543819 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.543903 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.543841 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.543903 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.543857 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f49013a7-9f20-4940-a6e9-ae5840b19956-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.543903 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.543893 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f49013a7-9f20-4940-a6e9-ae5840b19956-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.544012 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.543930 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f49013a7-9f20-4940-a6e9-ae5840b19956-config-out\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.544012 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.543947 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-config\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.544012 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.543970 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f49013a7-9f20-4940-a6e9-ae5840b19956-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.544093 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.544012 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f49013a7-9f20-4940-a6e9-ae5840b19956-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.644994 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.644900 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.644994 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.644960 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.644994 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.644979 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f49013a7-9f20-4940-a6e9-ae5840b19956-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.645274 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.645003 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.645274 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.645021 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.645274 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.645040 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f49013a7-9f20-4940-a6e9-ae5840b19956-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.645475 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.645264 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f49013a7-9f20-4940-a6e9-ae5840b19956-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.645475 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.645316 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f49013a7-9f20-4940-a6e9-ae5840b19956-config-out\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.645475 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.645358 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-config\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.645475 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.645414 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f49013a7-9f20-4940-a6e9-ae5840b19956-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.645475 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.645449 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f49013a7-9f20-4940-a6e9-ae5840b19956-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.646257 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.645494 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vnt7h\" (UniqueName: \"kubernetes.io/projected/f49013a7-9f20-4940-a6e9-ae5840b19956-kube-api-access-vnt7h\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.646257 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.645521 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f49013a7-9f20-4940-a6e9-ae5840b19956-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.646257 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.645563 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f49013a7-9f20-4940-a6e9-ae5840b19956-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.646257 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.645600 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.646257 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.645809 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.646257 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.645833 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-web-config\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.646257 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.645857 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.646257 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.646091 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f49013a7-9f20-4940-a6e9-ae5840b19956-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.646257 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.646172 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f49013a7-9f20-4940-a6e9-ae5840b19956-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.648398 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.646739 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f49013a7-9f20-4940-a6e9-ae5840b19956-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.648398 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.648243 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.649169 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.648687 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f49013a7-9f20-4940-a6e9-ae5840b19956-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.649169 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.648787 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f49013a7-9f20-4940-a6e9-ae5840b19956-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.649169 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.648951 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.649169 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.649117 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.650329 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.649747 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.650329 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.650283 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f49013a7-9f20-4940-a6e9-ae5840b19956-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.650538 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.650513 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.650602 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.650521 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f49013a7-9f20-4940-a6e9-ae5840b19956-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.651084 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.651044 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f49013a7-9f20-4940-a6e9-ae5840b19956-config-out\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.651303 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.651280 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.651785 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.651763 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.652157 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.652135 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-config\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.652742 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.652720 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-web-config\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.656058 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.656035 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnt7h\" (UniqueName: \"kubernetes.io/projected/f49013a7-9f20-4940-a6e9-ae5840b19956-kube-api-access-vnt7h\") pod \"prometheus-k8s-0\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.788824 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.788788 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:01.930001 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:01.929901 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 20:15:01.935338 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:15:01.935303 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf49013a7_9f20_4940_a6e9_ae5840b19956.slice/crio-1c761ff3c1b5f7abfce0fcdd065db557514948037140ee37f1375602bd628c73 WatchSource:0}: Error finding container 1c761ff3c1b5f7abfce0fcdd065db557514948037140ee37f1375602bd628c73: Status 404 returned error can't find the container with id 1c761ff3c1b5f7abfce0fcdd065db557514948037140ee37f1375602bd628c73
Apr 16 20:15:02.330834 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:02.330796 2572 generic.go:358] "Generic (PLEG): container finished" podID="f49013a7-9f20-4940-a6e9-ae5840b19956" containerID="7878e02afc282d68d77ec78a1383347963727ac12e578faf717248a7099ecb4d" exitCode=0
Apr 16 20:15:02.331198 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:02.330886 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f49013a7-9f20-4940-a6e9-ae5840b19956","Type":"ContainerDied","Data":"7878e02afc282d68d77ec78a1383347963727ac12e578faf717248a7099ecb4d"}
Apr 16 20:15:02.331198 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:02.330920 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f49013a7-9f20-4940-a6e9-ae5840b19956","Type":"ContainerStarted","Data":"1c761ff3c1b5f7abfce0fcdd065db557514948037140ee37f1375602bd628c73"}
Apr 16 20:15:05.341665 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:05.341630 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f49013a7-9f20-4940-a6e9-ae5840b19956","Type":"ContainerStarted","Data":"f6111dd3c0d818c4510438ecf0cb68432001179e78d402f8228acee3916aec1c"}
Apr 16 20:15:05.341665 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:05.341666 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f49013a7-9f20-4940-a6e9-ae5840b19956","Type":"ContainerStarted","Data":"464f6392b9b3d1b3a549ce3e5a057d83bee4189c140f03c94d336fca2c82261b"}
Apr 16 20:15:07.350716 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:07.350680 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f49013a7-9f20-4940-a6e9-ae5840b19956","Type":"ContainerStarted","Data":"0aff502b304621f13b8a0970a733c3746a1d49b7c9cf8632b27669e81f5029ca"}
Apr 16 20:15:07.350716 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:07.350717 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f49013a7-9f20-4940-a6e9-ae5840b19956","Type":"ContainerStarted","Data":"4da116b212b0264f3d19242a6eb9620907c6891863f95a000836f5b99589ffa6"}
Apr 16 20:15:07.351173 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:07.350730 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f49013a7-9f20-4940-a6e9-ae5840b19956","Type":"ContainerStarted","Data":"56873d93686b56db0db9a2555c70f4636c5ba714fd97e27c4421a28ba7bb524b"}
Apr 16 20:15:07.351173 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:07.350742 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f49013a7-9f20-4940-a6e9-ae5840b19956","Type":"ContainerStarted","Data":"2cb7869d4b515e68bd160f5c25a92059397e7859b67f20a118e16164d755ae87"}
Apr 16 20:15:07.390347 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:07.390301 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.120752624 podStartE2EDuration="6.390284062s" podCreationTimestamp="2026-04-16 20:15:01 +0000 UTC" firstStartedPulling="2026-04-16 20:15:02.332010253 +0000 UTC m=+185.172910621" lastFinishedPulling="2026-04-16 20:15:06.601541692 +0000 UTC m=+189.442442059" observedRunningTime="2026-04-16 20:15:07.38850696 +0000 UTC m=+190.229407352" watchObservedRunningTime="2026-04-16 20:15:07.390284062 +0000 UTC m=+190.231184453"
Apr 16 20:15:11.789543 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:11.789509 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:36.428703 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:36.428667 2572 generic.go:358] "Generic (PLEG): container finished" podID="ae3fadf5-e6f2-4567-ac33-4eb8cbd2ef9e" containerID="1aeb307a2048ac34d975df222c47d9169a442fb7c360752973e554aea134e227" exitCode=0
Apr 16 20:15:36.429235 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:36.428717 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t2b77" event={"ID":"ae3fadf5-e6f2-4567-ac33-4eb8cbd2ef9e","Type":"ContainerDied","Data":"1aeb307a2048ac34d975df222c47d9169a442fb7c360752973e554aea134e227"}
Apr 16 20:15:36.429235 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:36.428998 2572 scope.go:117] "RemoveContainer" containerID="1aeb307a2048ac34d975df222c47d9169a442fb7c360752973e554aea134e227"
Apr 16 20:15:37.432918 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:15:37.432886 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t2b77" event={"ID":"ae3fadf5-e6f2-4567-ac33-4eb8cbd2ef9e","Type":"ContainerStarted","Data":"6ded7ef5f23d7fcb12eee4f5c8aa8e2d86db13cb2eea06c5e8b171b2ea55d6f9"}
Apr 16 20:16:01.789566 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:01.789531 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:01.808810 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:01.808785 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:02.513643 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:02.513607 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:09.492988 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:09.492951 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d285ba82-dded-4707-87cb-35b755280286-metrics-certs\") pod \"network-metrics-daemon-622d4\" (UID: \"d285ba82-dded-4707-87cb-35b755280286\") " pod="openshift-multus/network-metrics-daemon-622d4"
Apr 16 20:16:09.495434 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:09.495410 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d285ba82-dded-4707-87cb-35b755280286-metrics-certs\") pod \"network-metrics-daemon-622d4\" (UID: \"d285ba82-dded-4707-87cb-35b755280286\") " pod="openshift-multus/network-metrics-daemon-622d4"
Apr 16 20:16:09.661738 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:09.661710 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-q2prt\""
Apr 16 20:16:09.669872 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:09.669841 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-622d4"
Apr 16 20:16:09.790295 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:09.790264 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-622d4"]
Apr 16 20:16:10.522991 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:10.522957 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-622d4" event={"ID":"d285ba82-dded-4707-87cb-35b755280286","Type":"ContainerStarted","Data":"6261c1e8fe6f5a77e8ffb6d208286006226e4c2454aab00bde6468a1a7f2ee04"}
Apr 16 20:16:11.527623 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:11.527589 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-622d4" event={"ID":"d285ba82-dded-4707-87cb-35b755280286","Type":"ContainerStarted","Data":"7f1915f413d3198c7288a9811739e27d8a5a1779d42a8d3eac2e898d7d604495"}
Apr 16 20:16:11.527623 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:11.527624 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-622d4" event={"ID":"d285ba82-dded-4707-87cb-35b755280286","Type":"ContainerStarted","Data":"aae91c16fb642b2a7c5b5a1005a1066eb5f5db4f3cee5b1997f6a826900d9e99"}
Apr 16 20:16:11.547080 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:11.547022 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-622d4" podStartSLOduration=253.512274207 podStartE2EDuration="4m14.547004637s" podCreationTimestamp="2026-04-16 20:11:57 +0000 UTC" firstStartedPulling="2026-04-16 20:16:09.797759767 +0000 UTC m=+252.638660135" lastFinishedPulling="2026-04-16 20:16:10.832490188 +0000 UTC m=+253.673390565" observedRunningTime="2026-04-16 20:16:11.544795578 +0000 UTC m=+254.385695980" watchObservedRunningTime="2026-04-16 20:16:11.547004637 +0000 UTC m=+254.387905028"
Apr 16 20:16:15.542725 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:15.542691 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 20:16:15.543307 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:15.543277 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="be3664f6-5510-4aef-8526-652175a879a9" containerName="alertmanager" containerID="cri-o://05f9bce97d57aeaae4904f0bb098671c0dc1ba54a937cc9827b62e364fdad872" gracePeriod=120
Apr 16 20:16:15.543423 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:15.543335 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="be3664f6-5510-4aef-8526-652175a879a9" containerName="kube-rbac-proxy-metric" containerID="cri-o://377ab0adc3efe5adf7de1aeda3874aed90db051a5ede5aacaa57f254076c57a8" gracePeriod=120
Apr 16 20:16:15.543423 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:15.543357 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="be3664f6-5510-4aef-8526-652175a879a9" containerName="kube-rbac-proxy" containerID="cri-o://3ca91f3b5ef3d81e387aa4ce44f099dfa464e37bca0018946573611030324ea8" gracePeriod=120
Apr 16 20:16:15.543423 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:15.543366 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="be3664f6-5510-4aef-8526-652175a879a9" containerName="kube-rbac-proxy-web" containerID="cri-o://a60c64e244603e86d343412d59a1964a20e2b4429bd502ffdaa09878de6f20f3" gracePeriod=120
Apr 16 20:16:15.543586 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:15.543438 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="be3664f6-5510-4aef-8526-652175a879a9" containerName="config-reloader" containerID="cri-o://08295d5230d83e6ca5fd2ebaf1284a885c8438a7a9ef75b3fd027d6946ef9295" gracePeriod=120
Apr 16 20:16:15.543586 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:15.543452 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="be3664f6-5510-4aef-8526-652175a879a9" containerName="prom-label-proxy" containerID="cri-o://ec7810754ed2cb6b80fd616cf84272c07bfa9f0f6012178f4b828a120f1f7b36" gracePeriod=120
Apr 16 20:16:16.543332 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.543247 2572 generic.go:358] "Generic (PLEG): container finished" podID="be3664f6-5510-4aef-8526-652175a879a9" containerID="ec7810754ed2cb6b80fd616cf84272c07bfa9f0f6012178f4b828a120f1f7b36" exitCode=0
Apr 16 20:16:16.543332 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.543281 2572 generic.go:358] "Generic (PLEG): container finished" podID="be3664f6-5510-4aef-8526-652175a879a9" containerID="377ab0adc3efe5adf7de1aeda3874aed90db051a5ede5aacaa57f254076c57a8" exitCode=0
Apr 16 20:16:16.543332 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.543289 2572 generic.go:358] "Generic (PLEG): container finished" podID="be3664f6-5510-4aef-8526-652175a879a9" containerID="3ca91f3b5ef3d81e387aa4ce44f099dfa464e37bca0018946573611030324ea8" exitCode=0
Apr 16 20:16:16.543332 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.543295 2572 generic.go:358] "Generic (PLEG): container finished" podID="be3664f6-5510-4aef-8526-652175a879a9" containerID="08295d5230d83e6ca5fd2ebaf1284a885c8438a7a9ef75b3fd027d6946ef9295" exitCode=0
Apr 16 20:16:16.543332 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.543300 2572 generic.go:358] "Generic (PLEG): container finished" podID="be3664f6-5510-4aef-8526-652175a879a9" containerID="05f9bce97d57aeaae4904f0bb098671c0dc1ba54a937cc9827b62e364fdad872" exitCode=0
Apr 16 20:16:16.543802 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.543331 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"be3664f6-5510-4aef-8526-652175a879a9","Type":"ContainerDied","Data":"ec7810754ed2cb6b80fd616cf84272c07bfa9f0f6012178f4b828a120f1f7b36"}
Apr 16 20:16:16.543802 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.543369 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"be3664f6-5510-4aef-8526-652175a879a9","Type":"ContainerDied","Data":"377ab0adc3efe5adf7de1aeda3874aed90db051a5ede5aacaa57f254076c57a8"}
Apr 16 20:16:16.543802 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.543399 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"be3664f6-5510-4aef-8526-652175a879a9","Type":"ContainerDied","Data":"3ca91f3b5ef3d81e387aa4ce44f099dfa464e37bca0018946573611030324ea8"}
Apr 16 20:16:16.543802 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.543408 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"be3664f6-5510-4aef-8526-652175a879a9","Type":"ContainerDied","Data":"08295d5230d83e6ca5fd2ebaf1284a885c8438a7a9ef75b3fd027d6946ef9295"}
Apr 16 20:16:16.543802 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.543417 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"be3664f6-5510-4aef-8526-652175a879a9","Type":"ContainerDied","Data":"05f9bce97d57aeaae4904f0bb098671c0dc1ba54a937cc9827b62e364fdad872"}
Apr 16 20:16:16.795158 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.795099 2572 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:16:16.856748 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.856721 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2nbv\" (UniqueName: \"kubernetes.io/projected/be3664f6-5510-4aef-8526-652175a879a9-kube-api-access-d2nbv\") pod \"be3664f6-5510-4aef-8526-652175a879a9\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " Apr 16 20:16:16.856926 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.856763 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/be3664f6-5510-4aef-8526-652175a879a9-metrics-client-ca\") pod \"be3664f6-5510-4aef-8526-652175a879a9\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " Apr 16 20:16:16.856926 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.856802 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/be3664f6-5510-4aef-8526-652175a879a9-web-config\") pod \"be3664f6-5510-4aef-8526-652175a879a9\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " Apr 16 20:16:16.856926 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.856831 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/be3664f6-5510-4aef-8526-652175a879a9-alertmanager-main-db\") pod \"be3664f6-5510-4aef-8526-652175a879a9\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " Apr 16 20:16:16.856926 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.856855 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/be3664f6-5510-4aef-8526-652175a879a9-cluster-tls-config\") pod \"be3664f6-5510-4aef-8526-652175a879a9\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " Apr 16 20:16:16.856926 
ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.856880 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be3664f6-5510-4aef-8526-652175a879a9-alertmanager-trusted-ca-bundle\") pod \"be3664f6-5510-4aef-8526-652175a879a9\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " Apr 16 20:16:16.857173 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.856925 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/be3664f6-5510-4aef-8526-652175a879a9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"be3664f6-5510-4aef-8526-652175a879a9\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " Apr 16 20:16:16.857173 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.856964 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/be3664f6-5510-4aef-8526-652175a879a9-secret-alertmanager-kube-rbac-proxy\") pod \"be3664f6-5510-4aef-8526-652175a879a9\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " Apr 16 20:16:16.857173 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.856997 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/be3664f6-5510-4aef-8526-652175a879a9-config-out\") pod \"be3664f6-5510-4aef-8526-652175a879a9\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " Apr 16 20:16:16.857173 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.857031 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/be3664f6-5510-4aef-8526-652175a879a9-tls-assets\") pod \"be3664f6-5510-4aef-8526-652175a879a9\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " Apr 16 20:16:16.857173 ip-10-0-137-142 
kubenswrapper[2572]: I0416 20:16:16.857062 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/be3664f6-5510-4aef-8526-652175a879a9-secret-alertmanager-kube-rbac-proxy-web\") pod \"be3664f6-5510-4aef-8526-652175a879a9\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " Apr 16 20:16:16.857173 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.857106 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/be3664f6-5510-4aef-8526-652175a879a9-secret-alertmanager-main-tls\") pod \"be3664f6-5510-4aef-8526-652175a879a9\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " Apr 16 20:16:16.857173 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.857130 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/be3664f6-5510-4aef-8526-652175a879a9-config-volume\") pod \"be3664f6-5510-4aef-8526-652175a879a9\" (UID: \"be3664f6-5510-4aef-8526-652175a879a9\") " Apr 16 20:16:16.857546 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.857210 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be3664f6-5510-4aef-8526-652175a879a9-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "be3664f6-5510-4aef-8526-652175a879a9" (UID: "be3664f6-5510-4aef-8526-652175a879a9"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:16:16.857691 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.857660 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be3664f6-5510-4aef-8526-652175a879a9-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "be3664f6-5510-4aef-8526-652175a879a9" (UID: "be3664f6-5510-4aef-8526-652175a879a9"). 
InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:16:16.857827 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.857786 2572 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/be3664f6-5510-4aef-8526-652175a879a9-metrics-client-ca\") on node \"ip-10-0-137-142.ec2.internal\" DevicePath \"\"" Apr 16 20:16:16.857827 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.857809 2572 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/be3664f6-5510-4aef-8526-652175a879a9-alertmanager-main-db\") on node \"ip-10-0-137-142.ec2.internal\" DevicePath \"\"" Apr 16 20:16:16.858471 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.858441 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be3664f6-5510-4aef-8526-652175a879a9-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "be3664f6-5510-4aef-8526-652175a879a9" (UID: "be3664f6-5510-4aef-8526-652175a879a9"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:16:16.860509 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.860474 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be3664f6-5510-4aef-8526-652175a879a9-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "be3664f6-5510-4aef-8526-652175a879a9" (UID: "be3664f6-5510-4aef-8526-652175a879a9"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:16:16.861030 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.860989 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be3664f6-5510-4aef-8526-652175a879a9-kube-api-access-d2nbv" (OuterVolumeSpecName: "kube-api-access-d2nbv") pod "be3664f6-5510-4aef-8526-652175a879a9" (UID: "be3664f6-5510-4aef-8526-652175a879a9"). InnerVolumeSpecName "kube-api-access-d2nbv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:16:16.861781 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.861739 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be3664f6-5510-4aef-8526-652175a879a9-config-out" (OuterVolumeSpecName: "config-out") pod "be3664f6-5510-4aef-8526-652175a879a9" (UID: "be3664f6-5510-4aef-8526-652175a879a9"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:16:16.861781 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.861755 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be3664f6-5510-4aef-8526-652175a879a9-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "be3664f6-5510-4aef-8526-652175a879a9" (UID: "be3664f6-5510-4aef-8526-652175a879a9"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:16:16.862008 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.861940 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be3664f6-5510-4aef-8526-652175a879a9-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "be3664f6-5510-4aef-8526-652175a879a9" (UID: "be3664f6-5510-4aef-8526-652175a879a9"). InnerVolumeSpecName "secret-alertmanager-main-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:16:16.862124 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.862059 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be3664f6-5510-4aef-8526-652175a879a9-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "be3664f6-5510-4aef-8526-652175a879a9" (UID: "be3664f6-5510-4aef-8526-652175a879a9"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:16:16.862755 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.862735 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be3664f6-5510-4aef-8526-652175a879a9-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "be3664f6-5510-4aef-8526-652175a879a9" (UID: "be3664f6-5510-4aef-8526-652175a879a9"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:16:16.863356 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.863328 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be3664f6-5510-4aef-8526-652175a879a9-config-volume" (OuterVolumeSpecName: "config-volume") pod "be3664f6-5510-4aef-8526-652175a879a9" (UID: "be3664f6-5510-4aef-8526-652175a879a9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:16:16.864834 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.864812 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be3664f6-5510-4aef-8526-652175a879a9-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "be3664f6-5510-4aef-8526-652175a879a9" (UID: "be3664f6-5510-4aef-8526-652175a879a9"). InnerVolumeSpecName "cluster-tls-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:16:16.871194 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.871174 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be3664f6-5510-4aef-8526-652175a879a9-web-config" (OuterVolumeSpecName: "web-config") pod "be3664f6-5510-4aef-8526-652175a879a9" (UID: "be3664f6-5510-4aef-8526-652175a879a9"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:16:16.958721 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.958687 2572 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/be3664f6-5510-4aef-8526-652175a879a9-web-config\") on node \"ip-10-0-137-142.ec2.internal\" DevicePath \"\"" Apr 16 20:16:16.958721 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.958717 2572 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/be3664f6-5510-4aef-8526-652175a879a9-cluster-tls-config\") on node \"ip-10-0-137-142.ec2.internal\" DevicePath \"\"" Apr 16 20:16:16.958721 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.958729 2572 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be3664f6-5510-4aef-8526-652175a879a9-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-137-142.ec2.internal\" DevicePath \"\"" Apr 16 20:16:16.958938 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.958740 2572 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/be3664f6-5510-4aef-8526-652175a879a9-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-137-142.ec2.internal\" DevicePath \"\"" Apr 16 20:16:16.958938 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.958751 2572 reconciler_common.go:299] "Volume detached for volume 
\"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/be3664f6-5510-4aef-8526-652175a879a9-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-137-142.ec2.internal\" DevicePath \"\"" Apr 16 20:16:16.958938 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.958760 2572 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/be3664f6-5510-4aef-8526-652175a879a9-config-out\") on node \"ip-10-0-137-142.ec2.internal\" DevicePath \"\"" Apr 16 20:16:16.958938 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.958768 2572 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/be3664f6-5510-4aef-8526-652175a879a9-tls-assets\") on node \"ip-10-0-137-142.ec2.internal\" DevicePath \"\"" Apr 16 20:16:16.958938 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.958776 2572 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/be3664f6-5510-4aef-8526-652175a879a9-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-137-142.ec2.internal\" DevicePath \"\"" Apr 16 20:16:16.958938 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.958785 2572 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/be3664f6-5510-4aef-8526-652175a879a9-secret-alertmanager-main-tls\") on node \"ip-10-0-137-142.ec2.internal\" DevicePath \"\"" Apr 16 20:16:16.958938 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.958793 2572 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/be3664f6-5510-4aef-8526-652175a879a9-config-volume\") on node \"ip-10-0-137-142.ec2.internal\" DevicePath \"\"" Apr 16 20:16:16.958938 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:16.958802 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d2nbv\" 
(UniqueName: \"kubernetes.io/projected/be3664f6-5510-4aef-8526-652175a879a9-kube-api-access-d2nbv\") on node \"ip-10-0-137-142.ec2.internal\" DevicePath \"\"" Apr 16 20:16:17.549450 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.549419 2572 generic.go:358] "Generic (PLEG): container finished" podID="be3664f6-5510-4aef-8526-652175a879a9" containerID="a60c64e244603e86d343412d59a1964a20e2b4429bd502ffdaa09878de6f20f3" exitCode=0 Apr 16 20:16:17.549846 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.549455 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"be3664f6-5510-4aef-8526-652175a879a9","Type":"ContainerDied","Data":"a60c64e244603e86d343412d59a1964a20e2b4429bd502ffdaa09878de6f20f3"} Apr 16 20:16:17.549846 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.549477 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"be3664f6-5510-4aef-8526-652175a879a9","Type":"ContainerDied","Data":"6efd6787ffeab02e1c39520ff3d7b8636fc3e53d307adae0f92c204e2b451913"} Apr 16 20:16:17.549846 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.549493 2572 scope.go:117] "RemoveContainer" containerID="ec7810754ed2cb6b80fd616cf84272c07bfa9f0f6012178f4b828a120f1f7b36" Apr 16 20:16:17.549846 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.549524 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:16:17.559842 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.559824 2572 scope.go:117] "RemoveContainer" containerID="377ab0adc3efe5adf7de1aeda3874aed90db051a5ede5aacaa57f254076c57a8" Apr 16 20:16:17.566912 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.566890 2572 scope.go:117] "RemoveContainer" containerID="3ca91f3b5ef3d81e387aa4ce44f099dfa464e37bca0018946573611030324ea8" Apr 16 20:16:17.573080 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.573062 2572 scope.go:117] "RemoveContainer" containerID="a60c64e244603e86d343412d59a1964a20e2b4429bd502ffdaa09878de6f20f3" Apr 16 20:16:17.575884 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.575866 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 20:16:17.580292 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.580268 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 20:16:17.581052 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.581037 2572 scope.go:117] "RemoveContainer" containerID="08295d5230d83e6ca5fd2ebaf1284a885c8438a7a9ef75b3fd027d6946ef9295" Apr 16 20:16:17.587352 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.587323 2572 scope.go:117] "RemoveContainer" containerID="05f9bce97d57aeaae4904f0bb098671c0dc1ba54a937cc9827b62e364fdad872" Apr 16 20:16:17.593581 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.593565 2572 scope.go:117] "RemoveContainer" containerID="5fbe38743fd96c25b1e873eb214931677e1255aa148ab60b2fad019569e318bc" Apr 16 20:16:17.599592 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.599568 2572 scope.go:117] "RemoveContainer" containerID="ec7810754ed2cb6b80fd616cf84272c07bfa9f0f6012178f4b828a120f1f7b36" Apr 16 20:16:17.599842 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:16:17.599825 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"ec7810754ed2cb6b80fd616cf84272c07bfa9f0f6012178f4b828a120f1f7b36\": container with ID starting with ec7810754ed2cb6b80fd616cf84272c07bfa9f0f6012178f4b828a120f1f7b36 not found: ID does not exist" containerID="ec7810754ed2cb6b80fd616cf84272c07bfa9f0f6012178f4b828a120f1f7b36" Apr 16 20:16:17.599896 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.599850 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec7810754ed2cb6b80fd616cf84272c07bfa9f0f6012178f4b828a120f1f7b36"} err="failed to get container status \"ec7810754ed2cb6b80fd616cf84272c07bfa9f0f6012178f4b828a120f1f7b36\": rpc error: code = NotFound desc = could not find container \"ec7810754ed2cb6b80fd616cf84272c07bfa9f0f6012178f4b828a120f1f7b36\": container with ID starting with ec7810754ed2cb6b80fd616cf84272c07bfa9f0f6012178f4b828a120f1f7b36 not found: ID does not exist" Apr 16 20:16:17.599896 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.599882 2572 scope.go:117] "RemoveContainer" containerID="377ab0adc3efe5adf7de1aeda3874aed90db051a5ede5aacaa57f254076c57a8" Apr 16 20:16:17.600090 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:16:17.600072 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"377ab0adc3efe5adf7de1aeda3874aed90db051a5ede5aacaa57f254076c57a8\": container with ID starting with 377ab0adc3efe5adf7de1aeda3874aed90db051a5ede5aacaa57f254076c57a8 not found: ID does not exist" containerID="377ab0adc3efe5adf7de1aeda3874aed90db051a5ede5aacaa57f254076c57a8" Apr 16 20:16:17.600129 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.600096 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"377ab0adc3efe5adf7de1aeda3874aed90db051a5ede5aacaa57f254076c57a8"} err="failed to get container status \"377ab0adc3efe5adf7de1aeda3874aed90db051a5ede5aacaa57f254076c57a8\": rpc error: code = NotFound desc 
= could not find container \"377ab0adc3efe5adf7de1aeda3874aed90db051a5ede5aacaa57f254076c57a8\": container with ID starting with 377ab0adc3efe5adf7de1aeda3874aed90db051a5ede5aacaa57f254076c57a8 not found: ID does not exist" Apr 16 20:16:17.600129 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.600113 2572 scope.go:117] "RemoveContainer" containerID="3ca91f3b5ef3d81e387aa4ce44f099dfa464e37bca0018946573611030324ea8" Apr 16 20:16:17.600308 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:16:17.600293 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ca91f3b5ef3d81e387aa4ce44f099dfa464e37bca0018946573611030324ea8\": container with ID starting with 3ca91f3b5ef3d81e387aa4ce44f099dfa464e37bca0018946573611030324ea8 not found: ID does not exist" containerID="3ca91f3b5ef3d81e387aa4ce44f099dfa464e37bca0018946573611030324ea8" Apr 16 20:16:17.600349 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.600312 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ca91f3b5ef3d81e387aa4ce44f099dfa464e37bca0018946573611030324ea8"} err="failed to get container status \"3ca91f3b5ef3d81e387aa4ce44f099dfa464e37bca0018946573611030324ea8\": rpc error: code = NotFound desc = could not find container \"3ca91f3b5ef3d81e387aa4ce44f099dfa464e37bca0018946573611030324ea8\": container with ID starting with 3ca91f3b5ef3d81e387aa4ce44f099dfa464e37bca0018946573611030324ea8 not found: ID does not exist" Apr 16 20:16:17.600349 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.600325 2572 scope.go:117] "RemoveContainer" containerID="a60c64e244603e86d343412d59a1964a20e2b4429bd502ffdaa09878de6f20f3" Apr 16 20:16:17.600537 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:16:17.600520 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a60c64e244603e86d343412d59a1964a20e2b4429bd502ffdaa09878de6f20f3\": 
container with ID starting with a60c64e244603e86d343412d59a1964a20e2b4429bd502ffdaa09878de6f20f3 not found: ID does not exist" containerID="a60c64e244603e86d343412d59a1964a20e2b4429bd502ffdaa09878de6f20f3" Apr 16 20:16:17.600595 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.600544 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a60c64e244603e86d343412d59a1964a20e2b4429bd502ffdaa09878de6f20f3"} err="failed to get container status \"a60c64e244603e86d343412d59a1964a20e2b4429bd502ffdaa09878de6f20f3\": rpc error: code = NotFound desc = could not find container \"a60c64e244603e86d343412d59a1964a20e2b4429bd502ffdaa09878de6f20f3\": container with ID starting with a60c64e244603e86d343412d59a1964a20e2b4429bd502ffdaa09878de6f20f3 not found: ID does not exist" Apr 16 20:16:17.600595 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.600557 2572 scope.go:117] "RemoveContainer" containerID="08295d5230d83e6ca5fd2ebaf1284a885c8438a7a9ef75b3fd027d6946ef9295" Apr 16 20:16:17.600783 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:16:17.600767 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08295d5230d83e6ca5fd2ebaf1284a885c8438a7a9ef75b3fd027d6946ef9295\": container with ID starting with 08295d5230d83e6ca5fd2ebaf1284a885c8438a7a9ef75b3fd027d6946ef9295 not found: ID does not exist" containerID="08295d5230d83e6ca5fd2ebaf1284a885c8438a7a9ef75b3fd027d6946ef9295" Apr 16 20:16:17.600816 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.600786 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08295d5230d83e6ca5fd2ebaf1284a885c8438a7a9ef75b3fd027d6946ef9295"} err="failed to get container status \"08295d5230d83e6ca5fd2ebaf1284a885c8438a7a9ef75b3fd027d6946ef9295\": rpc error: code = NotFound desc = could not find container \"08295d5230d83e6ca5fd2ebaf1284a885c8438a7a9ef75b3fd027d6946ef9295\": container with 
ID starting with 08295d5230d83e6ca5fd2ebaf1284a885c8438a7a9ef75b3fd027d6946ef9295 not found: ID does not exist"
Apr 16 20:16:17.600816 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.600800 2572 scope.go:117] "RemoveContainer" containerID="05f9bce97d57aeaae4904f0bb098671c0dc1ba54a937cc9827b62e364fdad872"
Apr 16 20:16:17.601009 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:16:17.600991 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05f9bce97d57aeaae4904f0bb098671c0dc1ba54a937cc9827b62e364fdad872\": container with ID starting with 05f9bce97d57aeaae4904f0bb098671c0dc1ba54a937cc9827b62e364fdad872 not found: ID does not exist" containerID="05f9bce97d57aeaae4904f0bb098671c0dc1ba54a937cc9827b62e364fdad872"
Apr 16 20:16:17.601120 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.601010 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05f9bce97d57aeaae4904f0bb098671c0dc1ba54a937cc9827b62e364fdad872"} err="failed to get container status \"05f9bce97d57aeaae4904f0bb098671c0dc1ba54a937cc9827b62e364fdad872\": rpc error: code = NotFound desc = could not find container \"05f9bce97d57aeaae4904f0bb098671c0dc1ba54a937cc9827b62e364fdad872\": container with ID starting with 05f9bce97d57aeaae4904f0bb098671c0dc1ba54a937cc9827b62e364fdad872 not found: ID does not exist"
Apr 16 20:16:17.601120 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.601023 2572 scope.go:117] "RemoveContainer" containerID="5fbe38743fd96c25b1e873eb214931677e1255aa148ab60b2fad019569e318bc"
Apr 16 20:16:17.601309 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:16:17.601294 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fbe38743fd96c25b1e873eb214931677e1255aa148ab60b2fad019569e318bc\": container with ID starting with 5fbe38743fd96c25b1e873eb214931677e1255aa148ab60b2fad019569e318bc not found: ID does not exist" containerID="5fbe38743fd96c25b1e873eb214931677e1255aa148ab60b2fad019569e318bc"
Apr 16 20:16:17.601354 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.601312 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fbe38743fd96c25b1e873eb214931677e1255aa148ab60b2fad019569e318bc"} err="failed to get container status \"5fbe38743fd96c25b1e873eb214931677e1255aa148ab60b2fad019569e318bc\": rpc error: code = NotFound desc = could not find container \"5fbe38743fd96c25b1e873eb214931677e1255aa148ab60b2fad019569e318bc\": container with ID starting with 5fbe38743fd96c25b1e873eb214931677e1255aa148ab60b2fad019569e318bc not found: ID does not exist"
Apr 16 20:16:17.611873 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.611849 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 20:16:17.612195 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.612179 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="be3664f6-5510-4aef-8526-652175a879a9" containerName="kube-rbac-proxy"
Apr 16 20:16:17.612244 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.612197 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="be3664f6-5510-4aef-8526-652175a879a9" containerName="kube-rbac-proxy"
Apr 16 20:16:17.612244 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.612208 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="be3664f6-5510-4aef-8526-652175a879a9" containerName="config-reloader"
Apr 16 20:16:17.612244 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.612215 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="be3664f6-5510-4aef-8526-652175a879a9" containerName="config-reloader"
Apr 16 20:16:17.612244 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.612223 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="be3664f6-5510-4aef-8526-652175a879a9" containerName="kube-rbac-proxy-web"
Apr 16 20:16:17.612244 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.612229 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="be3664f6-5510-4aef-8526-652175a879a9" containerName="kube-rbac-proxy-web"
Apr 16 20:16:17.612244 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.612242 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="be3664f6-5510-4aef-8526-652175a879a9" containerName="alertmanager"
Apr 16 20:16:17.612244 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.612247 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="be3664f6-5510-4aef-8526-652175a879a9" containerName="alertmanager"
Apr 16 20:16:17.612463 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.612256 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="be3664f6-5510-4aef-8526-652175a879a9" containerName="init-config-reloader"
Apr 16 20:16:17.612463 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.612261 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="be3664f6-5510-4aef-8526-652175a879a9" containerName="init-config-reloader"
Apr 16 20:16:17.612463 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.612268 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="be3664f6-5510-4aef-8526-652175a879a9" containerName="prom-label-proxy"
Apr 16 20:16:17.612463 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.612273 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="be3664f6-5510-4aef-8526-652175a879a9" containerName="prom-label-proxy"
Apr 16 20:16:17.612463 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.612280 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="be3664f6-5510-4aef-8526-652175a879a9" containerName="kube-rbac-proxy-metric"
Apr 16 20:16:17.612463 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.612285 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="be3664f6-5510-4aef-8526-652175a879a9" containerName="kube-rbac-proxy-metric"
Apr 16 20:16:17.612463 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.612325 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="be3664f6-5510-4aef-8526-652175a879a9" containerName="kube-rbac-proxy-web"
Apr 16 20:16:17.612463 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.612333 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="be3664f6-5510-4aef-8526-652175a879a9" containerName="kube-rbac-proxy"
Apr 16 20:16:17.612463 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.612340 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="be3664f6-5510-4aef-8526-652175a879a9" containerName="config-reloader"
Apr 16 20:16:17.612463 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.612346 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="be3664f6-5510-4aef-8526-652175a879a9" containerName="prom-label-proxy"
Apr 16 20:16:17.612463 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.612353 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="be3664f6-5510-4aef-8526-652175a879a9" containerName="alertmanager"
Apr 16 20:16:17.612463 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.612358 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="be3664f6-5510-4aef-8526-652175a879a9" containerName="kube-rbac-proxy-metric"
Apr 16 20:16:17.615433 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.615420 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:17.618809 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.618781 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 16 20:16:17.618909 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.618825 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 16 20:16:17.618909 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.618861 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 16 20:16:17.619071 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.619055 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 16 20:16:17.619155 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.619137 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 16 20:16:17.619259 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.619242 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 16 20:16:17.619315 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.619293 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 16 20:16:17.619361 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.619348 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 16 20:16:17.619420 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.619350 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-6d8cl\""
Apr 16 20:16:17.626191 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.626173 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 16 20:16:17.630816 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.630795 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 20:16:17.663251 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.663224 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c5bf8133-b462-4194-98ad-2c8df9714e07-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c5bf8133-b462-4194-98ad-2c8df9714e07\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:17.663349 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.663255 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c5bf8133-b462-4194-98ad-2c8df9714e07-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c5bf8133-b462-4194-98ad-2c8df9714e07\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:17.663349 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.663278 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c5bf8133-b462-4194-98ad-2c8df9714e07-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c5bf8133-b462-4194-98ad-2c8df9714e07\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:17.663349 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.663305 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c5bf8133-b462-4194-98ad-2c8df9714e07-config-out\") pod \"alertmanager-main-0\" (UID: \"c5bf8133-b462-4194-98ad-2c8df9714e07\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:17.663533 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.663357 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c5bf8133-b462-4194-98ad-2c8df9714e07-web-config\") pod \"alertmanager-main-0\" (UID: \"c5bf8133-b462-4194-98ad-2c8df9714e07\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:17.663533 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.663442 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c5bf8133-b462-4194-98ad-2c8df9714e07-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c5bf8133-b462-4194-98ad-2c8df9714e07\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:17.663533 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.663468 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c5bf8133-b462-4194-98ad-2c8df9714e07-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c5bf8133-b462-4194-98ad-2c8df9714e07\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:17.663533 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.663485 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brd98\" (UniqueName: \"kubernetes.io/projected/c5bf8133-b462-4194-98ad-2c8df9714e07-kube-api-access-brd98\") pod \"alertmanager-main-0\" (UID: \"c5bf8133-b462-4194-98ad-2c8df9714e07\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:17.663533 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.663515 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c5bf8133-b462-4194-98ad-2c8df9714e07-config-volume\") pod \"alertmanager-main-0\" (UID: \"c5bf8133-b462-4194-98ad-2c8df9714e07\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:17.663763 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.663539 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c5bf8133-b462-4194-98ad-2c8df9714e07-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c5bf8133-b462-4194-98ad-2c8df9714e07\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:17.663763 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.663608 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c5bf8133-b462-4194-98ad-2c8df9714e07-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c5bf8133-b462-4194-98ad-2c8df9714e07\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:17.663763 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.663632 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c5bf8133-b462-4194-98ad-2c8df9714e07-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c5bf8133-b462-4194-98ad-2c8df9714e07\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:17.663763 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.663659 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5bf8133-b462-4194-98ad-2c8df9714e07-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c5bf8133-b462-4194-98ad-2c8df9714e07\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:17.764878 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.764843 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c5bf8133-b462-4194-98ad-2c8df9714e07-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c5bf8133-b462-4194-98ad-2c8df9714e07\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:17.765063 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.764892 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c5bf8133-b462-4194-98ad-2c8df9714e07-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c5bf8133-b462-4194-98ad-2c8df9714e07\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:17.765063 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.764956 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-brd98\" (UniqueName: \"kubernetes.io/projected/c5bf8133-b462-4194-98ad-2c8df9714e07-kube-api-access-brd98\") pod \"alertmanager-main-0\" (UID: \"c5bf8133-b462-4194-98ad-2c8df9714e07\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:17.765063 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.764990 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c5bf8133-b462-4194-98ad-2c8df9714e07-config-volume\") pod \"alertmanager-main-0\" (UID: \"c5bf8133-b462-4194-98ad-2c8df9714e07\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:17.765063 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.765031 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c5bf8133-b462-4194-98ad-2c8df9714e07-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c5bf8133-b462-4194-98ad-2c8df9714e07\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:17.765424 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.765390 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c5bf8133-b462-4194-98ad-2c8df9714e07-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c5bf8133-b462-4194-98ad-2c8df9714e07\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:17.765497 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.765440 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c5bf8133-b462-4194-98ad-2c8df9714e07-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c5bf8133-b462-4194-98ad-2c8df9714e07\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:17.765497 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.765468 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5bf8133-b462-4194-98ad-2c8df9714e07-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c5bf8133-b462-4194-98ad-2c8df9714e07\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:17.765600 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.765516 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c5bf8133-b462-4194-98ad-2c8df9714e07-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c5bf8133-b462-4194-98ad-2c8df9714e07\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:17.765600 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.765545 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c5bf8133-b462-4194-98ad-2c8df9714e07-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c5bf8133-b462-4194-98ad-2c8df9714e07\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:17.765600 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.765573 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c5bf8133-b462-4194-98ad-2c8df9714e07-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c5bf8133-b462-4194-98ad-2c8df9714e07\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:17.765752 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.765618 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c5bf8133-b462-4194-98ad-2c8df9714e07-config-out\") pod \"alertmanager-main-0\" (UID: \"c5bf8133-b462-4194-98ad-2c8df9714e07\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:17.765752 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.765643 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c5bf8133-b462-4194-98ad-2c8df9714e07-web-config\") pod \"alertmanager-main-0\" (UID: \"c5bf8133-b462-4194-98ad-2c8df9714e07\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:17.765752 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.765674 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c5bf8133-b462-4194-98ad-2c8df9714e07-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c5bf8133-b462-4194-98ad-2c8df9714e07\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:17.767285 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.767253 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5bf8133-b462-4194-98ad-2c8df9714e07-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c5bf8133-b462-4194-98ad-2c8df9714e07\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:17.767661 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.767634 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c5bf8133-b462-4194-98ad-2c8df9714e07-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c5bf8133-b462-4194-98ad-2c8df9714e07\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:17.768131 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.768066 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be3664f6-5510-4aef-8526-652175a879a9" path="/var/lib/kubelet/pods/be3664f6-5510-4aef-8526-652175a879a9/volumes"
Apr 16 20:16:17.768604 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.768579 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c5bf8133-b462-4194-98ad-2c8df9714e07-config-out\") pod \"alertmanager-main-0\" (UID: \"c5bf8133-b462-4194-98ad-2c8df9714e07\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:17.768777 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.768744 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c5bf8133-b462-4194-98ad-2c8df9714e07-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c5bf8133-b462-4194-98ad-2c8df9714e07\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:17.769457 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.769413 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c5bf8133-b462-4194-98ad-2c8df9714e07-web-config\") pod \"alertmanager-main-0\" (UID: \"c5bf8133-b462-4194-98ad-2c8df9714e07\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:17.769560 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.769533 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c5bf8133-b462-4194-98ad-2c8df9714e07-config-volume\") pod \"alertmanager-main-0\" (UID: \"c5bf8133-b462-4194-98ad-2c8df9714e07\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:17.769626 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.769560 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c5bf8133-b462-4194-98ad-2c8df9714e07-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c5bf8133-b462-4194-98ad-2c8df9714e07\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:17.770076 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.770056 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c5bf8133-b462-4194-98ad-2c8df9714e07-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c5bf8133-b462-4194-98ad-2c8df9714e07\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:17.770343 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.770318 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c5bf8133-b462-4194-98ad-2c8df9714e07-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c5bf8133-b462-4194-98ad-2c8df9714e07\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:17.770653 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.770630 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c5bf8133-b462-4194-98ad-2c8df9714e07-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c5bf8133-b462-4194-98ad-2c8df9714e07\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:17.771023 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.771001 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c5bf8133-b462-4194-98ad-2c8df9714e07-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c5bf8133-b462-4194-98ad-2c8df9714e07\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:17.773110 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.773084 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-brd98\" (UniqueName: \"kubernetes.io/projected/c5bf8133-b462-4194-98ad-2c8df9714e07-kube-api-access-brd98\") pod \"alertmanager-main-0\" (UID: \"c5bf8133-b462-4194-98ad-2c8df9714e07\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:17.924906 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:17.924829 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:18.050091 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:18.050065 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 20:16:18.053581 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:16:18.053552 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5bf8133_b462_4194_98ad_2c8df9714e07.slice/crio-3b67db79bfbc1e4933ac5e9b4b84d5d8ec3a0c6767d3b6f1c28dd86957e2fa48 WatchSource:0}: Error finding container 3b67db79bfbc1e4933ac5e9b4b84d5d8ec3a0c6767d3b6f1c28dd86957e2fa48: Status 404 returned error can't find the container with id 3b67db79bfbc1e4933ac5e9b4b84d5d8ec3a0c6767d3b6f1c28dd86957e2fa48
Apr 16 20:16:18.554536 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:18.554502 2572 generic.go:358] "Generic (PLEG): container finished" podID="c5bf8133-b462-4194-98ad-2c8df9714e07" containerID="4d8cdbbbcc1380e87078aa1e2daeeb16864f7f17d31f181e1041a29938e48275" exitCode=0
Apr 16 20:16:18.554870 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:18.554590 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c5bf8133-b462-4194-98ad-2c8df9714e07","Type":"ContainerDied","Data":"4d8cdbbbcc1380e87078aa1e2daeeb16864f7f17d31f181e1041a29938e48275"}
Apr 16 20:16:18.554870 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:18.554628 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c5bf8133-b462-4194-98ad-2c8df9714e07","Type":"ContainerStarted","Data":"3b67db79bfbc1e4933ac5e9b4b84d5d8ec3a0c6767d3b6f1c28dd86957e2fa48"}
Apr 16 20:16:19.560059 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:19.560022 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c5bf8133-b462-4194-98ad-2c8df9714e07","Type":"ContainerStarted","Data":"a9fee37c7fe36df25dc24984d78d3792cd34e096723a7fa8e8edff323398d0b2"}
Apr 16 20:16:19.560402 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:19.560065 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c5bf8133-b462-4194-98ad-2c8df9714e07","Type":"ContainerStarted","Data":"f1dc1afc429f86d5ac13999f5644ca9e7c47a12815aa72ae9fbe7a6922558c0e"}
Apr 16 20:16:19.560402 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:19.560080 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c5bf8133-b462-4194-98ad-2c8df9714e07","Type":"ContainerStarted","Data":"3b1cc8b22a46fc3756c2db656b0ee1dadcd9e236702ce7fddcd9e19d0c7c022c"}
Apr 16 20:16:19.560402 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:19.560105 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c5bf8133-b462-4194-98ad-2c8df9714e07","Type":"ContainerStarted","Data":"e963c44313a35d7b954eb4fb7c2bc1dd9c9dbd5ecfb461d4d2d4f04e69391e75"}
Apr 16 20:16:19.560402 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:19.560116 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c5bf8133-b462-4194-98ad-2c8df9714e07","Type":"ContainerStarted","Data":"e96687b55e2e5111f4b34ba7b3dc4490db2b7281da6eb558815e49232e84c4d4"}
Apr 16 20:16:19.560402 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:19.560128 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c5bf8133-b462-4194-98ad-2c8df9714e07","Type":"ContainerStarted","Data":"9ac115efe83b8c7ac545e7917fb51217d252d45331c6d10cc0f7cfa07e75bfe6"}
Apr 16 20:16:19.591883 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:19.591837 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.591820132 podStartE2EDuration="2.591820132s" podCreationTimestamp="2026-04-16 20:16:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:16:19.589496942 +0000 UTC m=+262.430397330" watchObservedRunningTime="2026-04-16 20:16:19.591820132 +0000 UTC m=+262.432720520"
Apr 16 20:16:19.769413 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:19.769365 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 20:16:19.769786 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:19.769763 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f49013a7-9f20-4940-a6e9-ae5840b19956" containerName="prometheus" containerID="cri-o://464f6392b9b3d1b3a549ce3e5a057d83bee4189c140f03c94d336fca2c82261b" gracePeriod=600
Apr 16 20:16:19.769857 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:19.769790 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f49013a7-9f20-4940-a6e9-ae5840b19956" containerName="kube-rbac-proxy" containerID="cri-o://4da116b212b0264f3d19242a6eb9620907c6891863f95a000836f5b99589ffa6" gracePeriod=600
Apr 16 20:16:19.769857 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:19.769815 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f49013a7-9f20-4940-a6e9-ae5840b19956" containerName="thanos-sidecar" containerID="cri-o://2cb7869d4b515e68bd160f5c25a92059397e7859b67f20a118e16164d755ae87" gracePeriod=600
Apr 16 20:16:19.769857 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:19.769828 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f49013a7-9f20-4940-a6e9-ae5840b19956" containerName="config-reloader" containerID="cri-o://f6111dd3c0d818c4510438ecf0cb68432001179e78d402f8228acee3916aec1c" gracePeriod=600
Apr 16 20:16:19.769857 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:19.769839 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f49013a7-9f20-4940-a6e9-ae5840b19956" containerName="kube-rbac-proxy-thanos" containerID="cri-o://0aff502b304621f13b8a0970a733c3746a1d49b7c9cf8632b27669e81f5029ca" gracePeriod=600
Apr 16 20:16:19.770076 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:19.769929 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f49013a7-9f20-4940-a6e9-ae5840b19956" containerName="kube-rbac-proxy-web" containerID="cri-o://56873d93686b56db0db9a2555c70f4636c5ba714fd97e27c4421a28ba7bb524b" gracePeriod=600
Apr 16 20:16:20.569886 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:20.569856 2572 generic.go:358] "Generic (PLEG): container finished" podID="f49013a7-9f20-4940-a6e9-ae5840b19956" containerID="0aff502b304621f13b8a0970a733c3746a1d49b7c9cf8632b27669e81f5029ca" exitCode=0
Apr 16 20:16:20.569886 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:20.569879 2572 generic.go:358] "Generic (PLEG): container finished" podID="f49013a7-9f20-4940-a6e9-ae5840b19956" containerID="4da116b212b0264f3d19242a6eb9620907c6891863f95a000836f5b99589ffa6" exitCode=0
Apr 16 20:16:20.569886 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:20.569887 2572 generic.go:358] "Generic (PLEG): container finished" podID="f49013a7-9f20-4940-a6e9-ae5840b19956" containerID="2cb7869d4b515e68bd160f5c25a92059397e7859b67f20a118e16164d755ae87" exitCode=0
Apr 16 20:16:20.569886 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:20.569893 2572 generic.go:358] "Generic (PLEG): container finished" podID="f49013a7-9f20-4940-a6e9-ae5840b19956" containerID="f6111dd3c0d818c4510438ecf0cb68432001179e78d402f8228acee3916aec1c" exitCode=0
Apr 16 20:16:20.570350 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:20.569899 2572 generic.go:358] "Generic (PLEG): container finished" podID="f49013a7-9f20-4940-a6e9-ae5840b19956" containerID="464f6392b9b3d1b3a549ce3e5a057d83bee4189c140f03c94d336fca2c82261b" exitCode=0
Apr 16 20:16:20.570350 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:20.569924 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f49013a7-9f20-4940-a6e9-ae5840b19956","Type":"ContainerDied","Data":"0aff502b304621f13b8a0970a733c3746a1d49b7c9cf8632b27669e81f5029ca"}
Apr 16 20:16:20.570350 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:20.569956 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f49013a7-9f20-4940-a6e9-ae5840b19956","Type":"ContainerDied","Data":"4da116b212b0264f3d19242a6eb9620907c6891863f95a000836f5b99589ffa6"}
Apr 16 20:16:20.570350 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:20.569965 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f49013a7-9f20-4940-a6e9-ae5840b19956","Type":"ContainerDied","Data":"2cb7869d4b515e68bd160f5c25a92059397e7859b67f20a118e16164d755ae87"}
Apr 16 20:16:20.570350 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:20.569974 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f49013a7-9f20-4940-a6e9-ae5840b19956","Type":"ContainerDied","Data":"f6111dd3c0d818c4510438ecf0cb68432001179e78d402f8228acee3916aec1c"}
Apr 16 20:16:20.570350 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:20.569981 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f49013a7-9f20-4940-a6e9-ae5840b19956","Type":"ContainerDied","Data":"464f6392b9b3d1b3a549ce3e5a057d83bee4189c140f03c94d336fca2c82261b"}
Apr 16 20:16:21.010345 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.010321 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:21.093686 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.093657 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"f49013a7-9f20-4940-a6e9-ae5840b19956\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") "
Apr 16 20:16:21.093846 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.093694 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-web-config\") pod \"f49013a7-9f20-4940-a6e9-ae5840b19956\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") "
Apr 16 20:16:21.093846 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.093718 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f49013a7-9f20-4940-a6e9-ae5840b19956-configmap-metrics-client-ca\") pod \"f49013a7-9f20-4940-a6e9-ae5840b19956\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") "
Apr 16 20:16:21.093846 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.093735 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f49013a7-9f20-4940-a6e9-ae5840b19956-tls-assets\") pod \"f49013a7-9f20-4940-a6e9-ae5840b19956\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") "
Apr 16 20:16:21.093846 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.093767 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-secret-grpc-tls\") pod \"f49013a7-9f20-4940-a6e9-ae5840b19956\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") "
Apr 16 20:16:21.093846 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.093800 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-secret-prometheus-k8s-tls\") pod \"f49013a7-9f20-4940-a6e9-ae5840b19956\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") "
Apr 16 20:16:21.093846 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.093826 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f49013a7-9f20-4940-a6e9-ae5840b19956-prometheus-k8s-db\") pod \"f49013a7-9f20-4940-a6e9-ae5840b19956\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") "
Apr 16 20:16:21.093846 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.093846 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-thanos-prometheus-http-client-file\") pod \"f49013a7-9f20-4940-a6e9-ae5840b19956\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") "
Apr 16 20:16:21.094211 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.093865 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-secret-metrics-client-certs\") pod \"f49013a7-9f20-4940-a6e9-ae5840b19956\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") "
Apr 16 20:16:21.094211 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.093883 2572 reconciler_common.go:162]
"operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f49013a7-9f20-4940-a6e9-ae5840b19956-configmap-kubelet-serving-ca-bundle\") pod \"f49013a7-9f20-4940-a6e9-ae5840b19956\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " Apr 16 20:16:21.094211 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.093903 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f49013a7-9f20-4940-a6e9-ae5840b19956-prometheus-trusted-ca-bundle\") pod \"f49013a7-9f20-4940-a6e9-ae5840b19956\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " Apr 16 20:16:21.094211 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.093924 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnt7h\" (UniqueName: \"kubernetes.io/projected/f49013a7-9f20-4940-a6e9-ae5840b19956-kube-api-access-vnt7h\") pod \"f49013a7-9f20-4940-a6e9-ae5840b19956\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " Apr 16 20:16:21.094211 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.093938 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f49013a7-9f20-4940-a6e9-ae5840b19956-config-out\") pod \"f49013a7-9f20-4940-a6e9-ae5840b19956\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " Apr 16 20:16:21.094211 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.093963 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f49013a7-9f20-4940-a6e9-ae5840b19956-configmap-serving-certs-ca-bundle\") pod \"f49013a7-9f20-4940-a6e9-ae5840b19956\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " Apr 16 20:16:21.094211 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.093997 2572 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"f49013a7-9f20-4940-a6e9-ae5840b19956\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " Apr 16 20:16:21.094211 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.094057 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-secret-kube-rbac-proxy\") pod \"f49013a7-9f20-4940-a6e9-ae5840b19956\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " Apr 16 20:16:21.094211 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.094087 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-config\") pod \"f49013a7-9f20-4940-a6e9-ae5840b19956\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " Apr 16 20:16:21.094211 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.094140 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f49013a7-9f20-4940-a6e9-ae5840b19956-prometheus-k8s-rulefiles-0\") pod \"f49013a7-9f20-4940-a6e9-ae5840b19956\" (UID: \"f49013a7-9f20-4940-a6e9-ae5840b19956\") " Apr 16 20:16:21.094211 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.094134 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f49013a7-9f20-4940-a6e9-ae5840b19956-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "f49013a7-9f20-4940-a6e9-ae5840b19956" (UID: "f49013a7-9f20-4940-a6e9-ae5840b19956"). InnerVolumeSpecName "configmap-metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:16:21.094777 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.094559 2572 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f49013a7-9f20-4940-a6e9-ae5840b19956-configmap-metrics-client-ca\") on node \"ip-10-0-137-142.ec2.internal\" DevicePath \"\"" Apr 16 20:16:21.096659 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.095807 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f49013a7-9f20-4940-a6e9-ae5840b19956-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "f49013a7-9f20-4940-a6e9-ae5840b19956" (UID: "f49013a7-9f20-4940-a6e9-ae5840b19956"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:16:21.096659 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.096190 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f49013a7-9f20-4940-a6e9-ae5840b19956-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "f49013a7-9f20-4940-a6e9-ae5840b19956" (UID: "f49013a7-9f20-4940-a6e9-ae5840b19956"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:16:21.096659 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.096554 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f49013a7-9f20-4940-a6e9-ae5840b19956-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "f49013a7-9f20-4940-a6e9-ae5840b19956" (UID: "f49013a7-9f20-4940-a6e9-ae5840b19956"). InnerVolumeSpecName "prometheus-k8s-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:16:21.096659 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.096623 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f49013a7-9f20-4940-a6e9-ae5840b19956-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "f49013a7-9f20-4940-a6e9-ae5840b19956" (UID: "f49013a7-9f20-4940-a6e9-ae5840b19956"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:16:21.096659 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.096640 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "f49013a7-9f20-4940-a6e9-ae5840b19956" (UID: "f49013a7-9f20-4940-a6e9-ae5840b19956"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:16:21.097037 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.097003 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f49013a7-9f20-4940-a6e9-ae5840b19956-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "f49013a7-9f20-4940-a6e9-ae5840b19956" (UID: "f49013a7-9f20-4940-a6e9-ae5840b19956"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:16:21.097428 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.097389 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "f49013a7-9f20-4940-a6e9-ae5840b19956" (UID: "f49013a7-9f20-4940-a6e9-ae5840b19956"). InnerVolumeSpecName "secret-grpc-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:16:21.097693 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.097660 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f49013a7-9f20-4940-a6e9-ae5840b19956-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "f49013a7-9f20-4940-a6e9-ae5840b19956" (UID: "f49013a7-9f20-4940-a6e9-ae5840b19956"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:16:21.098227 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.098201 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "f49013a7-9f20-4940-a6e9-ae5840b19956" (UID: "f49013a7-9f20-4940-a6e9-ae5840b19956"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:16:21.098872 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.098828 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "f49013a7-9f20-4940-a6e9-ae5840b19956" (UID: "f49013a7-9f20-4940-a6e9-ae5840b19956"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:16:21.098998 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.098923 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "f49013a7-9f20-4940-a6e9-ae5840b19956" (UID: "f49013a7-9f20-4940-a6e9-ae5840b19956"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:16:21.098998 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.098939 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f49013a7-9f20-4940-a6e9-ae5840b19956-config-out" (OuterVolumeSpecName: "config-out") pod "f49013a7-9f20-4940-a6e9-ae5840b19956" (UID: "f49013a7-9f20-4940-a6e9-ae5840b19956"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:16:21.098998 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.098939 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "f49013a7-9f20-4940-a6e9-ae5840b19956" (UID: "f49013a7-9f20-4940-a6e9-ae5840b19956"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:16:21.099310 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.099294 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "f49013a7-9f20-4940-a6e9-ae5840b19956" (UID: "f49013a7-9f20-4940-a6e9-ae5840b19956"). InnerVolumeSpecName "secret-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:16:21.099856 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.099834 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f49013a7-9f20-4940-a6e9-ae5840b19956-kube-api-access-vnt7h" (OuterVolumeSpecName: "kube-api-access-vnt7h") pod "f49013a7-9f20-4940-a6e9-ae5840b19956" (UID: "f49013a7-9f20-4940-a6e9-ae5840b19956"). InnerVolumeSpecName "kube-api-access-vnt7h". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:16:21.099856 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.099844 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-config" (OuterVolumeSpecName: "config") pod "f49013a7-9f20-4940-a6e9-ae5840b19956" (UID: "f49013a7-9f20-4940-a6e9-ae5840b19956"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:16:21.108709 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.108688 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-web-config" (OuterVolumeSpecName: "web-config") pod "f49013a7-9f20-4940-a6e9-ae5840b19956" (UID: "f49013a7-9f20-4940-a6e9-ae5840b19956"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:16:21.195831 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.195795 2572 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-secret-grpc-tls\") on node \"ip-10-0-137-142.ec2.internal\" DevicePath \"\"" Apr 16 20:16:21.195831 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.195827 2572 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-secret-prometheus-k8s-tls\") on node \"ip-10-0-137-142.ec2.internal\" DevicePath \"\"" Apr 16 20:16:21.195831 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.195838 2572 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f49013a7-9f20-4940-a6e9-ae5840b19956-prometheus-k8s-db\") on node \"ip-10-0-137-142.ec2.internal\" DevicePath \"\"" Apr 16 20:16:21.196033 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.195847 2572 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-thanos-prometheus-http-client-file\") on node \"ip-10-0-137-142.ec2.internal\" DevicePath \"\"" Apr 16 20:16:21.196033 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.195857 2572 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-secret-metrics-client-certs\") on node \"ip-10-0-137-142.ec2.internal\" DevicePath \"\"" Apr 16 20:16:21.196033 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.195865 2572 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f49013a7-9f20-4940-a6e9-ae5840b19956-configmap-kubelet-serving-ca-bundle\") on 
node \"ip-10-0-137-142.ec2.internal\" DevicePath \"\"" Apr 16 20:16:21.196033 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.195875 2572 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f49013a7-9f20-4940-a6e9-ae5840b19956-prometheus-trusted-ca-bundle\") on node \"ip-10-0-137-142.ec2.internal\" DevicePath \"\"" Apr 16 20:16:21.196033 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.195883 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vnt7h\" (UniqueName: \"kubernetes.io/projected/f49013a7-9f20-4940-a6e9-ae5840b19956-kube-api-access-vnt7h\") on node \"ip-10-0-137-142.ec2.internal\" DevicePath \"\"" Apr 16 20:16:21.196033 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.195892 2572 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f49013a7-9f20-4940-a6e9-ae5840b19956-config-out\") on node \"ip-10-0-137-142.ec2.internal\" DevicePath \"\"" Apr 16 20:16:21.196033 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.195900 2572 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f49013a7-9f20-4940-a6e9-ae5840b19956-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-137-142.ec2.internal\" DevicePath \"\"" Apr 16 20:16:21.196033 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.195908 2572 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-137-142.ec2.internal\" DevicePath \"\"" Apr 16 20:16:21.196033 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.195917 2572 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-secret-kube-rbac-proxy\") on node \"ip-10-0-137-142.ec2.internal\" DevicePath \"\"" Apr 16 20:16:21.196033 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.195925 2572 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-config\") on node \"ip-10-0-137-142.ec2.internal\" DevicePath \"\"" Apr 16 20:16:21.196033 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.195933 2572 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f49013a7-9f20-4940-a6e9-ae5840b19956-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-137-142.ec2.internal\" DevicePath \"\"" Apr 16 20:16:21.196033 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.195943 2572 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-137-142.ec2.internal\" DevicePath \"\"" Apr 16 20:16:21.196033 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.195951 2572 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f49013a7-9f20-4940-a6e9-ae5840b19956-web-config\") on node \"ip-10-0-137-142.ec2.internal\" DevicePath \"\"" Apr 16 20:16:21.196033 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.195959 2572 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f49013a7-9f20-4940-a6e9-ae5840b19956-tls-assets\") on node \"ip-10-0-137-142.ec2.internal\" DevicePath \"\"" Apr 16 20:16:21.575123 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.575072 2572 generic.go:358] "Generic (PLEG): container finished" podID="f49013a7-9f20-4940-a6e9-ae5840b19956" 
containerID="56873d93686b56db0db9a2555c70f4636c5ba714fd97e27c4421a28ba7bb524b" exitCode=0 Apr 16 20:16:21.575483 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.575139 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f49013a7-9f20-4940-a6e9-ae5840b19956","Type":"ContainerDied","Data":"56873d93686b56db0db9a2555c70f4636c5ba714fd97e27c4421a28ba7bb524b"} Apr 16 20:16:21.575483 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.575172 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f49013a7-9f20-4940-a6e9-ae5840b19956","Type":"ContainerDied","Data":"1c761ff3c1b5f7abfce0fcdd065db557514948037140ee37f1375602bd628c73"} Apr 16 20:16:21.575483 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.575195 2572 scope.go:117] "RemoveContainer" containerID="0aff502b304621f13b8a0970a733c3746a1d49b7c9cf8632b27669e81f5029ca" Apr 16 20:16:21.575483 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.575199 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.582326 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.582308 2572 scope.go:117] "RemoveContainer" containerID="4da116b212b0264f3d19242a6eb9620907c6891863f95a000836f5b99589ffa6" Apr 16 20:16:21.588960 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.588945 2572 scope.go:117] "RemoveContainer" containerID="56873d93686b56db0db9a2555c70f4636c5ba714fd97e27c4421a28ba7bb524b" Apr 16 20:16:21.595272 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.595241 2572 scope.go:117] "RemoveContainer" containerID="2cb7869d4b515e68bd160f5c25a92059397e7859b67f20a118e16164d755ae87" Apr 16 20:16:21.597498 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.597474 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 20:16:21.600908 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.600879 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 20:16:21.602262 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.602241 2572 scope.go:117] "RemoveContainer" containerID="f6111dd3c0d818c4510438ecf0cb68432001179e78d402f8228acee3916aec1c" Apr 16 20:16:21.608574 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.608559 2572 scope.go:117] "RemoveContainer" containerID="464f6392b9b3d1b3a549ce3e5a057d83bee4189c140f03c94d336fca2c82261b" Apr 16 20:16:21.615195 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.615180 2572 scope.go:117] "RemoveContainer" containerID="7878e02afc282d68d77ec78a1383347963727ac12e578faf717248a7099ecb4d" Apr 16 20:16:21.621112 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.621097 2572 scope.go:117] "RemoveContainer" containerID="0aff502b304621f13b8a0970a733c3746a1d49b7c9cf8632b27669e81f5029ca" Apr 16 20:16:21.621347 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:16:21.621330 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"0aff502b304621f13b8a0970a733c3746a1d49b7c9cf8632b27669e81f5029ca\": container with ID starting with 0aff502b304621f13b8a0970a733c3746a1d49b7c9cf8632b27669e81f5029ca not found: ID does not exist" containerID="0aff502b304621f13b8a0970a733c3746a1d49b7c9cf8632b27669e81f5029ca" Apr 16 20:16:21.621417 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.621356 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aff502b304621f13b8a0970a733c3746a1d49b7c9cf8632b27669e81f5029ca"} err="failed to get container status \"0aff502b304621f13b8a0970a733c3746a1d49b7c9cf8632b27669e81f5029ca\": rpc error: code = NotFound desc = could not find container \"0aff502b304621f13b8a0970a733c3746a1d49b7c9cf8632b27669e81f5029ca\": container with ID starting with 0aff502b304621f13b8a0970a733c3746a1d49b7c9cf8632b27669e81f5029ca not found: ID does not exist" Apr 16 20:16:21.621417 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.621393 2572 scope.go:117] "RemoveContainer" containerID="4da116b212b0264f3d19242a6eb9620907c6891863f95a000836f5b99589ffa6" Apr 16 20:16:21.621630 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:16:21.621615 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4da116b212b0264f3d19242a6eb9620907c6891863f95a000836f5b99589ffa6\": container with ID starting with 4da116b212b0264f3d19242a6eb9620907c6891863f95a000836f5b99589ffa6 not found: ID does not exist" containerID="4da116b212b0264f3d19242a6eb9620907c6891863f95a000836f5b99589ffa6" Apr 16 20:16:21.621674 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.621632 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4da116b212b0264f3d19242a6eb9620907c6891863f95a000836f5b99589ffa6"} err="failed to get container status \"4da116b212b0264f3d19242a6eb9620907c6891863f95a000836f5b99589ffa6\": rpc error: code = NotFound desc = could not 
find container \"4da116b212b0264f3d19242a6eb9620907c6891863f95a000836f5b99589ffa6\": container with ID starting with 4da116b212b0264f3d19242a6eb9620907c6891863f95a000836f5b99589ffa6 not found: ID does not exist" Apr 16 20:16:21.621674 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.621644 2572 scope.go:117] "RemoveContainer" containerID="56873d93686b56db0db9a2555c70f4636c5ba714fd97e27c4421a28ba7bb524b" Apr 16 20:16:21.621819 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:16:21.621802 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56873d93686b56db0db9a2555c70f4636c5ba714fd97e27c4421a28ba7bb524b\": container with ID starting with 56873d93686b56db0db9a2555c70f4636c5ba714fd97e27c4421a28ba7bb524b not found: ID does not exist" containerID="56873d93686b56db0db9a2555c70f4636c5ba714fd97e27c4421a28ba7bb524b" Apr 16 20:16:21.621862 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.621820 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56873d93686b56db0db9a2555c70f4636c5ba714fd97e27c4421a28ba7bb524b"} err="failed to get container status \"56873d93686b56db0db9a2555c70f4636c5ba714fd97e27c4421a28ba7bb524b\": rpc error: code = NotFound desc = could not find container \"56873d93686b56db0db9a2555c70f4636c5ba714fd97e27c4421a28ba7bb524b\": container with ID starting with 56873d93686b56db0db9a2555c70f4636c5ba714fd97e27c4421a28ba7bb524b not found: ID does not exist" Apr 16 20:16:21.621862 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.621833 2572 scope.go:117] "RemoveContainer" containerID="2cb7869d4b515e68bd160f5c25a92059397e7859b67f20a118e16164d755ae87" Apr 16 20:16:21.622049 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:16:21.622015 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cb7869d4b515e68bd160f5c25a92059397e7859b67f20a118e16164d755ae87\": container with ID 
starting with 2cb7869d4b515e68bd160f5c25a92059397e7859b67f20a118e16164d755ae87 not found: ID does not exist" containerID="2cb7869d4b515e68bd160f5c25a92059397e7859b67f20a118e16164d755ae87" Apr 16 20:16:21.622088 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.622055 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cb7869d4b515e68bd160f5c25a92059397e7859b67f20a118e16164d755ae87"} err="failed to get container status \"2cb7869d4b515e68bd160f5c25a92059397e7859b67f20a118e16164d755ae87\": rpc error: code = NotFound desc = could not find container \"2cb7869d4b515e68bd160f5c25a92059397e7859b67f20a118e16164d755ae87\": container with ID starting with 2cb7869d4b515e68bd160f5c25a92059397e7859b67f20a118e16164d755ae87 not found: ID does not exist" Apr 16 20:16:21.622088 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.622068 2572 scope.go:117] "RemoveContainer" containerID="f6111dd3c0d818c4510438ecf0cb68432001179e78d402f8228acee3916aec1c" Apr 16 20:16:21.622229 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:16:21.622216 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6111dd3c0d818c4510438ecf0cb68432001179e78d402f8228acee3916aec1c\": container with ID starting with f6111dd3c0d818c4510438ecf0cb68432001179e78d402f8228acee3916aec1c not found: ID does not exist" containerID="f6111dd3c0d818c4510438ecf0cb68432001179e78d402f8228acee3916aec1c" Apr 16 20:16:21.622267 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.622235 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6111dd3c0d818c4510438ecf0cb68432001179e78d402f8228acee3916aec1c"} err="failed to get container status \"f6111dd3c0d818c4510438ecf0cb68432001179e78d402f8228acee3916aec1c\": rpc error: code = NotFound desc = could not find container \"f6111dd3c0d818c4510438ecf0cb68432001179e78d402f8228acee3916aec1c\": container with ID starting with 
f6111dd3c0d818c4510438ecf0cb68432001179e78d402f8228acee3916aec1c not found: ID does not exist" Apr 16 20:16:21.622267 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.622249 2572 scope.go:117] "RemoveContainer" containerID="464f6392b9b3d1b3a549ce3e5a057d83bee4189c140f03c94d336fca2c82261b" Apr 16 20:16:21.622517 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:16:21.622499 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"464f6392b9b3d1b3a549ce3e5a057d83bee4189c140f03c94d336fca2c82261b\": container with ID starting with 464f6392b9b3d1b3a549ce3e5a057d83bee4189c140f03c94d336fca2c82261b not found: ID does not exist" containerID="464f6392b9b3d1b3a549ce3e5a057d83bee4189c140f03c94d336fca2c82261b" Apr 16 20:16:21.622595 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.622521 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"464f6392b9b3d1b3a549ce3e5a057d83bee4189c140f03c94d336fca2c82261b"} err="failed to get container status \"464f6392b9b3d1b3a549ce3e5a057d83bee4189c140f03c94d336fca2c82261b\": rpc error: code = NotFound desc = could not find container \"464f6392b9b3d1b3a549ce3e5a057d83bee4189c140f03c94d336fca2c82261b\": container with ID starting with 464f6392b9b3d1b3a549ce3e5a057d83bee4189c140f03c94d336fca2c82261b not found: ID does not exist" Apr 16 20:16:21.622595 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.622537 2572 scope.go:117] "RemoveContainer" containerID="7878e02afc282d68d77ec78a1383347963727ac12e578faf717248a7099ecb4d" Apr 16 20:16:21.622721 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:16:21.622706 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7878e02afc282d68d77ec78a1383347963727ac12e578faf717248a7099ecb4d\": container with ID starting with 7878e02afc282d68d77ec78a1383347963727ac12e578faf717248a7099ecb4d not found: ID does not exist" 
containerID="7878e02afc282d68d77ec78a1383347963727ac12e578faf717248a7099ecb4d" Apr 16 20:16:21.622758 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.622723 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7878e02afc282d68d77ec78a1383347963727ac12e578faf717248a7099ecb4d"} err="failed to get container status \"7878e02afc282d68d77ec78a1383347963727ac12e578faf717248a7099ecb4d\": rpc error: code = NotFound desc = could not find container \"7878e02afc282d68d77ec78a1383347963727ac12e578faf717248a7099ecb4d\": container with ID starting with 7878e02afc282d68d77ec78a1383347963727ac12e578faf717248a7099ecb4d not found: ID does not exist" Apr 16 20:16:21.629767 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.629747 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 20:16:21.630134 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.630120 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f49013a7-9f20-4940-a6e9-ae5840b19956" containerName="kube-rbac-proxy" Apr 16 20:16:21.630186 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.630138 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f49013a7-9f20-4940-a6e9-ae5840b19956" containerName="kube-rbac-proxy" Apr 16 20:16:21.630186 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.630159 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f49013a7-9f20-4940-a6e9-ae5840b19956" containerName="config-reloader" Apr 16 20:16:21.630186 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.630168 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f49013a7-9f20-4940-a6e9-ae5840b19956" containerName="config-reloader" Apr 16 20:16:21.630186 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.630182 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f49013a7-9f20-4940-a6e9-ae5840b19956" 
containerName="kube-rbac-proxy-thanos" Apr 16 20:16:21.630310 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.630191 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f49013a7-9f20-4940-a6e9-ae5840b19956" containerName="kube-rbac-proxy-thanos" Apr 16 20:16:21.630310 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.630203 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f49013a7-9f20-4940-a6e9-ae5840b19956" containerName="init-config-reloader" Apr 16 20:16:21.630310 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.630212 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f49013a7-9f20-4940-a6e9-ae5840b19956" containerName="init-config-reloader" Apr 16 20:16:21.630310 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.630221 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f49013a7-9f20-4940-a6e9-ae5840b19956" containerName="kube-rbac-proxy-web" Apr 16 20:16:21.630310 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.630229 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f49013a7-9f20-4940-a6e9-ae5840b19956" containerName="kube-rbac-proxy-web" Apr 16 20:16:21.630310 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.630246 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f49013a7-9f20-4940-a6e9-ae5840b19956" containerName="prometheus" Apr 16 20:16:21.630310 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.630253 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f49013a7-9f20-4940-a6e9-ae5840b19956" containerName="prometheus" Apr 16 20:16:21.630310 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.630267 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f49013a7-9f20-4940-a6e9-ae5840b19956" containerName="thanos-sidecar" Apr 16 20:16:21.630310 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.630274 2572 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f49013a7-9f20-4940-a6e9-ae5840b19956" containerName="thanos-sidecar" Apr 16 20:16:21.630678 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.630322 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="f49013a7-9f20-4940-a6e9-ae5840b19956" containerName="prometheus" Apr 16 20:16:21.630678 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.630333 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="f49013a7-9f20-4940-a6e9-ae5840b19956" containerName="kube-rbac-proxy-web" Apr 16 20:16:21.630678 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.630344 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="f49013a7-9f20-4940-a6e9-ae5840b19956" containerName="kube-rbac-proxy" Apr 16 20:16:21.630678 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.630354 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="f49013a7-9f20-4940-a6e9-ae5840b19956" containerName="kube-rbac-proxy-thanos" Apr 16 20:16:21.630678 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.630364 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="f49013a7-9f20-4940-a6e9-ae5840b19956" containerName="config-reloader" Apr 16 20:16:21.630678 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.630391 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="f49013a7-9f20-4940-a6e9-ae5840b19956" containerName="thanos-sidecar" Apr 16 20:16:21.633979 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.633964 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.636551 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.636526 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 20:16:21.636647 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.636558 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 20:16:21.636647 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.636526 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 20:16:21.636647 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.636612 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 20:16:21.636835 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.636820 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 20:16:21.636992 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.636977 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 20:16:21.637057 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.636986 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 20:16:21.637057 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.637035 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 20:16:21.637146 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.637057 2572 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-bregfb1o3ebal\"" Apr 16 20:16:21.637146 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.637068 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 20:16:21.637240 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.637200 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-z5d9c\"" Apr 16 20:16:21.637563 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.637545 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 20:16:21.637806 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.637790 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 20:16:21.640639 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.640619 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 20:16:21.644959 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.644939 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 20:16:21.651573 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.651555 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 20:16:21.699712 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.699688 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28a79339-d1e7-41e6-9650-1c9f4e63fb75-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.699849 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.699720 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/28a79339-d1e7-41e6-9650-1c9f4e63fb75-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.699849 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.699739 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28a79339-d1e7-41e6-9650-1c9f4e63fb75-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.699849 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.699766 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/28a79339-d1e7-41e6-9650-1c9f4e63fb75-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.699849 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.699789 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/28a79339-d1e7-41e6-9650-1c9f4e63fb75-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.699849 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.699803 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28a79339-d1e7-41e6-9650-1c9f4e63fb75-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.699849 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.699819 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/28a79339-d1e7-41e6-9650-1c9f4e63fb75-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.699849 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.699834 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/28a79339-d1e7-41e6-9650-1c9f4e63fb75-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.700081 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.699888 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/28a79339-d1e7-41e6-9650-1c9f4e63fb75-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.700081 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.699940 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/28a79339-d1e7-41e6-9650-1c9f4e63fb75-config\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.700081 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.699963 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/28a79339-d1e7-41e6-9650-1c9f4e63fb75-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.700081 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.699983 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/28a79339-d1e7-41e6-9650-1c9f4e63fb75-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.700081 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.700026 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/28a79339-d1e7-41e6-9650-1c9f4e63fb75-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.700081 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.700047 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/28a79339-d1e7-41e6-9650-1c9f4e63fb75-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.700081 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.700062 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/28a79339-d1e7-41e6-9650-1c9f4e63fb75-config-out\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.700081 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.700079 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/28a79339-d1e7-41e6-9650-1c9f4e63fb75-web-config\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.700314 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.700093 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppfhg\" (UniqueName: \"kubernetes.io/projected/28a79339-d1e7-41e6-9650-1c9f4e63fb75-kube-api-access-ppfhg\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.700314 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.700122 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/28a79339-d1e7-41e6-9650-1c9f4e63fb75-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.762627 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.762597 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f49013a7-9f20-4940-a6e9-ae5840b19956" path="/var/lib/kubelet/pods/f49013a7-9f20-4940-a6e9-ae5840b19956/volumes" Apr 16 20:16:21.800631 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.800606 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/28a79339-d1e7-41e6-9650-1c9f4e63fb75-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.800722 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.800639 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/28a79339-d1e7-41e6-9650-1c9f4e63fb75-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.800722 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.800664 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/28a79339-d1e7-41e6-9650-1c9f4e63fb75-config-out\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.800828 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.800782 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/28a79339-d1e7-41e6-9650-1c9f4e63fb75-web-config\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.800828 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.800817 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ppfhg\" (UniqueName: \"kubernetes.io/projected/28a79339-d1e7-41e6-9650-1c9f4e63fb75-kube-api-access-ppfhg\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.800918 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.800856 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/28a79339-d1e7-41e6-9650-1c9f4e63fb75-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.800918 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.800885 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28a79339-d1e7-41e6-9650-1c9f4e63fb75-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.801009 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.800916 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/28a79339-d1e7-41e6-9650-1c9f4e63fb75-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.801634 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.801613 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28a79339-d1e7-41e6-9650-1c9f4e63fb75-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.801774 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.801750 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/28a79339-d1e7-41e6-9650-1c9f4e63fb75-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.801933 
ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.801779 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/28a79339-d1e7-41e6-9650-1c9f4e63fb75-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.802014 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.801933 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28a79339-d1e7-41e6-9650-1c9f4e63fb75-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.802088 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.802026 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/28a79339-d1e7-41e6-9650-1c9f4e63fb75-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.802088 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.802075 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28a79339-d1e7-41e6-9650-1c9f4e63fb75-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.802180 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.802106 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/28a79339-d1e7-41e6-9650-1c9f4e63fb75-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" 
(UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.802180 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.802135 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/28a79339-d1e7-41e6-9650-1c9f4e63fb75-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.802282 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.802180 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/28a79339-d1e7-41e6-9650-1c9f4e63fb75-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.802282 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.802214 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/28a79339-d1e7-41e6-9650-1c9f4e63fb75-config\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.802282 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.802242 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/28a79339-d1e7-41e6-9650-1c9f4e63fb75-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.802492 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.802280 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/28a79339-d1e7-41e6-9650-1c9f4e63fb75-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" 
(UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.802665 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.802557 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28a79339-d1e7-41e6-9650-1c9f4e63fb75-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.804065 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.803739 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/28a79339-d1e7-41e6-9650-1c9f4e63fb75-config-out\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.804065 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.803801 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/28a79339-d1e7-41e6-9650-1c9f4e63fb75-web-config\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.804209 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.804075 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/28a79339-d1e7-41e6-9650-1c9f4e63fb75-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.804535 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.804511 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/28a79339-d1e7-41e6-9650-1c9f4e63fb75-secret-prometheus-k8s-tls\") 
pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.804621 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.804511 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/28a79339-d1e7-41e6-9650-1c9f4e63fb75-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.805045 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.804806 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28a79339-d1e7-41e6-9650-1c9f4e63fb75-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.805045 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.804989 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/28a79339-d1e7-41e6-9650-1c9f4e63fb75-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.805171 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.805066 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/28a79339-d1e7-41e6-9650-1c9f4e63fb75-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.805352 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.805330 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/28a79339-d1e7-41e6-9650-1c9f4e63fb75-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.805758 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.805737 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/28a79339-d1e7-41e6-9650-1c9f4e63fb75-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.806590 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.806562 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/28a79339-d1e7-41e6-9650-1c9f4e63fb75-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.806753 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.806735 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/28a79339-d1e7-41e6-9650-1c9f4e63fb75-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.806935 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.806915 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/28a79339-d1e7-41e6-9650-1c9f4e63fb75-config\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.807060 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.807043 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/28a79339-d1e7-41e6-9650-1c9f4e63fb75-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.808865 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.808849 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppfhg\" (UniqueName: \"kubernetes.io/projected/28a79339-d1e7-41e6-9650-1c9f4e63fb75-kube-api-access-ppfhg\") pod \"prometheus-k8s-0\" (UID: \"28a79339-d1e7-41e6-9650-1c9f4e63fb75\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:21.945231 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:21.945205 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:22.071256 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:22.071225 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 20:16:22.074095 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:16:22.074064 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28a79339_d1e7_41e6_9650_1c9f4e63fb75.slice/crio-439f955afdb8584747796c7119fac39a7e91e46f2f5e9887f6d063e9424d1b79 WatchSource:0}: Error finding container 439f955afdb8584747796c7119fac39a7e91e46f2f5e9887f6d063e9424d1b79: Status 404 returned error can't find the container with id 439f955afdb8584747796c7119fac39a7e91e46f2f5e9887f6d063e9424d1b79 Apr 16 20:16:22.584393 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:22.584341 2572 generic.go:358] "Generic (PLEG): container finished" podID="28a79339-d1e7-41e6-9650-1c9f4e63fb75" containerID="1aaf41a11361589f49ee3695428a22f02b4396936a9afbf7dba11ea7b281407f" exitCode=0 Apr 16 20:16:22.584785 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:22.584430 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28a79339-d1e7-41e6-9650-1c9f4e63fb75","Type":"ContainerDied","Data":"1aaf41a11361589f49ee3695428a22f02b4396936a9afbf7dba11ea7b281407f"} Apr 16 20:16:22.584785 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:22.584461 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28a79339-d1e7-41e6-9650-1c9f4e63fb75","Type":"ContainerStarted","Data":"439f955afdb8584747796c7119fac39a7e91e46f2f5e9887f6d063e9424d1b79"} Apr 16 20:16:23.591203 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:23.591167 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28a79339-d1e7-41e6-9650-1c9f4e63fb75","Type":"ContainerStarted","Data":"1ec01f912c8a904c82eb803c8cb40fe28e7afadcf865faa6ca5d63450cafd94d"} Apr 16 20:16:23.591203 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:23.591201 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28a79339-d1e7-41e6-9650-1c9f4e63fb75","Type":"ContainerStarted","Data":"905cee22a1567df2d5bb3bba4082f1b255f45cb1341cfd586afd0b0fc9201d7a"} Apr 16 20:16:23.591203 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:23.591211 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28a79339-d1e7-41e6-9650-1c9f4e63fb75","Type":"ContainerStarted","Data":"a27db4f7ce9ea9e29a36fb886b2515e69a171ce7b61322f372f79bd48d458185"} Apr 16 20:16:23.591651 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:23.591220 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28a79339-d1e7-41e6-9650-1c9f4e63fb75","Type":"ContainerStarted","Data":"17ba6087218507a3dc4f302eb12c445dd6a3ba6632dbf6e7d08486ea8b84015f"} Apr 16 20:16:23.591651 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:23.591228 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28a79339-d1e7-41e6-9650-1c9f4e63fb75","Type":"ContainerStarted","Data":"720808061dec6df983114d3085669184668c4d97e4a5773be01af5f5a9291aae"} Apr 16 20:16:23.591651 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:23.591237 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28a79339-d1e7-41e6-9650-1c9f4e63fb75","Type":"ContainerStarted","Data":"86d3c647a6be5256444b61ef423373657a8c38a1e24191896c7f0e3cf8e827a1"} Apr 16 20:16:23.617078 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:23.617034 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.617018356 podStartE2EDuration="2.617018356s" podCreationTimestamp="2026-04-16 20:16:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:16:23.615820365 +0000 UTC m=+266.456720755" watchObservedRunningTime="2026-04-16 20:16:23.617018356 +0000 UTC m=+266.457918746" Apr 16 20:16:26.945538 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:26.945485 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:46.256157 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:46.256120 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-66fvr"] Apr 16 20:16:46.260042 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:46.260010 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-66fvr" Apr 16 20:16:46.263704 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:46.263678 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 20:16:46.268760 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:46.268734 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-66fvr"] Apr 16 20:16:46.406226 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:46.406197 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/fd643302-fe55-4675-ba96-c6a539df4ac8-dbus\") pod \"global-pull-secret-syncer-66fvr\" (UID: \"fd643302-fe55-4675-ba96-c6a539df4ac8\") " pod="kube-system/global-pull-secret-syncer-66fvr" Apr 16 20:16:46.406414 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:46.406286 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/fd643302-fe55-4675-ba96-c6a539df4ac8-kubelet-config\") pod \"global-pull-secret-syncer-66fvr\" (UID: \"fd643302-fe55-4675-ba96-c6a539df4ac8\") " pod="kube-system/global-pull-secret-syncer-66fvr" Apr 16 20:16:46.406414 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:46.406309 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fd643302-fe55-4675-ba96-c6a539df4ac8-original-pull-secret\") pod \"global-pull-secret-syncer-66fvr\" (UID: \"fd643302-fe55-4675-ba96-c6a539df4ac8\") " pod="kube-system/global-pull-secret-syncer-66fvr" Apr 16 20:16:46.507268 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:46.507184 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/fd643302-fe55-4675-ba96-c6a539df4ac8-kubelet-config\") pod \"global-pull-secret-syncer-66fvr\" (UID: \"fd643302-fe55-4675-ba96-c6a539df4ac8\") " pod="kube-system/global-pull-secret-syncer-66fvr" Apr 16 20:16:46.507268 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:46.507219 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fd643302-fe55-4675-ba96-c6a539df4ac8-original-pull-secret\") pod \"global-pull-secret-syncer-66fvr\" (UID: \"fd643302-fe55-4675-ba96-c6a539df4ac8\") " pod="kube-system/global-pull-secret-syncer-66fvr" Apr 16 20:16:46.507462 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:46.507268 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/fd643302-fe55-4675-ba96-c6a539df4ac8-dbus\") pod \"global-pull-secret-syncer-66fvr\" (UID: \"fd643302-fe55-4675-ba96-c6a539df4ac8\") " pod="kube-system/global-pull-secret-syncer-66fvr" Apr 16 20:16:46.507462 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:46.507314 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/fd643302-fe55-4675-ba96-c6a539df4ac8-kubelet-config\") pod \"global-pull-secret-syncer-66fvr\" (UID: \"fd643302-fe55-4675-ba96-c6a539df4ac8\") " pod="kube-system/global-pull-secret-syncer-66fvr" Apr 16 20:16:46.507462 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:46.507429 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/fd643302-fe55-4675-ba96-c6a539df4ac8-dbus\") pod \"global-pull-secret-syncer-66fvr\" (UID: \"fd643302-fe55-4675-ba96-c6a539df4ac8\") " pod="kube-system/global-pull-secret-syncer-66fvr" Apr 16 20:16:46.509660 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:46.509641 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fd643302-fe55-4675-ba96-c6a539df4ac8-original-pull-secret\") pod \"global-pull-secret-syncer-66fvr\" (UID: \"fd643302-fe55-4675-ba96-c6a539df4ac8\") " pod="kube-system/global-pull-secret-syncer-66fvr" Apr 16 20:16:46.570835 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:46.570805 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-66fvr" Apr 16 20:16:46.688559 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:46.688525 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-66fvr"] Apr 16 20:16:46.691355 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:16:46.691326 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd643302_fe55_4675_ba96_c6a539df4ac8.slice/crio-dc648977c2f1a2342ec407dfa515868c836bdf13880039f23f628a673d446831 WatchSource:0}: Error finding container dc648977c2f1a2342ec407dfa515868c836bdf13880039f23f628a673d446831: Status 404 returned error can't find the container with id dc648977c2f1a2342ec407dfa515868c836bdf13880039f23f628a673d446831 Apr 16 20:16:47.667091 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:47.665331 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-66fvr" event={"ID":"fd643302-fe55-4675-ba96-c6a539df4ac8","Type":"ContainerStarted","Data":"dc648977c2f1a2342ec407dfa515868c836bdf13880039f23f628a673d446831"} Apr 16 20:16:50.681203 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:50.681110 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-66fvr" event={"ID":"fd643302-fe55-4675-ba96-c6a539df4ac8","Type":"ContainerStarted","Data":"e5da80b9fab101058ab9e682b70862fb9a3d7fe738807f8c4911c702f9bb6106"} Apr 16 20:16:50.697509 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:50.697468 2572 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-66fvr" podStartSLOduration=1.143795197 podStartE2EDuration="4.697455751s" podCreationTimestamp="2026-04-16 20:16:46 +0000 UTC" firstStartedPulling="2026-04-16 20:16:46.69336728 +0000 UTC m=+289.534267649" lastFinishedPulling="2026-04-16 20:16:50.247027829 +0000 UTC m=+293.087928203" observedRunningTime="2026-04-16 20:16:50.696480209 +0000 UTC m=+293.537380602" watchObservedRunningTime="2026-04-16 20:16:50.697455751 +0000 UTC m=+293.538356141" Apr 16 20:16:57.644606 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:57.644576 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4pl4_80a9a559-faef-43f2-ae15-5d0a784691b5/console-operator/2.log" Apr 16 20:16:57.644971 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:57.644900 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4pl4_80a9a559-faef-43f2-ae15-5d0a784691b5/console-operator/2.log" Apr 16 20:16:57.653242 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:16:57.653222 2572 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 20:17:21.945346 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:17:21.945303 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:17:21.961184 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:17:21.961154 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:17:22.785712 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:17:22.785686 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:19:54.672920 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:19:54.672892 2572 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve/odh-model-controller-696fc77849-pwdsz"] Apr 16 20:19:54.676098 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:19:54.676070 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-pwdsz" Apr 16 20:19:54.678437 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:19:54.678412 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 20:19:54.678582 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:19:54.678438 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 20:19:54.678582 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:19:54.678569 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 16 20:19:54.679286 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:19:54.679267 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-9jx84\"" Apr 16 20:19:54.685941 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:19:54.685919 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-pwdsz"] Apr 16 20:19:54.799678 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:19:54.799644 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfg8z\" (UniqueName: \"kubernetes.io/projected/684483e5-70e3-4c78-96b1-9f89739632c1-kube-api-access-hfg8z\") pod \"odh-model-controller-696fc77849-pwdsz\" (UID: \"684483e5-70e3-4c78-96b1-9f89739632c1\") " pod="kserve/odh-model-controller-696fc77849-pwdsz" Apr 16 20:19:54.799678 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:19:54.799693 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/684483e5-70e3-4c78-96b1-9f89739632c1-cert\") pod \"odh-model-controller-696fc77849-pwdsz\" (UID: \"684483e5-70e3-4c78-96b1-9f89739632c1\") " pod="kserve/odh-model-controller-696fc77849-pwdsz" Apr 16 20:19:54.900482 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:19:54.900447 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hfg8z\" (UniqueName: \"kubernetes.io/projected/684483e5-70e3-4c78-96b1-9f89739632c1-kube-api-access-hfg8z\") pod \"odh-model-controller-696fc77849-pwdsz\" (UID: \"684483e5-70e3-4c78-96b1-9f89739632c1\") " pod="kserve/odh-model-controller-696fc77849-pwdsz" Apr 16 20:19:54.900640 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:19:54.900497 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/684483e5-70e3-4c78-96b1-9f89739632c1-cert\") pod \"odh-model-controller-696fc77849-pwdsz\" (UID: \"684483e5-70e3-4c78-96b1-9f89739632c1\") " pod="kserve/odh-model-controller-696fc77849-pwdsz" Apr 16 20:19:54.902947 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:19:54.902923 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/684483e5-70e3-4c78-96b1-9f89739632c1-cert\") pod \"odh-model-controller-696fc77849-pwdsz\" (UID: \"684483e5-70e3-4c78-96b1-9f89739632c1\") " pod="kserve/odh-model-controller-696fc77849-pwdsz" Apr 16 20:19:54.911062 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:19:54.911041 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfg8z\" (UniqueName: \"kubernetes.io/projected/684483e5-70e3-4c78-96b1-9f89739632c1-kube-api-access-hfg8z\") pod \"odh-model-controller-696fc77849-pwdsz\" (UID: \"684483e5-70e3-4c78-96b1-9f89739632c1\") " pod="kserve/odh-model-controller-696fc77849-pwdsz" Apr 16 20:19:54.987883 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:19:54.987859 2572 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-pwdsz" Apr 16 20:19:55.108106 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:19:55.108080 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-pwdsz"] Apr 16 20:19:55.110693 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:19:55.110664 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod684483e5_70e3_4c78_96b1_9f89739632c1.slice/crio-44548bd1a5a63574d5114095f3ef1e9aa0f16e86ac98808afbede616ef2d0c8c WatchSource:0}: Error finding container 44548bd1a5a63574d5114095f3ef1e9aa0f16e86ac98808afbede616ef2d0c8c: Status 404 returned error can't find the container with id 44548bd1a5a63574d5114095f3ef1e9aa0f16e86ac98808afbede616ef2d0c8c Apr 16 20:19:55.111784 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:19:55.111767 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:19:55.171010 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:19:55.170977 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-pwdsz" event={"ID":"684483e5-70e3-4c78-96b1-9f89739632c1","Type":"ContainerStarted","Data":"44548bd1a5a63574d5114095f3ef1e9aa0f16e86ac98808afbede616ef2d0c8c"} Apr 16 20:19:58.181610 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:19:58.181567 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-pwdsz" event={"ID":"684483e5-70e3-4c78-96b1-9f89739632c1","Type":"ContainerStarted","Data":"54cb2382c6e32014a08690c3a0ff5827e78dce37115a6d04d141cbef47969e55"} Apr 16 20:19:58.182055 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:19:58.181741 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-pwdsz" Apr 16 20:19:58.199990 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:19:58.199942 2572 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-pwdsz" podStartSLOduration=1.924227989 podStartE2EDuration="4.199927992s" podCreationTimestamp="2026-04-16 20:19:54 +0000 UTC" firstStartedPulling="2026-04-16 20:19:55.111908058 +0000 UTC m=+477.952808426" lastFinishedPulling="2026-04-16 20:19:57.387608061 +0000 UTC m=+480.228508429" observedRunningTime="2026-04-16 20:19:58.198428383 +0000 UTC m=+481.039328774" watchObservedRunningTime="2026-04-16 20:19:58.199927992 +0000 UTC m=+481.040828381" Apr 16 20:20:09.187270 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:09.187238 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-pwdsz" Apr 16 20:20:10.011085 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:10.011052 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-b5bbm"] Apr 16 20:20:10.014424 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:10.014406 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-b5bbm" Apr 16 20:20:10.016806 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:10.016785 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 20:20:10.016930 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:10.016880 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-wl6nb\"" Apr 16 20:20:10.019208 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:10.019183 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl759\" (UniqueName: \"kubernetes.io/projected/8b942c2d-55d1-4457-bbe9-1da776c464d5-kube-api-access-rl759\") pod \"s3-init-b5bbm\" (UID: \"8b942c2d-55d1-4457-bbe9-1da776c464d5\") " pod="kserve/s3-init-b5bbm" Apr 16 20:20:10.020437 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:10.020412 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-b5bbm"] Apr 16 20:20:10.120305 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:10.120267 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rl759\" (UniqueName: \"kubernetes.io/projected/8b942c2d-55d1-4457-bbe9-1da776c464d5-kube-api-access-rl759\") pod \"s3-init-b5bbm\" (UID: \"8b942c2d-55d1-4457-bbe9-1da776c464d5\") " pod="kserve/s3-init-b5bbm" Apr 16 20:20:10.128882 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:10.128837 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rl759\" (UniqueName: \"kubernetes.io/projected/8b942c2d-55d1-4457-bbe9-1da776c464d5-kube-api-access-rl759\") pod \"s3-init-b5bbm\" (UID: \"8b942c2d-55d1-4457-bbe9-1da776c464d5\") " pod="kserve/s3-init-b5bbm" Apr 16 20:20:10.338562 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:10.338476 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-b5bbm" Apr 16 20:20:10.463157 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:10.463134 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-b5bbm"] Apr 16 20:20:10.465423 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:20:10.465393 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b942c2d_55d1_4457_bbe9_1da776c464d5.slice/crio-e46e6c6cbc7b3861afa225ba442b82b40220ac30910a10298d283a4c297c6d75 WatchSource:0}: Error finding container e46e6c6cbc7b3861afa225ba442b82b40220ac30910a10298d283a4c297c6d75: Status 404 returned error can't find the container with id e46e6c6cbc7b3861afa225ba442b82b40220ac30910a10298d283a4c297c6d75 Apr 16 20:20:11.225405 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:11.225349 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-b5bbm" event={"ID":"8b942c2d-55d1-4457-bbe9-1da776c464d5","Type":"ContainerStarted","Data":"e46e6c6cbc7b3861afa225ba442b82b40220ac30910a10298d283a4c297c6d75"} Apr 16 20:20:15.241813 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:15.241772 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-b5bbm" event={"ID":"8b942c2d-55d1-4457-bbe9-1da776c464d5","Type":"ContainerStarted","Data":"308c29ea7242e4e5bc4a4b58a3a8952b501e7a9d39979d032bf984d05e2bf765"} Apr 16 20:20:15.259566 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:15.259515 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-b5bbm" podStartSLOduration=1.795663349 podStartE2EDuration="6.259502306s" podCreationTimestamp="2026-04-16 20:20:09 +0000 UTC" firstStartedPulling="2026-04-16 20:20:10.467188911 +0000 UTC m=+493.308089286" lastFinishedPulling="2026-04-16 20:20:14.931027861 +0000 UTC m=+497.771928243" observedRunningTime="2026-04-16 20:20:15.258478536 +0000 UTC m=+498.099378926" watchObservedRunningTime="2026-04-16 
20:20:15.259502306 +0000 UTC m=+498.100402695" Apr 16 20:20:18.252777 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:18.252687 2572 generic.go:358] "Generic (PLEG): container finished" podID="8b942c2d-55d1-4457-bbe9-1da776c464d5" containerID="308c29ea7242e4e5bc4a4b58a3a8952b501e7a9d39979d032bf984d05e2bf765" exitCode=0 Apr 16 20:20:18.252777 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:18.252738 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-b5bbm" event={"ID":"8b942c2d-55d1-4457-bbe9-1da776c464d5","Type":"ContainerDied","Data":"308c29ea7242e4e5bc4a4b58a3a8952b501e7a9d39979d032bf984d05e2bf765"} Apr 16 20:20:19.376416 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:19.376369 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-b5bbm" Apr 16 20:20:19.389744 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:19.389721 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rl759\" (UniqueName: \"kubernetes.io/projected/8b942c2d-55d1-4457-bbe9-1da776c464d5-kube-api-access-rl759\") pod \"8b942c2d-55d1-4457-bbe9-1da776c464d5\" (UID: \"8b942c2d-55d1-4457-bbe9-1da776c464d5\") " Apr 16 20:20:19.392028 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:19.391994 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b942c2d-55d1-4457-bbe9-1da776c464d5-kube-api-access-rl759" (OuterVolumeSpecName: "kube-api-access-rl759") pod "8b942c2d-55d1-4457-bbe9-1da776c464d5" (UID: "8b942c2d-55d1-4457-bbe9-1da776c464d5"). InnerVolumeSpecName "kube-api-access-rl759". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:20:19.490502 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:19.490450 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rl759\" (UniqueName: \"kubernetes.io/projected/8b942c2d-55d1-4457-bbe9-1da776c464d5-kube-api-access-rl759\") on node \"ip-10-0-137-142.ec2.internal\" DevicePath \"\"" Apr 16 20:20:20.259167 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:20.259139 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-b5bbm" Apr 16 20:20:20.259167 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:20.259147 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-b5bbm" event={"ID":"8b942c2d-55d1-4457-bbe9-1da776c464d5","Type":"ContainerDied","Data":"e46e6c6cbc7b3861afa225ba442b82b40220ac30910a10298d283a4c297c6d75"} Apr 16 20:20:20.259167 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:20.259175 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e46e6c6cbc7b3861afa225ba442b82b40220ac30910a10298d283a4c297c6d75" Apr 16 20:20:21.011557 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:21.011520 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-cp5gl"] Apr 16 20:20:21.011883 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:21.011815 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8b942c2d-55d1-4457-bbe9-1da776c464d5" containerName="s3-init" Apr 16 20:20:21.011883 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:21.011825 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b942c2d-55d1-4457-bbe9-1da776c464d5" containerName="s3-init" Apr 16 20:20:21.011883 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:21.011869 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="8b942c2d-55d1-4457-bbe9-1da776c464d5" containerName="s3-init" Apr 16 20:20:21.014606 ip-10-0-137-142 
kubenswrapper[2572]: I0416 20:20:21.014588 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-cp5gl" Apr 16 20:20:21.017233 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:21.017219 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-wl6nb\"" Apr 16 20:20:21.017299 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:21.017220 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 16 20:20:21.022486 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:21.022458 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-cp5gl"] Apr 16 20:20:21.105453 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:21.105415 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/bbf78a35-8796-4d5a-8178-b95dfe345d83-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-cp5gl\" (UID: \"bbf78a35-8796-4d5a-8178-b95dfe345d83\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-cp5gl" Apr 16 20:20:21.105453 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:21.105457 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtwmn\" (UniqueName: \"kubernetes.io/projected/bbf78a35-8796-4d5a-8178-b95dfe345d83-kube-api-access-qtwmn\") pod \"seaweedfs-tls-custom-ddd4dbfd-cp5gl\" (UID: \"bbf78a35-8796-4d5a-8178-b95dfe345d83\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-cp5gl" Apr 16 20:20:21.206343 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:21.206297 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/bbf78a35-8796-4d5a-8178-b95dfe345d83-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-cp5gl\" (UID: \"bbf78a35-8796-4d5a-8178-b95dfe345d83\") " 
pod="kserve/seaweedfs-tls-custom-ddd4dbfd-cp5gl" Apr 16 20:20:21.206343 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:21.206348 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qtwmn\" (UniqueName: \"kubernetes.io/projected/bbf78a35-8796-4d5a-8178-b95dfe345d83-kube-api-access-qtwmn\") pod \"seaweedfs-tls-custom-ddd4dbfd-cp5gl\" (UID: \"bbf78a35-8796-4d5a-8178-b95dfe345d83\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-cp5gl" Apr 16 20:20:21.206717 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:21.206695 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/bbf78a35-8796-4d5a-8178-b95dfe345d83-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-cp5gl\" (UID: \"bbf78a35-8796-4d5a-8178-b95dfe345d83\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-cp5gl" Apr 16 20:20:21.214955 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:21.214925 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtwmn\" (UniqueName: \"kubernetes.io/projected/bbf78a35-8796-4d5a-8178-b95dfe345d83-kube-api-access-qtwmn\") pod \"seaweedfs-tls-custom-ddd4dbfd-cp5gl\" (UID: \"bbf78a35-8796-4d5a-8178-b95dfe345d83\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-cp5gl" Apr 16 20:20:21.324046 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:21.323954 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-cp5gl" Apr 16 20:20:21.444051 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:21.444016 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-cp5gl"] Apr 16 20:20:21.447023 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:20:21.446994 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbf78a35_8796_4d5a_8178_b95dfe345d83.slice/crio-fd4c72fc5f57afb3fb6b9cfac0e71736d9a7d6eec3e5a5a614f0b28f121cd13a WatchSource:0}: Error finding container fd4c72fc5f57afb3fb6b9cfac0e71736d9a7d6eec3e5a5a614f0b28f121cd13a: Status 404 returned error can't find the container with id fd4c72fc5f57afb3fb6b9cfac0e71736d9a7d6eec3e5a5a614f0b28f121cd13a Apr 16 20:20:22.266481 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:22.266448 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-cp5gl" event={"ID":"bbf78a35-8796-4d5a-8178-b95dfe345d83","Type":"ContainerStarted","Data":"fd4c72fc5f57afb3fb6b9cfac0e71736d9a7d6eec3e5a5a614f0b28f121cd13a"} Apr 16 20:20:24.273689 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:24.273651 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-cp5gl" event={"ID":"bbf78a35-8796-4d5a-8178-b95dfe345d83","Type":"ContainerStarted","Data":"6ca1fd4967ebe5e1ee00f313c94e63431fda2619a4a416df8d395cbfd868e1c1"} Apr 16 20:20:24.290157 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:24.290096 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-cp5gl" podStartSLOduration=1.8500114189999999 podStartE2EDuration="4.290081921s" podCreationTimestamp="2026-04-16 20:20:20 +0000 UTC" firstStartedPulling="2026-04-16 20:20:21.448220789 +0000 UTC m=+504.289121157" lastFinishedPulling="2026-04-16 20:20:23.888291277 +0000 UTC m=+506.729191659" 
observedRunningTime="2026-04-16 20:20:24.289396701 +0000 UTC m=+507.130297083" watchObservedRunningTime="2026-04-16 20:20:24.290081921 +0000 UTC m=+507.130982310" Apr 16 20:20:26.016060 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:26.016025 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-cp5gl"] Apr 16 20:20:26.279489 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:26.279338 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-cp5gl" podUID="bbf78a35-8796-4d5a-8178-b95dfe345d83" containerName="seaweedfs-tls-custom" containerID="cri-o://6ca1fd4967ebe5e1ee00f313c94e63431fda2619a4a416df8d395cbfd868e1c1" gracePeriod=30 Apr 16 20:20:27.506430 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:27.506408 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-cp5gl" Apr 16 20:20:27.558862 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:27.558781 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtwmn\" (UniqueName: \"kubernetes.io/projected/bbf78a35-8796-4d5a-8178-b95dfe345d83-kube-api-access-qtwmn\") pod \"bbf78a35-8796-4d5a-8178-b95dfe345d83\" (UID: \"bbf78a35-8796-4d5a-8178-b95dfe345d83\") " Apr 16 20:20:27.558999 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:27.558871 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/bbf78a35-8796-4d5a-8178-b95dfe345d83-data\") pod \"bbf78a35-8796-4d5a-8178-b95dfe345d83\" (UID: \"bbf78a35-8796-4d5a-8178-b95dfe345d83\") " Apr 16 20:20:27.560168 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:27.560145 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbf78a35-8796-4d5a-8178-b95dfe345d83-data" (OuterVolumeSpecName: "data") pod "bbf78a35-8796-4d5a-8178-b95dfe345d83" (UID: 
"bbf78a35-8796-4d5a-8178-b95dfe345d83"). InnerVolumeSpecName "data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:20:27.560905 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:27.560880 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbf78a35-8796-4d5a-8178-b95dfe345d83-kube-api-access-qtwmn" (OuterVolumeSpecName: "kube-api-access-qtwmn") pod "bbf78a35-8796-4d5a-8178-b95dfe345d83" (UID: "bbf78a35-8796-4d5a-8178-b95dfe345d83"). InnerVolumeSpecName "kube-api-access-qtwmn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:20:27.659613 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:27.659579 2572 reconciler_common.go:299] "Volume detached for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/bbf78a35-8796-4d5a-8178-b95dfe345d83-data\") on node \"ip-10-0-137-142.ec2.internal\" DevicePath \"\"" Apr 16 20:20:27.659613 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:27.659608 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qtwmn\" (UniqueName: \"kubernetes.io/projected/bbf78a35-8796-4d5a-8178-b95dfe345d83-kube-api-access-qtwmn\") on node \"ip-10-0-137-142.ec2.internal\" DevicePath \"\"" Apr 16 20:20:28.285661 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:28.285629 2572 generic.go:358] "Generic (PLEG): container finished" podID="bbf78a35-8796-4d5a-8178-b95dfe345d83" containerID="6ca1fd4967ebe5e1ee00f313c94e63431fda2619a4a416df8d395cbfd868e1c1" exitCode=0 Apr 16 20:20:28.285832 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:28.285670 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-cp5gl" event={"ID":"bbf78a35-8796-4d5a-8178-b95dfe345d83","Type":"ContainerDied","Data":"6ca1fd4967ebe5e1ee00f313c94e63431fda2619a4a416df8d395cbfd868e1c1"} Apr 16 20:20:28.285832 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:28.285685 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-cp5gl" Apr 16 20:20:28.285832 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:28.285701 2572 scope.go:117] "RemoveContainer" containerID="6ca1fd4967ebe5e1ee00f313c94e63431fda2619a4a416df8d395cbfd868e1c1" Apr 16 20:20:28.285832 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:28.285691 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-cp5gl" event={"ID":"bbf78a35-8796-4d5a-8178-b95dfe345d83","Type":"ContainerDied","Data":"fd4c72fc5f57afb3fb6b9cfac0e71736d9a7d6eec3e5a5a614f0b28f121cd13a"} Apr 16 20:20:28.294483 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:28.294459 2572 scope.go:117] "RemoveContainer" containerID="6ca1fd4967ebe5e1ee00f313c94e63431fda2619a4a416df8d395cbfd868e1c1" Apr 16 20:20:28.294735 ip-10-0-137-142 kubenswrapper[2572]: E0416 20:20:28.294715 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ca1fd4967ebe5e1ee00f313c94e63431fda2619a4a416df8d395cbfd868e1c1\": container with ID starting with 6ca1fd4967ebe5e1ee00f313c94e63431fda2619a4a416df8d395cbfd868e1c1 not found: ID does not exist" containerID="6ca1fd4967ebe5e1ee00f313c94e63431fda2619a4a416df8d395cbfd868e1c1" Apr 16 20:20:28.294806 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:28.294747 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ca1fd4967ebe5e1ee00f313c94e63431fda2619a4a416df8d395cbfd868e1c1"} err="failed to get container status \"6ca1fd4967ebe5e1ee00f313c94e63431fda2619a4a416df8d395cbfd868e1c1\": rpc error: code = NotFound desc = could not find container \"6ca1fd4967ebe5e1ee00f313c94e63431fda2619a4a416df8d395cbfd868e1c1\": container with ID starting with 6ca1fd4967ebe5e1ee00f313c94e63431fda2619a4a416df8d395cbfd868e1c1 not found: ID does not exist" Apr 16 20:20:28.302145 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:28.302119 2572 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-cp5gl"] Apr 16 20:20:28.306104 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:28.306083 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-cp5gl"] Apr 16 20:20:29.762658 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:29.762623 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbf78a35-8796-4d5a-8178-b95dfe345d83" path="/var/lib/kubelet/pods/bbf78a35-8796-4d5a-8178-b95dfe345d83/volumes" Apr 16 20:20:32.187254 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:32.187218 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-custom-w6pl6"] Apr 16 20:20:32.187640 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:32.187519 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bbf78a35-8796-4d5a-8178-b95dfe345d83" containerName="seaweedfs-tls-custom" Apr 16 20:20:32.187640 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:32.187530 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf78a35-8796-4d5a-8178-b95dfe345d83" containerName="seaweedfs-tls-custom" Apr 16 20:20:32.187640 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:32.187600 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="bbf78a35-8796-4d5a-8178-b95dfe345d83" containerName="seaweedfs-tls-custom" Apr 16 20:20:32.192130 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:32.192111 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-w6pl6" Apr 16 20:20:32.194599 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:32.194576 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 16 20:20:32.194982 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:32.194961 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-wl6nb\"" Apr 16 20:20:32.195093 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:32.195054 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-w6pl6"] Apr 16 20:20:32.292031 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:32.291987 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx6w2\" (UniqueName: \"kubernetes.io/projected/8da9ec91-abb1-4b12-9a66-cf545376deec-kube-api-access-jx6w2\") pod \"s3-tls-init-custom-w6pl6\" (UID: \"8da9ec91-abb1-4b12-9a66-cf545376deec\") " pod="kserve/s3-tls-init-custom-w6pl6" Apr 16 20:20:32.392995 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:32.392958 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jx6w2\" (UniqueName: \"kubernetes.io/projected/8da9ec91-abb1-4b12-9a66-cf545376deec-kube-api-access-jx6w2\") pod \"s3-tls-init-custom-w6pl6\" (UID: \"8da9ec91-abb1-4b12-9a66-cf545376deec\") " pod="kserve/s3-tls-init-custom-w6pl6" Apr 16 20:20:32.400760 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:32.400735 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx6w2\" (UniqueName: \"kubernetes.io/projected/8da9ec91-abb1-4b12-9a66-cf545376deec-kube-api-access-jx6w2\") pod \"s3-tls-init-custom-w6pl6\" (UID: \"8da9ec91-abb1-4b12-9a66-cf545376deec\") " pod="kserve/s3-tls-init-custom-w6pl6" Apr 16 20:20:32.513194 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:32.513154 2572 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-w6pl6" Apr 16 20:20:32.641832 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:32.641801 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-w6pl6"] Apr 16 20:20:32.644455 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:20:32.644426 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8da9ec91_abb1_4b12_9a66_cf545376deec.slice/crio-5ae17d5973f50d382c825947280daaa927fba064d12e82fe64766039d9dec84e WatchSource:0}: Error finding container 5ae17d5973f50d382c825947280daaa927fba064d12e82fe64766039d9dec84e: Status 404 returned error can't find the container with id 5ae17d5973f50d382c825947280daaa927fba064d12e82fe64766039d9dec84e Apr 16 20:20:33.301449 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:33.301414 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-w6pl6" event={"ID":"8da9ec91-abb1-4b12-9a66-cf545376deec","Type":"ContainerStarted","Data":"070dc56f2ff3a7810eb88fe5f3510b390a7c0af475a6f374cee6cc12c5fb19af"} Apr 16 20:20:33.301449 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:33.301450 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-w6pl6" event={"ID":"8da9ec91-abb1-4b12-9a66-cf545376deec","Type":"ContainerStarted","Data":"5ae17d5973f50d382c825947280daaa927fba064d12e82fe64766039d9dec84e"} Apr 16 20:20:33.318061 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:33.318004 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-custom-w6pl6" podStartSLOduration=1.317990731 podStartE2EDuration="1.317990731s" podCreationTimestamp="2026-04-16 20:20:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:20:33.316568653 +0000 UTC m=+516.157469043" 
watchObservedRunningTime="2026-04-16 20:20:33.317990731 +0000 UTC m=+516.158891121" Apr 16 20:20:38.316624 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:38.316590 2572 generic.go:358] "Generic (PLEG): container finished" podID="8da9ec91-abb1-4b12-9a66-cf545376deec" containerID="070dc56f2ff3a7810eb88fe5f3510b390a7c0af475a6f374cee6cc12c5fb19af" exitCode=0 Apr 16 20:20:38.316624 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:38.316630 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-w6pl6" event={"ID":"8da9ec91-abb1-4b12-9a66-cf545376deec","Type":"ContainerDied","Data":"070dc56f2ff3a7810eb88fe5f3510b390a7c0af475a6f374cee6cc12c5fb19af"} Apr 16 20:20:39.436284 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:39.436262 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-w6pl6" Apr 16 20:20:39.550582 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:39.550552 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jx6w2\" (UniqueName: \"kubernetes.io/projected/8da9ec91-abb1-4b12-9a66-cf545376deec-kube-api-access-jx6w2\") pod \"8da9ec91-abb1-4b12-9a66-cf545376deec\" (UID: \"8da9ec91-abb1-4b12-9a66-cf545376deec\") " Apr 16 20:20:39.552674 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:39.552651 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8da9ec91-abb1-4b12-9a66-cf545376deec-kube-api-access-jx6w2" (OuterVolumeSpecName: "kube-api-access-jx6w2") pod "8da9ec91-abb1-4b12-9a66-cf545376deec" (UID: "8da9ec91-abb1-4b12-9a66-cf545376deec"). InnerVolumeSpecName "kube-api-access-jx6w2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:20:39.651483 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:39.651409 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jx6w2\" (UniqueName: \"kubernetes.io/projected/8da9ec91-abb1-4b12-9a66-cf545376deec-kube-api-access-jx6w2\") on node \"ip-10-0-137-142.ec2.internal\" DevicePath \"\"" Apr 16 20:20:40.323792 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:40.323759 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-w6pl6" event={"ID":"8da9ec91-abb1-4b12-9a66-cf545376deec","Type":"ContainerDied","Data":"5ae17d5973f50d382c825947280daaa927fba064d12e82fe64766039d9dec84e"} Apr 16 20:20:40.323792 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:40.323793 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ae17d5973f50d382c825947280daaa927fba064d12e82fe64766039d9dec84e" Apr 16 20:20:40.324074 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:40.323800 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-w6pl6" Apr 16 20:20:42.476743 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:42.476703 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-serving-c9pnp"] Apr 16 20:20:42.477188 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:42.477131 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8da9ec91-abb1-4b12-9a66-cf545376deec" containerName="s3-tls-init-custom" Apr 16 20:20:42.477188 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:42.477150 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="8da9ec91-abb1-4b12-9a66-cf545376deec" containerName="s3-tls-init-custom" Apr 16 20:20:42.477293 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:42.477221 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="8da9ec91-abb1-4b12-9a66-cf545376deec" containerName="s3-tls-init-custom" Apr 16 20:20:42.481456 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:42.481435 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-c9pnp" Apr 16 20:20:42.484082 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:42.483850 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 16 20:20:42.484623 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:42.484602 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-wl6nb\"" Apr 16 20:20:42.485441 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:42.485419 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-c9pnp"] Apr 16 20:20:42.575708 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:42.575665 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr5s2\" (UniqueName: \"kubernetes.io/projected/47d5476e-8e4f-4ae8-837c-38e7674327f6-kube-api-access-tr5s2\") pod \"s3-tls-init-serving-c9pnp\" (UID: \"47d5476e-8e4f-4ae8-837c-38e7674327f6\") " pod="kserve/s3-tls-init-serving-c9pnp" Apr 16 20:20:42.676052 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:42.676008 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tr5s2\" (UniqueName: \"kubernetes.io/projected/47d5476e-8e4f-4ae8-837c-38e7674327f6-kube-api-access-tr5s2\") pod \"s3-tls-init-serving-c9pnp\" (UID: \"47d5476e-8e4f-4ae8-837c-38e7674327f6\") " pod="kserve/s3-tls-init-serving-c9pnp" Apr 16 20:20:42.684266 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:42.684242 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr5s2\" (UniqueName: \"kubernetes.io/projected/47d5476e-8e4f-4ae8-837c-38e7674327f6-kube-api-access-tr5s2\") pod \"s3-tls-init-serving-c9pnp\" (UID: \"47d5476e-8e4f-4ae8-837c-38e7674327f6\") " pod="kserve/s3-tls-init-serving-c9pnp" Apr 16 20:20:42.800664 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:42.800592 2572 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-c9pnp" Apr 16 20:20:42.918564 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:42.918536 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-c9pnp"] Apr 16 20:20:42.920866 ip-10-0-137-142 kubenswrapper[2572]: W0416 20:20:42.920840 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47d5476e_8e4f_4ae8_837c_38e7674327f6.slice/crio-e00baf00815ec1f499abc37e9523c03305b01ada2b065f8bb23116d4c5d5bfb4 WatchSource:0}: Error finding container e00baf00815ec1f499abc37e9523c03305b01ada2b065f8bb23116d4c5d5bfb4: Status 404 returned error can't find the container with id e00baf00815ec1f499abc37e9523c03305b01ada2b065f8bb23116d4c5d5bfb4 Apr 16 20:20:43.333488 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:43.333455 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-c9pnp" event={"ID":"47d5476e-8e4f-4ae8-837c-38e7674327f6","Type":"ContainerStarted","Data":"5bbb568a0e60cbbe71338f72e946d6c326b2ff6c61ec330758fae2d374858cdf"} Apr 16 20:20:43.333488 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:43.333494 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-c9pnp" event={"ID":"47d5476e-8e4f-4ae8-837c-38e7674327f6","Type":"ContainerStarted","Data":"e00baf00815ec1f499abc37e9523c03305b01ada2b065f8bb23116d4c5d5bfb4"} Apr 16 20:20:43.349203 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:43.349151 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-serving-c9pnp" podStartSLOduration=1.3491388469999999 podStartE2EDuration="1.349138847s" podCreationTimestamp="2026-04-16 20:20:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:20:43.347177088 +0000 UTC m=+526.188077479" 
watchObservedRunningTime="2026-04-16 20:20:43.349138847 +0000 UTC m=+526.190039237" Apr 16 20:20:47.347071 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:47.347035 2572 generic.go:358] "Generic (PLEG): container finished" podID="47d5476e-8e4f-4ae8-837c-38e7674327f6" containerID="5bbb568a0e60cbbe71338f72e946d6c326b2ff6c61ec330758fae2d374858cdf" exitCode=0 Apr 16 20:20:47.347513 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:47.347115 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-c9pnp" event={"ID":"47d5476e-8e4f-4ae8-837c-38e7674327f6","Type":"ContainerDied","Data":"5bbb568a0e60cbbe71338f72e946d6c326b2ff6c61ec330758fae2d374858cdf"} Apr 16 20:20:48.474688 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:48.474657 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-c9pnp" Apr 16 20:20:48.525698 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:48.525660 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tr5s2\" (UniqueName: \"kubernetes.io/projected/47d5476e-8e4f-4ae8-837c-38e7674327f6-kube-api-access-tr5s2\") pod \"47d5476e-8e4f-4ae8-837c-38e7674327f6\" (UID: \"47d5476e-8e4f-4ae8-837c-38e7674327f6\") " Apr 16 20:20:48.527865 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:48.527837 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47d5476e-8e4f-4ae8-837c-38e7674327f6-kube-api-access-tr5s2" (OuterVolumeSpecName: "kube-api-access-tr5s2") pod "47d5476e-8e4f-4ae8-837c-38e7674327f6" (UID: "47d5476e-8e4f-4ae8-837c-38e7674327f6"). InnerVolumeSpecName "kube-api-access-tr5s2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:20:48.627057 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:48.626981 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tr5s2\" (UniqueName: \"kubernetes.io/projected/47d5476e-8e4f-4ae8-837c-38e7674327f6-kube-api-access-tr5s2\") on node \"ip-10-0-137-142.ec2.internal\" DevicePath \"\"" Apr 16 20:20:49.354547 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:49.354514 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-c9pnp" Apr 16 20:20:49.354742 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:49.354519 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-c9pnp" event={"ID":"47d5476e-8e4f-4ae8-837c-38e7674327f6","Type":"ContainerDied","Data":"e00baf00815ec1f499abc37e9523c03305b01ada2b065f8bb23116d4c5d5bfb4"} Apr 16 20:20:49.354742 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:20:49.354622 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e00baf00815ec1f499abc37e9523c03305b01ada2b065f8bb23116d4c5d5bfb4" Apr 16 20:21:57.666292 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:21:57.666262 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4pl4_80a9a559-faef-43f2-ae15-5d0a784691b5/console-operator/2.log" Apr 16 20:21:57.667192 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:21:57.667172 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4pl4_80a9a559-faef-43f2-ae15-5d0a784691b5/console-operator/2.log" Apr 16 20:26:57.690187 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:26:57.690114 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4pl4_80a9a559-faef-43f2-ae15-5d0a784691b5/console-operator/2.log" Apr 16 20:26:57.691086 ip-10-0-137-142 
kubenswrapper[2572]: I0416 20:26:57.691065 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4pl4_80a9a559-faef-43f2-ae15-5d0a784691b5/console-operator/2.log" Apr 16 20:31:57.712411 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:31:57.712366 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4pl4_80a9a559-faef-43f2-ae15-5d0a784691b5/console-operator/2.log" Apr 16 20:31:57.714627 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:31:57.714604 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4pl4_80a9a559-faef-43f2-ae15-5d0a784691b5/console-operator/2.log" Apr 16 20:36:57.733084 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:36:57.733052 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4pl4_80a9a559-faef-43f2-ae15-5d0a784691b5/console-operator/2.log" Apr 16 20:36:57.735241 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:36:57.735216 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4pl4_80a9a559-faef-43f2-ae15-5d0a784691b5/console-operator/2.log" Apr 16 20:41:57.755973 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:41:57.755943 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4pl4_80a9a559-faef-43f2-ae15-5d0a784691b5/console-operator/2.log" Apr 16 20:41:57.758139 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:41:57.758115 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4pl4_80a9a559-faef-43f2-ae15-5d0a784691b5/console-operator/2.log" Apr 16 20:46:57.781289 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:46:57.781259 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4pl4_80a9a559-faef-43f2-ae15-5d0a784691b5/console-operator/2.log" Apr 16 20:46:57.783713 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:46:57.783687 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4pl4_80a9a559-faef-43f2-ae15-5d0a784691b5/console-operator/2.log" Apr 16 20:51:57.803028 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:51:57.802998 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4pl4_80a9a559-faef-43f2-ae15-5d0a784691b5/console-operator/2.log" Apr 16 20:51:57.806291 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:51:57.806263 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4pl4_80a9a559-faef-43f2-ae15-5d0a784691b5/console-operator/2.log" Apr 16 20:56:57.831974 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:56:57.831892 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4pl4_80a9a559-faef-43f2-ae15-5d0a784691b5/console-operator/2.log" Apr 16 20:56:57.833889 ip-10-0-137-142 kubenswrapper[2572]: I0416 20:56:57.833865 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4pl4_80a9a559-faef-43f2-ae15-5d0a784691b5/console-operator/2.log" Apr 16 21:01:57.851650 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:01:57.851615 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4pl4_80a9a559-faef-43f2-ae15-5d0a784691b5/console-operator/2.log" Apr 16 21:01:57.855517 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:01:57.855490 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4pl4_80a9a559-faef-43f2-ae15-5d0a784691b5/console-operator/2.log" Apr 16 21:06:57.872795 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:06:57.872763 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4pl4_80a9a559-faef-43f2-ae15-5d0a784691b5/console-operator/2.log" Apr 16 21:06:57.878425 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:06:57.878400 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4pl4_80a9a559-faef-43f2-ae15-5d0a784691b5/console-operator/2.log" Apr 16 21:11:34.209436 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:11:34.209402 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-cp42t/must-gather-97ghz"] Apr 16 21:11:34.210041 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:11:34.209726 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="47d5476e-8e4f-4ae8-837c-38e7674327f6" containerName="s3-tls-init-serving" Apr 16 21:11:34.210041 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:11:34.209736 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="47d5476e-8e4f-4ae8-837c-38e7674327f6" containerName="s3-tls-init-serving" Apr 16 21:11:34.210041 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:11:34.209780 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="47d5476e-8e4f-4ae8-837c-38e7674327f6" containerName="s3-tls-init-serving" Apr 16 21:11:34.212788 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:11:34.212772 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cp42t/must-gather-97ghz" Apr 16 21:11:34.214977 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:11:34.214957 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-cp42t\"/\"kube-root-ca.crt\"" Apr 16 21:11:34.215070 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:11:34.214957 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-cp42t\"/\"openshift-service-ca.crt\"" Apr 16 21:11:34.219573 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:11:34.219505 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-cp42t/must-gather-97ghz"] Apr 16 21:11:34.344142 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:11:34.344104 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l26hx\" (UniqueName: \"kubernetes.io/projected/05005ec0-6826-432a-89c2-7243bde22e11-kube-api-access-l26hx\") pod \"must-gather-97ghz\" (UID: \"05005ec0-6826-432a-89c2-7243bde22e11\") " pod="openshift-must-gather-cp42t/must-gather-97ghz" Apr 16 21:11:34.344285 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:11:34.344209 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/05005ec0-6826-432a-89c2-7243bde22e11-must-gather-output\") pod \"must-gather-97ghz\" (UID: \"05005ec0-6826-432a-89c2-7243bde22e11\") " pod="openshift-must-gather-cp42t/must-gather-97ghz" Apr 16 21:11:34.444752 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:11:34.444725 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/05005ec0-6826-432a-89c2-7243bde22e11-must-gather-output\") pod \"must-gather-97ghz\" (UID: \"05005ec0-6826-432a-89c2-7243bde22e11\") " pod="openshift-must-gather-cp42t/must-gather-97ghz" Apr 16 
21:11:34.444890 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:11:34.444765 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l26hx\" (UniqueName: \"kubernetes.io/projected/05005ec0-6826-432a-89c2-7243bde22e11-kube-api-access-l26hx\") pod \"must-gather-97ghz\" (UID: \"05005ec0-6826-432a-89c2-7243bde22e11\") " pod="openshift-must-gather-cp42t/must-gather-97ghz" Apr 16 21:11:34.445088 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:11:34.445064 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/05005ec0-6826-432a-89c2-7243bde22e11-must-gather-output\") pod \"must-gather-97ghz\" (UID: \"05005ec0-6826-432a-89c2-7243bde22e11\") " pod="openshift-must-gather-cp42t/must-gather-97ghz" Apr 16 21:11:34.453219 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:11:34.453189 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l26hx\" (UniqueName: \"kubernetes.io/projected/05005ec0-6826-432a-89c2-7243bde22e11-kube-api-access-l26hx\") pod \"must-gather-97ghz\" (UID: \"05005ec0-6826-432a-89c2-7243bde22e11\") " pod="openshift-must-gather-cp42t/must-gather-97ghz" Apr 16 21:11:34.534891 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:11:34.534812 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cp42t/must-gather-97ghz" Apr 16 21:11:34.649745 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:11:34.649715 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-cp42t/must-gather-97ghz"] Apr 16 21:11:34.653164 ip-10-0-137-142 kubenswrapper[2572]: W0416 21:11:34.653131 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05005ec0_6826_432a_89c2_7243bde22e11.slice/crio-c3d5eaccacb1c24568399e5c5e079184ab6c987459e43fb566aaf6dff5962313 WatchSource:0}: Error finding container c3d5eaccacb1c24568399e5c5e079184ab6c987459e43fb566aaf6dff5962313: Status 404 returned error can't find the container with id c3d5eaccacb1c24568399e5c5e079184ab6c987459e43fb566aaf6dff5962313 Apr 16 21:11:34.654645 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:11:34.654630 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 21:11:34.999155 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:11:34.999122 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cp42t/must-gather-97ghz" event={"ID":"05005ec0-6826-432a-89c2-7243bde22e11","Type":"ContainerStarted","Data":"c3d5eaccacb1c24568399e5c5e079184ab6c987459e43fb566aaf6dff5962313"} Apr 16 21:11:40.015944 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:11:40.015903 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cp42t/must-gather-97ghz" event={"ID":"05005ec0-6826-432a-89c2-7243bde22e11","Type":"ContainerStarted","Data":"052b15e8716fad57470908522245cbb6f380e320fe3bc07e11d2957bdbb0a386"} Apr 16 21:11:40.015944 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:11:40.015944 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cp42t/must-gather-97ghz" 
event={"ID":"05005ec0-6826-432a-89c2-7243bde22e11","Type":"ContainerStarted","Data":"b58c4594faac86b0a2f606e5b2d2a4d16a3cea014b004d72846a5587385e5ac9"} Apr 16 21:11:40.031790 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:11:40.031743 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-cp42t/must-gather-97ghz" podStartSLOduration=1.622151706 podStartE2EDuration="6.031730706s" podCreationTimestamp="2026-04-16 21:11:34 +0000 UTC" firstStartedPulling="2026-04-16 21:11:34.654771388 +0000 UTC m=+3577.495671756" lastFinishedPulling="2026-04-16 21:11:39.064350388 +0000 UTC m=+3581.905250756" observedRunningTime="2026-04-16 21:11:40.029724646 +0000 UTC m=+3582.870625034" watchObservedRunningTime="2026-04-16 21:11:40.031730706 +0000 UTC m=+3582.872631092" Apr 16 21:11:57.900855 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:11:57.900827 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4pl4_80a9a559-faef-43f2-ae15-5d0a784691b5/console-operator/2.log" Apr 16 21:11:57.905012 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:11:57.904988 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4pl4_80a9a559-faef-43f2-ae15-5d0a784691b5/console-operator/2.log" Apr 16 21:11:59.075420 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:11:59.075351 2572 generic.go:358] "Generic (PLEG): container finished" podID="05005ec0-6826-432a-89c2-7243bde22e11" containerID="b58c4594faac86b0a2f606e5b2d2a4d16a3cea014b004d72846a5587385e5ac9" exitCode=0 Apr 16 21:11:59.075804 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:11:59.075436 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cp42t/must-gather-97ghz" event={"ID":"05005ec0-6826-432a-89c2-7243bde22e11","Type":"ContainerDied","Data":"b58c4594faac86b0a2f606e5b2d2a4d16a3cea014b004d72846a5587385e5ac9"} Apr 16 21:11:59.075804 ip-10-0-137-142 
kubenswrapper[2572]: I0416 21:11:59.075742 2572 scope.go:117] "RemoveContainer" containerID="b58c4594faac86b0a2f606e5b2d2a4d16a3cea014b004d72846a5587385e5ac9" Apr 16 21:11:59.568216 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:11:59.568183 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-cp42t_must-gather-97ghz_05005ec0-6826-432a-89c2-7243bde22e11/gather/0.log" Apr 16 21:12:03.040151 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:03.040116 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-66fvr_fd643302-fe55-4675-ba96-c6a539df4ac8/global-pull-secret-syncer/0.log" Apr 16 21:12:03.241062 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:03.241033 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-skfgt_f067f531-c6e2-4ac4-a01e-8aa1872f6296/konnectivity-agent/0.log" Apr 16 21:12:03.320224 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:03.320153 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-142.ec2.internal_df511bf53f5da06c359fd97d0761730a/haproxy/0.log" Apr 16 21:12:05.036503 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:05.036474 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-cp42t/must-gather-97ghz"] Apr 16 21:12:05.036902 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:05.036674 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-cp42t/must-gather-97ghz" podUID="05005ec0-6826-432a-89c2-7243bde22e11" containerName="copy" containerID="cri-o://052b15e8716fad57470908522245cbb6f380e320fe3bc07e11d2957bdbb0a386" gracePeriod=2 Apr 16 21:12:05.041173 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:05.041150 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-cp42t/must-gather-97ghz"] Apr 16 21:12:05.258875 ip-10-0-137-142 kubenswrapper[2572]: I0416 
21:12:05.258850 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-cp42t_must-gather-97ghz_05005ec0-6826-432a-89c2-7243bde22e11/copy/0.log" Apr 16 21:12:05.259239 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:05.259221 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cp42t/must-gather-97ghz" Apr 16 21:12:05.261146 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:05.261125 2572 status_manager.go:895] "Failed to get status for pod" podUID="05005ec0-6826-432a-89c2-7243bde22e11" pod="openshift-must-gather-cp42t/must-gather-97ghz" err="pods \"must-gather-97ghz\" is forbidden: User \"system:node:ip-10-0-137-142.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-cp42t\": no relationship found between node 'ip-10-0-137-142.ec2.internal' and this object" Apr 16 21:12:05.426239 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:05.426149 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l26hx\" (UniqueName: \"kubernetes.io/projected/05005ec0-6826-432a-89c2-7243bde22e11-kube-api-access-l26hx\") pod \"05005ec0-6826-432a-89c2-7243bde22e11\" (UID: \"05005ec0-6826-432a-89c2-7243bde22e11\") " Apr 16 21:12:05.426239 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:05.426207 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/05005ec0-6826-432a-89c2-7243bde22e11-must-gather-output\") pod \"05005ec0-6826-432a-89c2-7243bde22e11\" (UID: \"05005ec0-6826-432a-89c2-7243bde22e11\") " Apr 16 21:12:05.427731 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:05.427701 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05005ec0-6826-432a-89c2-7243bde22e11-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod 
"05005ec0-6826-432a-89c2-7243bde22e11" (UID: "05005ec0-6826-432a-89c2-7243bde22e11"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 21:12:05.428504 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:05.428483 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05005ec0-6826-432a-89c2-7243bde22e11-kube-api-access-l26hx" (OuterVolumeSpecName: "kube-api-access-l26hx") pod "05005ec0-6826-432a-89c2-7243bde22e11" (UID: "05005ec0-6826-432a-89c2-7243bde22e11"). InnerVolumeSpecName "kube-api-access-l26hx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 21:12:05.527281 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:05.527249 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l26hx\" (UniqueName: \"kubernetes.io/projected/05005ec0-6826-432a-89c2-7243bde22e11-kube-api-access-l26hx\") on node \"ip-10-0-137-142.ec2.internal\" DevicePath \"\"" Apr 16 21:12:05.527281 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:05.527277 2572 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/05005ec0-6826-432a-89c2-7243bde22e11-must-gather-output\") on node \"ip-10-0-137-142.ec2.internal\" DevicePath \"\"" Apr 16 21:12:05.762720 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:05.762688 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05005ec0-6826-432a-89c2-7243bde22e11" path="/var/lib/kubelet/pods/05005ec0-6826-432a-89c2-7243bde22e11/volumes" Apr 16 21:12:06.096222 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:06.096147 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-cp42t_must-gather-97ghz_05005ec0-6826-432a-89c2-7243bde22e11/copy/0.log" Apr 16 21:12:06.096672 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:06.096470 2572 generic.go:358] "Generic (PLEG): container finished" 
podID="05005ec0-6826-432a-89c2-7243bde22e11" containerID="052b15e8716fad57470908522245cbb6f380e320fe3bc07e11d2957bdbb0a386" exitCode=143 Apr 16 21:12:06.096672 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:06.096512 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cp42t/must-gather-97ghz" Apr 16 21:12:06.096672 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:06.096561 2572 scope.go:117] "RemoveContainer" containerID="052b15e8716fad57470908522245cbb6f380e320fe3bc07e11d2957bdbb0a386" Apr 16 21:12:06.104178 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:06.104155 2572 scope.go:117] "RemoveContainer" containerID="b58c4594faac86b0a2f606e5b2d2a4d16a3cea014b004d72846a5587385e5ac9" Apr 16 21:12:06.116561 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:06.116538 2572 scope.go:117] "RemoveContainer" containerID="052b15e8716fad57470908522245cbb6f380e320fe3bc07e11d2957bdbb0a386" Apr 16 21:12:06.116841 ip-10-0-137-142 kubenswrapper[2572]: E0416 21:12:06.116820 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"052b15e8716fad57470908522245cbb6f380e320fe3bc07e11d2957bdbb0a386\": container with ID starting with 052b15e8716fad57470908522245cbb6f380e320fe3bc07e11d2957bdbb0a386 not found: ID does not exist" containerID="052b15e8716fad57470908522245cbb6f380e320fe3bc07e11d2957bdbb0a386" Apr 16 21:12:06.116903 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:06.116850 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"052b15e8716fad57470908522245cbb6f380e320fe3bc07e11d2957bdbb0a386"} err="failed to get container status \"052b15e8716fad57470908522245cbb6f380e320fe3bc07e11d2957bdbb0a386\": rpc error: code = NotFound desc = could not find container \"052b15e8716fad57470908522245cbb6f380e320fe3bc07e11d2957bdbb0a386\": container with ID starting with 
052b15e8716fad57470908522245cbb6f380e320fe3bc07e11d2957bdbb0a386 not found: ID does not exist" Apr 16 21:12:06.116903 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:06.116873 2572 scope.go:117] "RemoveContainer" containerID="b58c4594faac86b0a2f606e5b2d2a4d16a3cea014b004d72846a5587385e5ac9" Apr 16 21:12:06.117090 ip-10-0-137-142 kubenswrapper[2572]: E0416 21:12:06.117074 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b58c4594faac86b0a2f606e5b2d2a4d16a3cea014b004d72846a5587385e5ac9\": container with ID starting with b58c4594faac86b0a2f606e5b2d2a4d16a3cea014b004d72846a5587385e5ac9 not found: ID does not exist" containerID="b58c4594faac86b0a2f606e5b2d2a4d16a3cea014b004d72846a5587385e5ac9" Apr 16 21:12:06.117130 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:06.117095 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b58c4594faac86b0a2f606e5b2d2a4d16a3cea014b004d72846a5587385e5ac9"} err="failed to get container status \"b58c4594faac86b0a2f606e5b2d2a4d16a3cea014b004d72846a5587385e5ac9\": rpc error: code = NotFound desc = could not find container \"b58c4594faac86b0a2f606e5b2d2a4d16a3cea014b004d72846a5587385e5ac9\": container with ID starting with b58c4594faac86b0a2f606e5b2d2a4d16a3cea014b004d72846a5587385e5ac9 not found: ID does not exist" Apr 16 21:12:06.563332 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:06.563301 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c5bf8133-b462-4194-98ad-2c8df9714e07/alertmanager/0.log" Apr 16 21:12:06.595970 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:06.595943 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c5bf8133-b462-4194-98ad-2c8df9714e07/config-reloader/0.log" Apr 16 21:12:06.620961 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:06.620940 2572 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c5bf8133-b462-4194-98ad-2c8df9714e07/kube-rbac-proxy-web/0.log" Apr 16 21:12:06.644527 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:06.644507 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c5bf8133-b462-4194-98ad-2c8df9714e07/kube-rbac-proxy/0.log" Apr 16 21:12:06.671109 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:06.671068 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c5bf8133-b462-4194-98ad-2c8df9714e07/kube-rbac-proxy-metric/0.log" Apr 16 21:12:06.696347 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:06.696321 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c5bf8133-b462-4194-98ad-2c8df9714e07/prom-label-proxy/0.log" Apr 16 21:12:06.720192 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:06.720168 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c5bf8133-b462-4194-98ad-2c8df9714e07/init-config-reloader/0.log" Apr 16 21:12:06.930585 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:06.930508 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-6l62g_d756c161-ee06-43f3-8ef8-ba201a79c470/node-exporter/0.log" Apr 16 21:12:06.954103 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:06.954081 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-6l62g_d756c161-ee06-43f3-8ef8-ba201a79c470/kube-rbac-proxy/0.log" Apr 16 21:12:06.980647 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:06.980618 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-6l62g_d756c161-ee06-43f3-8ef8-ba201a79c470/init-textfile/0.log" Apr 16 21:12:07.243305 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:07.243277 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_28a79339-d1e7-41e6-9650-1c9f4e63fb75/prometheus/0.log" Apr 16 21:12:07.261655 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:07.261632 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_28a79339-d1e7-41e6-9650-1c9f4e63fb75/config-reloader/0.log" Apr 16 21:12:07.283657 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:07.283636 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_28a79339-d1e7-41e6-9650-1c9f4e63fb75/thanos-sidecar/0.log" Apr 16 21:12:07.305095 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:07.305076 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_28a79339-d1e7-41e6-9650-1c9f4e63fb75/kube-rbac-proxy-web/0.log" Apr 16 21:12:07.331464 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:07.331443 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_28a79339-d1e7-41e6-9650-1c9f4e63fb75/kube-rbac-proxy/0.log" Apr 16 21:12:07.355623 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:07.355605 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_28a79339-d1e7-41e6-9650-1c9f4e63fb75/kube-rbac-proxy-thanos/0.log" Apr 16 21:12:07.380036 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:07.380016 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_28a79339-d1e7-41e6-9650-1c9f4e63fb75/init-config-reloader/0.log" Apr 16 21:12:09.271235 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:09.271199 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4pl4_80a9a559-faef-43f2-ae15-5d0a784691b5/console-operator/2.log" Apr 16 21:12:09.275647 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:09.275623 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-j4pl4_80a9a559-faef-43f2-ae15-5d0a784691b5/console-operator/3.log" Apr 16 21:12:10.086663 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:10.086626 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-g6jdf/perf-node-gather-daemonset-94bmg"] Apr 16 21:12:10.086931 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:10.086919 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="05005ec0-6826-432a-89c2-7243bde22e11" containerName="copy" Apr 16 21:12:10.086974 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:10.086932 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="05005ec0-6826-432a-89c2-7243bde22e11" containerName="copy" Apr 16 21:12:10.086974 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:10.086945 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="05005ec0-6826-432a-89c2-7243bde22e11" containerName="gather" Apr 16 21:12:10.086974 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:10.086950 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="05005ec0-6826-432a-89c2-7243bde22e11" containerName="gather" Apr 16 21:12:10.087057 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:10.086998 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="05005ec0-6826-432a-89c2-7243bde22e11" containerName="copy" Apr 16 21:12:10.087057 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:10.087010 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="05005ec0-6826-432a-89c2-7243bde22e11" containerName="gather" Apr 16 21:12:10.089768 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:10.089747 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-94bmg" Apr 16 21:12:10.092295 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:10.092269 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-g6jdf\"/\"openshift-service-ca.crt\"" Apr 16 21:12:10.092413 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:10.092296 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-g6jdf\"/\"default-dockercfg-w7zrq\"" Apr 16 21:12:10.093084 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:10.093070 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-g6jdf\"/\"kube-root-ca.crt\"" Apr 16 21:12:10.098306 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:10.098284 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-g6jdf/perf-node-gather-daemonset-94bmg"] Apr 16 21:12:10.264398 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:10.264337 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4c10831d-a17c-478a-bb56-6675006abeb3-podres\") pod \"perf-node-gather-daemonset-94bmg\" (UID: \"4c10831d-a17c-478a-bb56-6675006abeb3\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-94bmg" Apr 16 21:12:10.264571 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:10.264420 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4c10831d-a17c-478a-bb56-6675006abeb3-sys\") pod \"perf-node-gather-daemonset-94bmg\" (UID: \"4c10831d-a17c-478a-bb56-6675006abeb3\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-94bmg" Apr 16 21:12:10.264571 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:10.264437 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-m6sk9\" (UniqueName: \"kubernetes.io/projected/4c10831d-a17c-478a-bb56-6675006abeb3-kube-api-access-m6sk9\") pod \"perf-node-gather-daemonset-94bmg\" (UID: \"4c10831d-a17c-478a-bb56-6675006abeb3\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-94bmg" Apr 16 21:12:10.264571 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:10.264465 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4c10831d-a17c-478a-bb56-6675006abeb3-proc\") pod \"perf-node-gather-daemonset-94bmg\" (UID: \"4c10831d-a17c-478a-bb56-6675006abeb3\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-94bmg" Apr 16 21:12:10.264571 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:10.264537 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4c10831d-a17c-478a-bb56-6675006abeb3-lib-modules\") pod \"perf-node-gather-daemonset-94bmg\" (UID: \"4c10831d-a17c-478a-bb56-6675006abeb3\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-94bmg" Apr 16 21:12:10.365757 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:10.365686 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4c10831d-a17c-478a-bb56-6675006abeb3-podres\") pod \"perf-node-gather-daemonset-94bmg\" (UID: \"4c10831d-a17c-478a-bb56-6675006abeb3\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-94bmg" Apr 16 21:12:10.365757 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:10.365740 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4c10831d-a17c-478a-bb56-6675006abeb3-sys\") pod \"perf-node-gather-daemonset-94bmg\" (UID: \"4c10831d-a17c-478a-bb56-6675006abeb3\") " 
pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-94bmg" Apr 16 21:12:10.366112 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:10.365788 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4c10831d-a17c-478a-bb56-6675006abeb3-sys\") pod \"perf-node-gather-daemonset-94bmg\" (UID: \"4c10831d-a17c-478a-bb56-6675006abeb3\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-94bmg" Apr 16 21:12:10.366112 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:10.365816 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6sk9\" (UniqueName: \"kubernetes.io/projected/4c10831d-a17c-478a-bb56-6675006abeb3-kube-api-access-m6sk9\") pod \"perf-node-gather-daemonset-94bmg\" (UID: \"4c10831d-a17c-478a-bb56-6675006abeb3\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-94bmg" Apr 16 21:12:10.366112 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:10.365832 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4c10831d-a17c-478a-bb56-6675006abeb3-proc\") pod \"perf-node-gather-daemonset-94bmg\" (UID: \"4c10831d-a17c-478a-bb56-6675006abeb3\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-94bmg" Apr 16 21:12:10.366112 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:10.365841 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4c10831d-a17c-478a-bb56-6675006abeb3-podres\") pod \"perf-node-gather-daemonset-94bmg\" (UID: \"4c10831d-a17c-478a-bb56-6675006abeb3\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-94bmg" Apr 16 21:12:10.366112 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:10.365860 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/4c10831d-a17c-478a-bb56-6675006abeb3-lib-modules\") pod \"perf-node-gather-daemonset-94bmg\" (UID: \"4c10831d-a17c-478a-bb56-6675006abeb3\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-94bmg" Apr 16 21:12:10.366112 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:10.365926 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4c10831d-a17c-478a-bb56-6675006abeb3-proc\") pod \"perf-node-gather-daemonset-94bmg\" (UID: \"4c10831d-a17c-478a-bb56-6675006abeb3\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-94bmg" Apr 16 21:12:10.366112 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:10.365976 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4c10831d-a17c-478a-bb56-6675006abeb3-lib-modules\") pod \"perf-node-gather-daemonset-94bmg\" (UID: \"4c10831d-a17c-478a-bb56-6675006abeb3\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-94bmg" Apr 16 21:12:10.373570 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:10.373542 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6sk9\" (UniqueName: \"kubernetes.io/projected/4c10831d-a17c-478a-bb56-6675006abeb3-kube-api-access-m6sk9\") pod \"perf-node-gather-daemonset-94bmg\" (UID: \"4c10831d-a17c-478a-bb56-6675006abeb3\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-94bmg" Apr 16 21:12:10.399894 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:10.399877 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-94bmg" Apr 16 21:12:10.519748 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:10.519718 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-g6jdf/perf-node-gather-daemonset-94bmg"] Apr 16 21:12:10.523802 ip-10-0-137-142 kubenswrapper[2572]: W0416 21:12:10.523775 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4c10831d_a17c_478a_bb56_6675006abeb3.slice/crio-533dc100a58ad4495657bb7364b610e07e8d4e7068b5e01a358b598538abf80e WatchSource:0}: Error finding container 533dc100a58ad4495657bb7364b610e07e8d4e7068b5e01a358b598538abf80e: Status 404 returned error can't find the container with id 533dc100a58ad4495657bb7364b610e07e8d4e7068b5e01a358b598538abf80e Apr 16 21:12:10.689875 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:10.689848 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7f5fs_91eb91d1-3690-4158-98a2-3eecf9955cda/dns/0.log" Apr 16 21:12:10.713751 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:10.713725 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7f5fs_91eb91d1-3690-4158-98a2-3eecf9955cda/kube-rbac-proxy/0.log" Apr 16 21:12:10.879246 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:10.879216 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-fjbmm_8ee16202-241d-45ac-9219-3363704a708e/dns-node-resolver/0.log" Apr 16 21:12:11.112285 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:11.112197 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-94bmg" event={"ID":"4c10831d-a17c-478a-bb56-6675006abeb3","Type":"ContainerStarted","Data":"ef0734a5978395ed06f29f41ac0bf40a5c046fa347c7ada85ca47c629c4f626c"} Apr 16 21:12:11.112285 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:11.112231 2572 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-94bmg" event={"ID":"4c10831d-a17c-478a-bb56-6675006abeb3","Type":"ContainerStarted","Data":"533dc100a58ad4495657bb7364b610e07e8d4e7068b5e01a358b598538abf80e"} Apr 16 21:12:11.112285 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:11.112261 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-94bmg" Apr 16 21:12:11.133672 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:11.133625 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-94bmg" podStartSLOduration=1.133611222 podStartE2EDuration="1.133611222s" podCreationTimestamp="2026-04-16 21:12:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 21:12:11.131790297 +0000 UTC m=+3613.972690688" watchObservedRunningTime="2026-04-16 21:12:11.133611222 +0000 UTC m=+3613.974511611" Apr 16 21:12:11.374967 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:11.374892 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-7f6d57ffbc-fxbrb_91700673-6298-4117-96a8-e8b9068e453b/registry/0.log" Apr 16 21:12:11.441170 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:11.441141 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-r8gdb_bfde86ea-03d1-4cf4-90b7-76b04a98def5/node-ca/0.log" Apr 16 21:12:12.515194 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:12.515166 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-2pcp4_efcf8c22-a25f-4709-a840-c85cec57a1b9/serve-healthcheck-canary/0.log" Apr 16 21:12:12.925437 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:12.925346 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-5rcs6_c31958fa-caf8-4c77-a312-e2d0f8238e6f/kube-rbac-proxy/0.log" Apr 16 21:12:12.949345 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:12.949317 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5rcs6_c31958fa-caf8-4c77-a312-e2d0f8238e6f/exporter/0.log" Apr 16 21:12:12.970471 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:12.970450 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5rcs6_c31958fa-caf8-4c77-a312-e2d0f8238e6f/extractor/0.log" Apr 16 21:12:15.311369 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:15.311339 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-pwdsz_684483e5-70e3-4c78-96b1-9f89739632c1/manager/0.log" Apr 16 21:12:15.330456 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:15.330430 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-b5bbm_8b942c2d-55d1-4457-bbe9-1da776c464d5/s3-init/0.log" Apr 16 21:12:15.354732 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:15.354707 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-custom-w6pl6_8da9ec91-abb1-4b12-9a66-cf545376deec/s3-tls-init-custom/0.log" Apr 16 21:12:15.377792 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:15.377768 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-serving-c9pnp_47d5476e-8e4f-4ae8-837c-38e7674327f6/s3-tls-init-serving/0.log" Apr 16 21:12:17.124801 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:17.124772 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-94bmg" Apr 16 21:12:19.365149 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:19.365081 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-k89vg_4f6d31f0-9d73-4517-8bee-56833892973e/migrator/0.log" Apr 16 21:12:19.388166 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:19.388135 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-k89vg_4f6d31f0-9d73-4517-8bee-56833892973e/graceful-termination/0.log" Apr 16 21:12:21.099153 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:21.099128 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zq5ft_10f2b7ab-1884-4d1c-8207-ad7844c2b18f/kube-multus-additional-cni-plugins/0.log" Apr 16 21:12:21.122106 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:21.122079 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zq5ft_10f2b7ab-1884-4d1c-8207-ad7844c2b18f/egress-router-binary-copy/0.log" Apr 16 21:12:21.143603 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:21.143575 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zq5ft_10f2b7ab-1884-4d1c-8207-ad7844c2b18f/cni-plugins/0.log" Apr 16 21:12:21.165988 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:21.165945 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zq5ft_10f2b7ab-1884-4d1c-8207-ad7844c2b18f/bond-cni-plugin/0.log" Apr 16 21:12:21.188605 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:21.188582 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zq5ft_10f2b7ab-1884-4d1c-8207-ad7844c2b18f/routeoverride-cni/0.log" Apr 16 21:12:21.211418 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:21.211399 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zq5ft_10f2b7ab-1884-4d1c-8207-ad7844c2b18f/whereabouts-cni-bincopy/0.log" Apr 16 21:12:21.232510 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:21.232484 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zq5ft_10f2b7ab-1884-4d1c-8207-ad7844c2b18f/whereabouts-cni/0.log" Apr 16 21:12:21.261128 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:21.261102 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hqvbt_46e6f6f6-a7c4-41b7-bbc7-d736051ed7a3/kube-multus/0.log" Apr 16 21:12:21.361141 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:21.361120 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-622d4_d285ba82-dded-4707-87cb-35b755280286/network-metrics-daemon/0.log" Apr 16 21:12:21.382627 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:21.382606 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-622d4_d285ba82-dded-4707-87cb-35b755280286/kube-rbac-proxy/0.log" Apr 16 21:12:22.427778 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:22.427752 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgjgp_6b1cd9c4-abb0-4659-b9b8-0b263412063c/ovn-controller/0.log" Apr 16 21:12:22.463171 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:22.463137 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgjgp_6b1cd9c4-abb0-4659-b9b8-0b263412063c/ovn-acl-logging/0.log" Apr 16 21:12:22.482080 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:22.482058 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgjgp_6b1cd9c4-abb0-4659-b9b8-0b263412063c/kube-rbac-proxy-node/0.log" Apr 16 21:12:22.504078 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:22.504053 2572 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgjgp_6b1cd9c4-abb0-4659-b9b8-0b263412063c/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 21:12:22.524353 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:22.524333 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgjgp_6b1cd9c4-abb0-4659-b9b8-0b263412063c/northd/0.log" Apr 16 21:12:22.562787 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:22.562764 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgjgp_6b1cd9c4-abb0-4659-b9b8-0b263412063c/nbdb/0.log" Apr 16 21:12:22.588012 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:22.587987 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgjgp_6b1cd9c4-abb0-4659-b9b8-0b263412063c/sbdb/0.log" Apr 16 21:12:22.691468 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:22.691443 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgjgp_6b1cd9c4-abb0-4659-b9b8-0b263412063c/ovnkube-controller/0.log" Apr 16 21:12:24.106886 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:24.106859 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-2xqx8_5ea99809-04f6-4ff1-adef-7bf9eb98c772/network-check-target-container/0.log" Apr 16 21:12:24.980454 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:24.980429 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-4g8jb_2ecca5d6-192a-4507-b71b-c8d9e9099230/iptables-alerter/0.log" Apr 16 21:12:25.679427 ip-10-0-137-142 kubenswrapper[2572]: I0416 21:12:25.679369 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-9r8n4_f6d06c44-2c08-4c0f-a3c1-ffc61c9b0a66/tuned/0.log"