Apr 16 16:45:27.466239 ip-10-0-143-10 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 16:45:27.466251 ip-10-0-143-10 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 16:45:27.466258 ip-10-0-143-10 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 16:45:27.466494 ip-10-0-143-10 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 16:45:37.607309 ip-10-0-143-10 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 16:45:37.607326 ip-10-0-143-10 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot ae1ebf41a8c946a88128bca592ea7bec --
Apr 16 16:48:37.613065 ip-10-0-143-10 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 16:48:38.026516 ip-10-0-143-10 kubenswrapper[2574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:48:38.026516 ip-10-0-143-10 kubenswrapper[2574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 16:48:38.026516 ip-10-0-143-10 kubenswrapper[2574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:48:38.026516 ip-10-0-143-10 kubenswrapper[2574]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 16:48:38.026516 ip-10-0-143-10 kubenswrapper[2574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:48:38.027829 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.027736    2574 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 16:48:38.030215 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030193    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:48:38.030215 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030209    2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:48:38.030215 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030214    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:48:38.030215 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030219    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:48:38.030215 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030223    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:48:38.030502 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030228    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:48:38.030502 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030232    2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:48:38.030502 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030236    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:48:38.030502 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030240    2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:48:38.030502 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030243    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:48:38.030502 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030247    2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:48:38.030502 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030250    2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:48:38.030502 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030254    2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:48:38.030502 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030258    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:48:38.030502 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030261    2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:48:38.030502 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030265    2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:48:38.030502 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030269    2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:48:38.030502 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030273    2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:48:38.030502 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030277    2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:48:38.030502 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030280    2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:48:38.030502 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030284    2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:48:38.030502 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030288    2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:48:38.030502 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030292    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:48:38.030502 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030301    2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:48:38.031288 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030305    2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:48:38.031288 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030309    2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:48:38.031288 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030313    2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:48:38.031288 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030317    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:48:38.031288 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030322    2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:48:38.031288 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030326    2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:48:38.031288 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030334    2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:48:38.031288 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030340    2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:48:38.031288 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030345    2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:48:38.031288 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030350    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:48:38.031288 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030357    2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:48:38.031288 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030364    2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:48:38.031288 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030369    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:48:38.031288 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030374    2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:48:38.031288 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030380    2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:48:38.031288 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030384    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:48:38.031288 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030389    2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:48:38.031288 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030393    2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:48:38.031288 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030397    2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:48:38.031994 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030401    2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:48:38.031994 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030406    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:48:38.031994 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030410    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:48:38.031994 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030415    2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:48:38.031994 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030419    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:48:38.031994 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030423    2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:48:38.031994 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030427    2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:48:38.031994 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030432    2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:48:38.031994 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030436    2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:48:38.031994 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030441    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:48:38.031994 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030445    2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:48:38.031994 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030449    2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:48:38.031994 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030453    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:48:38.031994 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030457    2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:48:38.031994 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030461    2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:48:38.031994 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030465    2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:48:38.031994 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030470    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:48:38.031994 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030474    2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:48:38.031994 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030480    2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:48:38.031994 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030485    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:48:38.032550 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030490    2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:48:38.032550 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030494    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:48:38.032550 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030498    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:48:38.032550 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030502    2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:48:38.032550 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030506    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:48:38.032550 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030510    2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:48:38.032550 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030515    2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:48:38.032550 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030519    2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:48:38.032550 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030524    2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:48:38.032550 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030528    2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:48:38.032550 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030532    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:48:38.032550 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030538    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:48:38.032550 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030543    2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:48:38.032550 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030549    2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:48:38.032550 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030553    2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:48:38.032550 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030558    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:48:38.032550 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030562    2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:48:38.032550 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030566    2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:48:38.032550 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030570    2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:48:38.032550 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030575    2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:48:38.033414 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030580    2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:48:38.033414 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030585    2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:48:38.033414 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.030589    2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:48:38.033414 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031226    2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:48:38.033414 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031236    2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:48:38.033414 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031241    2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:48:38.033414 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031247    2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:48:38.033414 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031253    2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:48:38.033414 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031257    2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:48:38.033414 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031262    2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:48:38.033414 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031267    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:48:38.033414 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031271    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:48:38.033414 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031276    2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:48:38.033414 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031280    2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:48:38.033414 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031284    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:48:38.033414 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031288    2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:48:38.033414 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031292    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:48:38.033414 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031296    2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:48:38.033414 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031301    2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:48:38.034115 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031305    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:48:38.034115 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031309    2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:48:38.034115 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031314    2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:48:38.034115 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031318    2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:48:38.034115 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031323    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:48:38.034115 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031328    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:48:38.034115 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031332    2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:48:38.034115 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031336    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:48:38.034115 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031340    2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:48:38.034115 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031345    2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:48:38.034115 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031349    2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:48:38.034115 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031353    2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:48:38.034115 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031358    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:48:38.034115 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031362    2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:48:38.034115 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031366    2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:48:38.034115 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031370    2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:48:38.034115 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031374    2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:48:38.034115 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031378    2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:48:38.034115 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031383    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:48:38.034115 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031387    2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:48:38.034749 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031391    2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:48:38.034749 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031395    2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:48:38.034749 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031400    2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:48:38.034749 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031404    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:48:38.034749 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031409    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:48:38.034749 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031413    2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:48:38.034749 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031417    2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:48:38.034749 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031421    2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:48:38.034749 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031425    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:48:38.034749 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031430    2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:48:38.034749 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031434    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:48:38.034749 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031441    2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:48:38.034749 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031447    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:48:38.034749 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031453    2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:48:38.034749 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031458    2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:48:38.034749 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031463    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:48:38.034749 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031467    2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:48:38.034749 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031473    2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:48:38.034749 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031478    2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:48:38.035316 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031482    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:48:38.035316 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031487    2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:48:38.035316 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031491    2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:48:38.035316 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031495    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:48:38.035316 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031500    2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:48:38.035316 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031505    2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:48:38.035316 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031509    2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:48:38.035316 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031513    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:48:38.035316 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031517    2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:48:38.035316 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031521    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:48:38.035316 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031525    2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:48:38.035316 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031530    2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:48:38.035316 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031534    2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:48:38.035316 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031538    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:48:38.035316 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031542    2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:48:38.035316 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031548    2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:48:38.035316 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031553    2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:48:38.035316 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031557    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:48:38.035316 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031561    2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:48:38.035316 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031566    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:48:38.035949 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031571    2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:48:38.035949 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031576    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:48:38.035949 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031580    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:48:38.035949 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031584    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:48:38.035949 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031588    2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:48:38.035949 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031593    2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:48:38.035949 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031597    2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:48:38.035949 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031601    2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:48:38.035949 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031605    2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:48:38.035949 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031610    2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:48:38.035949 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.031613    2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:48:38.035949 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032389    2574 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 16:48:38.035949 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032404    2574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 16:48:38.035949 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032415    2574 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 16:48:38.035949 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032421    2574 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 16:48:38.035949 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032429    2574 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 16:48:38.035949 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032434    2574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 16:48:38.035949 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032441    2574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 16:48:38.035949 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032449    2574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 16:48:38.035949 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032454    2574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 16:48:38.035949 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032459    2574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 16:48:38.036452 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032466    2574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 16:48:38.036452 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032471    2574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 16:48:38.036452 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032477    2574 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 16:48:38.036452 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032482    2574 flags.go:64] FLAG: --cgroup-root=""
Apr 16 16:48:38.036452 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032487    2574 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 16:48:38.036452 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032492    2574 flags.go:64] FLAG: --client-ca-file=""
Apr 16 16:48:38.036452 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032497    2574 flags.go:64] FLAG: --cloud-config=""
Apr 16 16:48:38.036452 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032501    2574 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 16:48:38.036452 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032506    2574 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 16:48:38.036452 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032513    2574 flags.go:64] FLAG: --cluster-domain=""
Apr 16 16:48:38.036452 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032518    2574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 16:48:38.036452 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032523    2574 flags.go:64] FLAG: --config-dir=""
Apr 16 16:48:38.036452 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032528    2574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 16:48:38.036452 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032534    2574 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 16:48:38.036452 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032540    2574 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 16:48:38.036452 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032545    2574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 16:48:38.036452 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032551    2574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 16:48:38.036452 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032556    2574 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 16:48:38.036452 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032561    2574 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 16:48:38.036452 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032566    2574 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 16:48:38.036452 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032571    2574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 16:48:38.036452 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032576    2574 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 16:48:38.036452 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032581    2574 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 16:48:38.036452 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032589    2574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 16:48:38.036452 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032595    2574 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 16:48:38.037174 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032599    2574 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 16:48:38.037174 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032604    2574 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 16:48:38.037174 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032609    2574 flags.go:64] FLAG: --enable-server="true"
Apr 16 16:48:38.037174 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032614    2574 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 16:48:38.037174 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032621    2574 flags.go:64] FLAG: --event-burst="100"
Apr 16 16:48:38.037174 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032626    2574 flags.go:64] FLAG: --event-qps="50"
Apr 16 16:48:38.037174 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032630    2574 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 16:48:38.037174 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032636    2574 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 16:48:38.037174 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032640    2574 flags.go:64] FLAG: --eviction-hard=""
Apr 16 16:48:38.037174 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032646    2574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 16:48:38.037174 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032668    2574 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 16:48:38.037174 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032674    2574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 16:48:38.037174 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032679    2574 flags.go:64] FLAG: --eviction-soft=""
Apr 16 16:48:38.037174 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032684    2574 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 16:48:38.037174 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032688    2574 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 16:48:38.037174 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032694    2574 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 16:48:38.037174 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032699    2574 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 16:48:38.037174 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032703    2574 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 16:48:38.037174 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032709    2574 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 16:48:38.037174 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032713    2574 flags.go:64] FLAG: --feature-gates=""
Apr 16 16:48:38.037174 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032720    2574 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 16:48:38.037174 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032725    2574 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 16:48:38.037174 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032730    2574 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 16:48:38.037174
ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032735 2574 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 16:48:38.037174 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032741 2574 flags.go:64] FLAG: --healthz-port="10248" Apr 16 16:48:38.037174 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032747 2574 flags.go:64] FLAG: --help="false" Apr 16 16:48:38.037836 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032752 2574 flags.go:64] FLAG: --hostname-override="ip-10-0-143-10.ec2.internal" Apr 16 16:48:38.037836 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032757 2574 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 16:48:38.037836 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032762 2574 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 16:48:38.037836 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032767 2574 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 16:48:38.037836 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032773 2574 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 16:48:38.037836 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032781 2574 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 16:48:38.037836 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032786 2574 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 16:48:38.037836 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032791 2574 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 16:48:38.037836 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032796 2574 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 16:48:38.037836 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032800 2574 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 16:48:38.037836 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032805 2574 flags.go:64] FLAG: 
--kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 16:48:38.037836 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032810 2574 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 16:48:38.037836 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032815 2574 flags.go:64] FLAG: --kube-reserved="" Apr 16 16:48:38.037836 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032820 2574 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 16:48:38.037836 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032824 2574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 16:48:38.037836 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032830 2574 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 16:48:38.037836 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032834 2574 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 16:48:38.037836 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032839 2574 flags.go:64] FLAG: --lock-file="" Apr 16 16:48:38.037836 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032844 2574 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 16:48:38.037836 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032848 2574 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 16:48:38.037836 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032854 2574 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 16:48:38.037836 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032863 2574 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 16:48:38.037836 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032869 2574 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 16:48:38.038386 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032874 2574 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 16:48:38.038386 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032879 2574 flags.go:64] FLAG: --logging-format="text" Apr 16 16:48:38.038386 ip-10-0-143-10 kubenswrapper[2574]: I0416 
16:48:38.032883 2574 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 16:48:38.038386 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032889 2574 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 16:48:38.038386 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032894 2574 flags.go:64] FLAG: --manifest-url="" Apr 16 16:48:38.038386 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032899 2574 flags.go:64] FLAG: --manifest-url-header="" Apr 16 16:48:38.038386 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032906 2574 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 16:48:38.038386 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032911 2574 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 16:48:38.038386 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032917 2574 flags.go:64] FLAG: --max-pods="110" Apr 16 16:48:38.038386 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032922 2574 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 16:48:38.038386 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032928 2574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 16:48:38.038386 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032933 2574 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 16:48:38.038386 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032938 2574 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 16:48:38.038386 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032943 2574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 16:48:38.038386 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032947 2574 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 16:48:38.038386 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032953 2574 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 16:48:38.038386 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032967 2574 flags.go:64] 
FLAG: --node-status-max-images="50" Apr 16 16:48:38.038386 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032973 2574 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 16:48:38.038386 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032978 2574 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 16:48:38.038386 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032983 2574 flags.go:64] FLAG: --pod-cidr="" Apr 16 16:48:38.038386 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032988 2574 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 16:48:38.038386 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.032997 2574 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 16:48:38.038386 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033002 2574 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 16:48:38.038386 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033007 2574 flags.go:64] FLAG: --pods-per-core="0" Apr 16 16:48:38.038989 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033012 2574 flags.go:64] FLAG: --port="10250" Apr 16 16:48:38.038989 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033017 2574 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 16:48:38.038989 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033029 2574 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-02bd2210c4a54e890" Apr 16 16:48:38.038989 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033039 2574 flags.go:64] FLAG: --qos-reserved="" Apr 16 16:48:38.038989 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033044 2574 flags.go:64] FLAG: --read-only-port="10255" Apr 16 16:48:38.038989 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033049 2574 flags.go:64] FLAG: --register-node="true" Apr 16 16:48:38.038989 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033053 2574 flags.go:64] FLAG: --register-schedulable="true" Apr 16 
16:48:38.038989 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033058 2574 flags.go:64] FLAG: --register-with-taints="" Apr 16 16:48:38.038989 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033064 2574 flags.go:64] FLAG: --registry-burst="10" Apr 16 16:48:38.038989 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033069 2574 flags.go:64] FLAG: --registry-qps="5" Apr 16 16:48:38.038989 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033074 2574 flags.go:64] FLAG: --reserved-cpus="" Apr 16 16:48:38.038989 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033078 2574 flags.go:64] FLAG: --reserved-memory="" Apr 16 16:48:38.038989 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033085 2574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 16:48:38.038989 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033090 2574 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 16:48:38.038989 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033095 2574 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 16:48:38.038989 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033099 2574 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 16:48:38.038989 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033104 2574 flags.go:64] FLAG: --runonce="false" Apr 16 16:48:38.038989 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033111 2574 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 16:48:38.038989 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033117 2574 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 16:48:38.038989 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033122 2574 flags.go:64] FLAG: --seccomp-default="false" Apr 16 16:48:38.038989 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033126 2574 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 16:48:38.038989 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033131 2574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" 
Apr 16 16:48:38.038989 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033136 2574 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 16:48:38.038989 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033141 2574 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 16:48:38.038989 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033147 2574 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 16:48:38.038989 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033152 2574 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 16:48:38.039613 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033156 2574 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 16:48:38.039613 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033161 2574 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 16:48:38.039613 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033166 2574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 16:48:38.039613 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033171 2574 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 16:48:38.039613 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033176 2574 flags.go:64] FLAG: --system-cgroups="" Apr 16 16:48:38.039613 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033180 2574 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 16:48:38.039613 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033189 2574 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 16:48:38.039613 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033194 2574 flags.go:64] FLAG: --tls-cert-file="" Apr 16 16:48:38.039613 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033199 2574 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 16:48:38.039613 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033208 2574 flags.go:64] FLAG: --tls-min-version="" Apr 16 16:48:38.039613 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033213 2574 
flags.go:64] FLAG: --tls-private-key-file="" Apr 16 16:48:38.039613 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033218 2574 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 16:48:38.039613 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033223 2574 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 16:48:38.039613 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033227 2574 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 16:48:38.039613 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033232 2574 flags.go:64] FLAG: --v="2" Apr 16 16:48:38.039613 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033239 2574 flags.go:64] FLAG: --version="false" Apr 16 16:48:38.039613 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033245 2574 flags.go:64] FLAG: --vmodule="" Apr 16 16:48:38.039613 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033252 2574 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 16:48:38.039613 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.033257 2574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 16:48:38.039613 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033414 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 16:48:38.039613 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033421 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 16:48:38.039613 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033426 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 16:48:38.039613 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033430 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 16:48:38.039613 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033437 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 16:48:38.040257 ip-10-0-143-10 
kubenswrapper[2574]: W0416 16:48:38.033441 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 16:48:38.040257 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033445 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 16:48:38.040257 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033450 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 16:48:38.040257 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033454 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 16:48:38.040257 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033459 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 16:48:38.040257 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033464 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 16:48:38.040257 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033468 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 16:48:38.040257 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033473 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 16:48:38.040257 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033478 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 16:48:38.040257 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033482 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 16:48:38.040257 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033486 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 16:48:38.040257 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033490 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 16:48:38.040257 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033495 2574 feature_gate.go:328] unrecognized 
feature gate: InsightsConfig Apr 16 16:48:38.040257 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033499 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 16:48:38.040257 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033504 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 16:48:38.040257 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033508 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 16:48:38.040257 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033513 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 16:48:38.040257 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033518 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 16:48:38.040257 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033522 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 16:48:38.040257 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033526 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 16:48:38.040787 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033530 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 16:48:38.040787 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033535 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 16:48:38.040787 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033539 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 16:48:38.040787 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033543 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 16:48:38.040787 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033548 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 16:48:38.040787 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033552 2574 feature_gate.go:328] 
unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 16:48:38.040787 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033557 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 16:48:38.040787 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033561 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 16:48:38.040787 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033565 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 16:48:38.040787 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033569 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 16:48:38.040787 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033574 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 16:48:38.040787 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033580 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 16:48:38.040787 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033584 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 16:48:38.040787 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033588 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 16:48:38.040787 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033593 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 16:48:38.040787 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033598 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 16:48:38.040787 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033602 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 16:48:38.040787 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033607 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 16:48:38.040787 ip-10-0-143-10 kubenswrapper[2574]: W0416 
16:48:38.033611 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 16:48:38.040787 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033615 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 16:48:38.041293 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033620 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 16:48:38.041293 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033625 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 16:48:38.041293 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033629 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 16:48:38.041293 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033633 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 16:48:38.041293 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033637 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 16 16:48:38.041293 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033641 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 16:48:38.041293 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033646 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 16:48:38.041293 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033650 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 16:48:38.041293 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033670 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 16:48:38.041293 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033674 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 16:48:38.041293 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033679 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 16:48:38.041293 ip-10-0-143-10 kubenswrapper[2574]: W0416 
16:48:38.033683 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 16:48:38.041293 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033687 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 16:48:38.041293 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033692 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 16:48:38.041293 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033699 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 16:48:38.041293 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033705 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 16:48:38.041293 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033710 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 16:48:38.041293 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033714 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 16:48:38.041293 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033718 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 16:48:38.041293 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033722 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 16:48:38.041829 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033727 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 16:48:38.041829 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033731 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 16:48:38.041829 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033735 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 16:48:38.041829 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033741 2574 
feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 16:48:38.041829 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033746 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 16:48:38.041829 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033750 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 16:48:38.041829 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033755 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 16:48:38.041829 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033760 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 16:48:38.041829 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033764 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 16:48:38.041829 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033768 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 16:48:38.041829 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033773 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 16:48:38.041829 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033777 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 16:48:38.041829 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033782 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 16:48:38.041829 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033787 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 16:48:38.041829 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033791 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 16:48:38.041829 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033798 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 16:48:38.041829 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033804 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 16:48:38.041829 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033809 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 16:48:38.041829 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033813 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 16:48:38.042300 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033818 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 16:48:38.042300 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.033823 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 16:48:38.042300 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.034445 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 16:48:38.042300 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.041136 2574 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 16:48:38.042300 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.041153 2574 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 16:48:38.042300 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041205 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 16:48:38.042300 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041210 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 
16:48:38.042300 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041214 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 16:48:38.042300 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041217 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 16:48:38.042300 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041220 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 16:48:38.042300 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041223 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 16:48:38.042300 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041227 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 16:48:38.042300 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041230 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 16:48:38.042300 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041233 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 16:48:38.042300 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041235 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 16:48:38.042300 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041238 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 16:48:38.042723 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041241 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 16:48:38.042723 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041245 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 16:48:38.042723 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041250 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:48:38.042723 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041254 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:48:38.042723 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041259 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:48:38.042723 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041262 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:48:38.042723 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041265 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:48:38.042723 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041268 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:48:38.042723 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041271 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:48:38.042723 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041274 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:48:38.042723 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041276 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:48:38.042723 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041279 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:48:38.042723 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041281 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:48:38.042723 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041284 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:48:38.042723 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041287 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:48:38.042723 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041289 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:48:38.042723 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041292 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:48:38.042723 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041295 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:48:38.042723 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041298 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:48:38.043187 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041300 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:48:38.043187 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041306 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:48:38.043187 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041309 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:48:38.043187 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041312 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:48:38.043187 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041315 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:48:38.043187 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041317 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:48:38.043187 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041320 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:48:38.043187 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041323 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:48:38.043187 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041325 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:48:38.043187 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041328 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:48:38.043187 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041331 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:48:38.043187 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041333 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:48:38.043187 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041336 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:48:38.043187 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041339 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:48:38.043187 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041341 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:48:38.043187 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041344 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:48:38.043187 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041346 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:48:38.043187 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041349 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:48:38.043187 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041352 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:48:38.043187 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041354 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:48:38.043682 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041357 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:48:38.043682 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041360 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:48:38.043682 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041362 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:48:38.043682 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041365 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:48:38.043682 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041367 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:48:38.043682 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041370 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:48:38.043682 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041372 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:48:38.043682 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041375 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:48:38.043682 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041377 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:48:38.043682 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041380 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:48:38.043682 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041383 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:48:38.043682 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041385 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:48:38.043682 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041388 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:48:38.043682 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041392 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:48:38.043682 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041395 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:48:38.043682 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041398 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:48:38.043682 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041401 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:48:38.043682 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041403 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:48:38.043682 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041406 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:48:38.043682 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041408 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:48:38.044209 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041411 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:48:38.044209 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041414 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:48:38.044209 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041416 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:48:38.044209 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041419 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:48:38.044209 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041421 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:48:38.044209 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041424 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:48:38.044209 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041426 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:48:38.044209 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041429 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:48:38.044209 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041431 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:48:38.044209 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041434 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:48:38.044209 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041437 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:48:38.044209 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041440 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:48:38.044209 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041442 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:48:38.044209 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041445 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:48:38.044209 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041447 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:48:38.044209 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041450 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:48:38.044593 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.041455 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 16:48:38.044593 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041572 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:48:38.044593 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041578 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:48:38.044593 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041581 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:48:38.044593 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041584 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:48:38.044593 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041587 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:48:38.044593 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041590 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:48:38.044593 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041594 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:48:38.044593 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041597 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:48:38.044593 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041600 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:48:38.044593 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041603 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:48:38.044593 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041606 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:48:38.044593 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041609 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:48:38.044593 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041612 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:48:38.044593 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041615 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:48:38.044593 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041618 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:48:38.045039 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041620 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:48:38.045039 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041623 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:48:38.045039 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041626 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:48:38.045039 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041629 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:48:38.045039 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041631 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:48:38.045039 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041634 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:48:38.045039 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041636 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:48:38.045039 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041639 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:48:38.045039 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041643 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:48:38.045039 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041646 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:48:38.045039 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041649 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:48:38.045039 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041666 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:48:38.045039 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041669 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:48:38.045039 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041672 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:48:38.045039 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041675 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:48:38.045039 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041678 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:48:38.045039 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041681 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:48:38.045039 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041684 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:48:38.045039 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041687 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:48:38.045505 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041689 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:48:38.045505 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041692 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:48:38.045505 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041695 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:48:38.045505 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041698 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:48:38.045505 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041700 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:48:38.045505 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041703 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:48:38.045505 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041705 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:48:38.045505 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041709 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:48:38.045505 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041712 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:48:38.045505 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041715 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:48:38.045505 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041717 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:48:38.045505 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041720 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:48:38.045505 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041723 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:48:38.045505 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041725 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:48:38.045505 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041728 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:48:38.045505 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041730 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:48:38.045505 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041733 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:48:38.045505 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041735 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:48:38.045505 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041738 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:48:38.045505 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041740 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:48:38.046111 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041743 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:48:38.046111 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041745 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:48:38.046111 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041748 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:48:38.046111 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041751 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:48:38.046111 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041753 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:48:38.046111 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041756 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:48:38.046111 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041758 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:48:38.046111 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041761 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:48:38.046111 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041763 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:48:38.046111 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041766 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:48:38.046111 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041768 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:48:38.046111 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041771 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:48:38.046111 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041773 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:48:38.046111 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041776 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:48:38.046111 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041779 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:48:38.046111 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041781 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:48:38.046111 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041784 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:48:38.046111 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041786 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:48:38.046111 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041789 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:48:38.046582 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041793 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:48:38.046582 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041796 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:48:38.046582 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041799 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:48:38.046582 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041802 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:48:38.046582 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041805 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:48:38.046582 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041808 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:48:38.046582 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041810 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:48:38.046582 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041813 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:48:38.046582 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041816 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:48:38.046582 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041819 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:48:38.046582 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041821 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:48:38.046582 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041823 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:48:38.046582 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:38.041826 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:48:38.046582 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.041830 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 16:48:38.046582 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.041962 2574 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 16:48:38.046977 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.044013 2574 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 16:48:38.046977 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.044916 2574 server.go:1019] "Starting client certificate rotation"
Apr 16 16:48:38.046977 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.045013 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 16:48:38.046977 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.045061 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 16:48:38.069554 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.069530 2574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 16:48:38.072599 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.072581 2574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 16:48:38.085723 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.085695 2574 log.go:25] "Validated CRI v1 runtime API"
Apr 16 16:48:38.090670 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.090637 2574 log.go:25] "Validated CRI v1 image API"
Apr 16 16:48:38.091759 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.091740 2574 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 16:48:38.094586 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.094565 2574 fs.go:135] Filesystem UUIDs: map[238df94d-89d4-4ec2-9ccb-0c47b73fbcfe:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 c1b4e66c-2a28-465b-b6fa-c2b4ff69533d:/dev/nvme0n1p4]
Apr 16 16:48:38.094670 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.094590 2574 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 16:48:38.100339 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.100231 2574 manager.go:217] Machine: {Timestamp:2026-04-16 16:48:38.098341718 +0000 UTC m=+0.381367286 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100252 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2c925ab7f89abd2f998d151e3dedd6 SystemUUID:ec2c925a-b7f8-9abd-2f99-8d151e3dedd6 BootID:ae1ebf41-a8c9-46a8-8128-bca592ea7bec Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:52:04:15:03:d9 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:52:04:15:03:d9 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:5e:50:40:03:e0:11 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 16:48:38.100339 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.100334 2574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 16:48:38.100440 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.100419 2574 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 16:48:38.101222 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.101202 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 16:48:38.102859 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.102830 2574 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 16:48:38.103010 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.102862 2574 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-143-10.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 16:48:38.103060 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.103022 2574 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 16:48:38.103060 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.103031 2574 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 16:48:38.103060 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.103047 2574 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 16:48:38.103692 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.103682 2574 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 16:48:38.105380 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.105370 2574 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 16:48:38.105487 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.105478 2574 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 16:48:38.107626 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.107616 2574 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 16:48:38.107678 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.107634 2574 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 16:48:38.107678 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.107647 2574 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 16:48:38.107678 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.107668 2574 kubelet.go:397] "Adding apiserver pod source"
Apr 16 16:48:38.107678 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.107678 2574 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 16:48:38.108781 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.108768 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 16:48:38.108842 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.108786 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 16:48:38.111597 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.111581 2574 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 16:48:38.113284 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.113267 2574 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 16:48:38.114341 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.114327 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 16:48:38.114410 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.114346 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 16:48:38.114410 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.114356 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 16:48:38.114410 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.114364 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 16:48:38.114410 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.114373 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 16:48:38.114410
ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.114381 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 16:48:38.114410 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.114391 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 16:48:38.114410 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.114399 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 16:48:38.114410 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.114409 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 16:48:38.114681 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.114424 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 16:48:38.114681 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.114442 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 16:48:38.114681 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.114456 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 16:48:38.115299 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.115287 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 16:48:38.115356 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.115304 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 16:48:38.117263 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.117244 2574 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-143-10.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 16:48:38.117357 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:38.117303 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-143-10.ec2.internal\" is forbidden: User \"system:anonymous\" cannot 
list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 16:48:38.117357 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:38.117331 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 16:48:38.118831 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.118818 2574 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 16:48:38.118901 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.118854 2574 server.go:1295] "Started kubelet" Apr 16 16:48:38.118975 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.118946 2574 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 16:48:38.119027 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.118948 2574 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 16:48:38.119027 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.119014 2574 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 16:48:38.119696 ip-10-0-143-10 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 16:48:38.120227 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.120173 2574 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 16:48:38.122628 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.122194 2574 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 16:48:38.124010 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.123985 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9dl8b"
Apr 16 16:48:38.125854 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.125836 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 16:48:38.126411 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.126397 2574 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 16:48:38.127505 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.127492 2574 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 16:48:38.127505 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.127504 2574 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 16:48:38.127644 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.127623 2574 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 16:48:38.127644 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.127631 2574 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 16:48:38.127644 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.127621 2574 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 16:48:38.127823 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:38.123931 2574 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-10.ec2.internal.18a6e44c005aae5c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-10.ec2.internal,UID:ip-10-0-143-10.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-143-10.ec2.internal,},FirstTimestamp:2026-04-16 16:48:38.11882966 +0000 UTC m=+0.401855231,LastTimestamp:2026-04-16 16:48:38.11882966 +0000 UTC m=+0.401855231,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-10.ec2.internal,}"
Apr 16 16:48:38.128065 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.128031 2574 factory.go:55] Registering systemd factory
Apr 16 16:48:38.128065 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.128046 2574 factory.go:223] Registration of the systemd container factory successfully
Apr 16 16:48:38.128408 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:38.128385 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-10.ec2.internal\" not found"
Apr 16 16:48:38.128486 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.128458 2574 factory.go:153] Registering CRI-O factory
Apr 16 16:48:38.128486 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.128473 2574 factory.go:223] Registration of the crio container factory successfully
Apr 16 16:48:38.128595 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.128566 2574 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 16:48:38.128595 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.128586 2574 factory.go:103] Registering Raw factory
Apr 16 16:48:38.128703 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.128603 2574 manager.go:1196] Started watching for new ooms in manager
Apr 16 16:48:38.129043 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.129016 2574 manager.go:319] Starting recovery of all containers
Apr 16 16:48:38.129471 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:38.129448 2574 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 16:48:38.132630 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.132608 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9dl8b"
Apr 16 16:48:38.138891 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.138873 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:48:38.139970 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.139955 2574 manager.go:324] Recovery completed
Apr 16 16:48:38.142143 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:38.142129 2574 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-143-10.ec2.internal\" not found" node="ip-10-0-143-10.ec2.internal"
Apr 16 16:48:38.143965 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.143953 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:48:38.146273 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.146259 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-10.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:48:38.146366 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.146291 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-10.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:48:38.146366 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.146306 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-10.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:48:38.146817 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.146804 2574 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 16:48:38.146817 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.146815 2574 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 16:48:38.146917 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.146833 2574 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 16:48:38.150033 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.150014 2574 policy_none.go:49] "None policy: Start"
Apr 16 16:48:38.150033 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.150031 2574 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 16:48:38.150123 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.150041 2574 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 16:48:38.214371 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.191566 2574 manager.go:341] "Starting Device Plugin manager"
Apr 16 16:48:38.214371 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:38.191593 2574 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 16:48:38.214371 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.191602 2574 server.go:85] "Starting device plugin registration server"
Apr 16 16:48:38.214371 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.191817 2574 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 16:48:38.214371 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.191827 2574 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 16:48:38.214371 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.191904 2574 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 16:48:38.214371 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.191984 2574 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 16:48:38.214371 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.191992 2574 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 16:48:38.214371 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:38.192725 2574 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 16:48:38.214371 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:38.192757 2574 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-143-10.ec2.internal\" not found"
Apr 16 16:48:38.268405 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.268374 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 16:48:38.269565 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.269551 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 16:48:38.269634 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.269577 2574 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 16:48:38.269634 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.269595 2574 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 16:48:38.269634 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.269602 2574 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 16:48:38.269769 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:38.269634 2574 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 16:48:38.273927 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.273909 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:48:38.292380 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.292340 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:48:38.293304 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.293289 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-10.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:48:38.293380 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.293318 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-10.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:48:38.293380 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.293327 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-10.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:48:38.293380 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.293349 2574 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-143-10.ec2.internal"
Apr 16 16:48:38.301664 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.301638 2574 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-143-10.ec2.internal"
Apr 16 16:48:38.301712 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:38.301676 2574 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-143-10.ec2.internal\": node \"ip-10-0-143-10.ec2.internal\" not found"
Apr 16 16:48:38.321198 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:38.321176 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-10.ec2.internal\" not found"
Apr 16 16:48:38.370153 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.370115 2574 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-10.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-143-10.ec2.internal"]
Apr 16 16:48:38.370258 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.370205 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:48:38.371123 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.371108 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-10.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:48:38.371214 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.371140 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-10.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:48:38.371214 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.371154 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-10.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:48:38.374398 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.374383 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:48:38.374551 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.374536 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-10.ec2.internal"
Apr 16 16:48:38.374599 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.374566 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:48:38.375131 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.375109 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-10.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:48:38.375210 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.375136 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-10.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:48:38.375210 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.375145 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-10.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:48:38.375210 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.375111 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-10.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:48:38.375309 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.375211 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-10.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:48:38.375309 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.375224 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-10.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:48:38.378173 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.378159 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-10.ec2.internal"
Apr 16 16:48:38.378218 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.378184 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:48:38.378852 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.378837 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-10.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:48:38.378925 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.378861 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-10.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:48:38.378925 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.378874 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-10.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:48:38.400165 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:38.400144 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-10.ec2.internal\" not found" node="ip-10-0-143-10.ec2.internal"
Apr 16 16:48:38.404223 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:38.404208 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-10.ec2.internal\" not found" node="ip-10-0-143-10.ec2.internal"
Apr 16 16:48:38.421366 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:38.421344 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-10.ec2.internal\" not found"
Apr 16 16:48:38.522050 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:38.522014 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-10.ec2.internal\" not found"
Apr 16 16:48:38.529295 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.529271 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f1bc04e409baab07763d6ca236ceaf1a-config\") pod \"kube-apiserver-proxy-ip-10-0-143-10.ec2.internal\" (UID: \"f1bc04e409baab07763d6ca236ceaf1a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-10.ec2.internal"
Apr 16 16:48:38.529361 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.529302 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1e831993780eeb4b1e48fbb6fdbf0f02-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-10.ec2.internal\" (UID: \"1e831993780eeb4b1e48fbb6fdbf0f02\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-10.ec2.internal"
Apr 16 16:48:38.529361 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.529322 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1e831993780eeb4b1e48fbb6fdbf0f02-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-10.ec2.internal\" (UID: \"1e831993780eeb4b1e48fbb6fdbf0f02\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-10.ec2.internal"
Apr 16 16:48:38.622729 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:38.622677 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-10.ec2.internal\" not found"
Apr 16 16:48:38.629988 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.629964 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1e831993780eeb4b1e48fbb6fdbf0f02-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-10.ec2.internal\" (UID: \"1e831993780eeb4b1e48fbb6fdbf0f02\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-10.ec2.internal"
Apr 16 16:48:38.630036 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.629994 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1e831993780eeb4b1e48fbb6fdbf0f02-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-10.ec2.internal\" (UID: \"1e831993780eeb4b1e48fbb6fdbf0f02\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-10.ec2.internal"
Apr 16 16:48:38.630036 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.630010 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f1bc04e409baab07763d6ca236ceaf1a-config\") pod \"kube-apiserver-proxy-ip-10-0-143-10.ec2.internal\" (UID: \"f1bc04e409baab07763d6ca236ceaf1a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-10.ec2.internal"
Apr 16 16:48:38.630100 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.630088 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1e831993780eeb4b1e48fbb6fdbf0f02-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-10.ec2.internal\" (UID: \"1e831993780eeb4b1e48fbb6fdbf0f02\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-10.ec2.internal"
Apr 16 16:48:38.630141 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.630125 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1e831993780eeb4b1e48fbb6fdbf0f02-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-10.ec2.internal\" (UID: \"1e831993780eeb4b1e48fbb6fdbf0f02\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-10.ec2.internal"
Apr 16 16:48:38.630177 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.630166 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f1bc04e409baab07763d6ca236ceaf1a-config\") pod \"kube-apiserver-proxy-ip-10-0-143-10.ec2.internal\" (UID: \"f1bc04e409baab07763d6ca236ceaf1a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-10.ec2.internal"
Apr 16 16:48:38.704154 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.704115 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-10.ec2.internal"
Apr 16 16:48:38.706680 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:38.706648 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-10.ec2.internal"
Apr 16 16:48:38.723245 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:38.723219 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-10.ec2.internal\" not found"
Apr 16 16:48:38.823769 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:38.823733 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-10.ec2.internal\" not found"
Apr 16 16:48:38.924271 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:38.924201 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-10.ec2.internal\" not found"
Apr 16 16:48:39.024756 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:39.024724 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-10.ec2.internal\" not found"
Apr 16 16:48:39.045165 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:39.045141 2574 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 16:48:39.045787 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:39.045285 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 16:48:39.045787 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:39.045305 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 16:48:39.125218 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:39.125199 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-10.ec2.internal\" not found"
Apr 16 16:48:39.126141 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:39.126125 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 16:48:39.134833 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:39.134802 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 16:43:38 +0000 UTC" deadline="2027-11-02 07:36:45.880781436 +0000 UTC"
Apr 16 16:48:39.134833 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:39.134829 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13550h48m6.745955526s"
Apr 16 16:48:39.137348 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:39.137329 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 16:48:39.155030 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:39.155005 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-qzvgc"
Apr 16 16:48:39.162906 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:39.162885 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-qzvgc"
Apr 16 16:48:39.225860 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:39.225810 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-10.ec2.internal\" not found"
Apr 16 16:48:39.245680 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:39.245634 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e831993780eeb4b1e48fbb6fdbf0f02.slice/crio-edf5e6fa971907d2934bca7adafcd9160ccfd3f4ad2d157132a6908ff01335df WatchSource:0}: Error finding container edf5e6fa971907d2934bca7adafcd9160ccfd3f4ad2d157132a6908ff01335df: Status 404 returned error can't find the container with id edf5e6fa971907d2934bca7adafcd9160ccfd3f4ad2d157132a6908ff01335df
Apr 16 16:48:39.245876 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:39.245858 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1bc04e409baab07763d6ca236ceaf1a.slice/crio-ab799289adfbe6ee466ba4f06773553ac2fafd2a227a39ab185c7e3c6a6e4c02 WatchSource:0}: Error finding container ab799289adfbe6ee466ba4f06773553ac2fafd2a227a39ab185c7e3c6a6e4c02: Status 404 returned error can't find the container with id ab799289adfbe6ee466ba4f06773553ac2fafd2a227a39ab185c7e3c6a6e4c02
Apr 16 16:48:39.249855 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:39.249841 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 16:48:39.272084 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:39.272047 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-10.ec2.internal" event={"ID":"f1bc04e409baab07763d6ca236ceaf1a","Type":"ContainerStarted","Data":"ab799289adfbe6ee466ba4f06773553ac2fafd2a227a39ab185c7e3c6a6e4c02"}
Apr 16 16:48:39.272973 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:39.272952 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-10.ec2.internal" event={"ID":"1e831993780eeb4b1e48fbb6fdbf0f02","Type":"ContainerStarted","Data":"edf5e6fa971907d2934bca7adafcd9160ccfd3f4ad2d157132a6908ff01335df"}
Apr 16 16:48:39.326131 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:39.326103 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-10.ec2.internal\" not found"
Apr 16 16:48:39.426609 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:39.426571 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-10.ec2.internal\" not found"
Apr 16 16:48:39.476033 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:39.475956 2574 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:48:39.477025 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:39.477000 2574 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:48:39.527457 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:39.527428 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-10.ec2.internal"
Apr 16 16:48:39.542569 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:39.542545 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 16:48:39.543414 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:39.543399 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-10.ec2.internal"
Apr 16 16:48:39.552449 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:39.552431 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not
contain dots]" Apr 16 16:48:40.041278 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.041251 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:48:40.108750 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.108717 2574 apiserver.go:52] "Watching apiserver" Apr 16 16:48:40.117892 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.117865 2574 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 16:48:40.118795 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.118768 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-89n6j","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sgrsb","openshift-cluster-node-tuning-operator/tuned-wqv8s","openshift-multus/multus-additional-cni-plugins-72gts","openshift-multus/network-metrics-daemon-62nv6","openshift-network-diagnostics/network-check-target-cqm4l","openshift-ovn-kubernetes/ovnkube-node-2fwqq","kube-system/kube-apiserver-proxy-ip-10-0-143-10.ec2.internal","openshift-dns/node-resolver-bmj4j","openshift-image-registry/node-ca-jndw7","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-10.ec2.internal","openshift-multus/multus-7dxf7","openshift-network-operator/iptables-alerter-zjk6l"] Apr 16 16:48:40.121423 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.121404 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" Apr 16 16:48:40.126107 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.125998 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 16:48:40.126107 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.126010 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 16:48:40.126107 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.126037 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 16:48:40.126107 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.126045 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-2sb4n\"" Apr 16 16:48:40.126107 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.126103 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 16:48:40.126394 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.126248 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 16:48:40.126394 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.126286 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sgrsb" Apr 16 16:48:40.126394 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.126364 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-wqv8s" Apr 16 16:48:40.126592 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.126570 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 16:48:40.128344 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.128326 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-72gts" Apr 16 16:48:40.129047 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.129028 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:48:40.129152 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.129033 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 16:48:40.129302 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.129288 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 16:48:40.129370 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.129312 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 16:48:40.130398 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.130350 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-56j9w\"" Apr 16 16:48:40.130466 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.130396 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 16:48:40.130563 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.130545 2574 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-6d8vz\"" Apr 16 16:48:40.130921 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.130767 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62nv6" Apr 16 16:48:40.130921 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:40.130872 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62nv6" podUID="7c7749f8-7b64-4062-9bca-90c0826a9692" Apr 16 16:48:40.131115 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.131088 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 16:48:40.131536 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.131428 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 16:48:40.131536 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.131428 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 16:48:40.131697 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.131491 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 16:48:40.131697 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.131673 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 16:48:40.131825 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.131807 2574 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-nxv7k\"" Apr 16 16:48:40.133091 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.133076 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cqm4l" Apr 16 16:48:40.133175 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:40.133140 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cqm4l" podUID="0b5fcafd-70a7-4e76-ba7a-022cfee37811" Apr 16 16:48:40.135683 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.135667 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-89n6j" Apr 16 16:48:40.136713 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.136693 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-host-run-netns\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" Apr 16 16:48:40.136815 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.136745 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-host-cni-bin\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" Apr 16 16:48:40.136815 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.136776 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/292362c1-e737-4b91-ac69-fab25c8f024f-etc-selinux\") pod \"aws-ebs-csi-driver-node-sgrsb\" (UID: \"292362c1-e737-4b91-ac69-fab25c8f024f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sgrsb" Apr 16 16:48:40.136921 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.136826 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6ca8cf77-0136-4fcd-be78-19762c126a37-etc-systemd\") pod \"tuned-wqv8s\" (UID: \"6ca8cf77-0136-4fcd-be78-19762c126a37\") " pod="openshift-cluster-node-tuning-operator/tuned-wqv8s" Apr 16 16:48:40.136921 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.136853 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6ca8cf77-0136-4fcd-be78-19762c126a37-tmp\") pod \"tuned-wqv8s\" (UID: \"6ca8cf77-0136-4fcd-be78-19762c126a37\") " pod="openshift-cluster-node-tuning-operator/tuned-wqv8s" Apr 16 16:48:40.136921 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.136879 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fabe57c2-0e2b-42e1-9322-9ea7c5a3f719-system-cni-dir\") pod \"multus-additional-cni-plugins-72gts\" (UID: \"fabe57c2-0e2b-42e1-9322-9ea7c5a3f719\") " pod="openshift-multus/multus-additional-cni-plugins-72gts" Apr 16 16:48:40.136921 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.136904 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fabe57c2-0e2b-42e1-9322-9ea7c5a3f719-os-release\") pod \"multus-additional-cni-plugins-72gts\" (UID: \"fabe57c2-0e2b-42e1-9322-9ea7c5a3f719\") " 
pod="openshift-multus/multus-additional-cni-plugins-72gts" Apr 16 16:48:40.137126 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.136950 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-host-kubelet\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" Apr 16 16:48:40.137126 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.136975 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6ca8cf77-0136-4fcd-be78-19762c126a37-etc-sysconfig\") pod \"tuned-wqv8s\" (UID: \"6ca8cf77-0136-4fcd-be78-19762c126a37\") " pod="openshift-cluster-node-tuning-operator/tuned-wqv8s" Apr 16 16:48:40.137126 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.137002 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ca8cf77-0136-4fcd-be78-19762c126a37-etc-kubernetes\") pod \"tuned-wqv8s\" (UID: \"6ca8cf77-0136-4fcd-be78-19762c126a37\") " pod="openshift-cluster-node-tuning-operator/tuned-wqv8s" Apr 16 16:48:40.137126 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.137035 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6ca8cf77-0136-4fcd-be78-19762c126a37-etc-sysctl-d\") pod \"tuned-wqv8s\" (UID: \"6ca8cf77-0136-4fcd-be78-19762c126a37\") " pod="openshift-cluster-node-tuning-operator/tuned-wqv8s" Apr 16 16:48:40.137126 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.137058 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnzhs\" (UniqueName: 
\"kubernetes.io/projected/6ca8cf77-0136-4fcd-be78-19762c126a37-kube-api-access-wnzhs\") pod \"tuned-wqv8s\" (UID: \"6ca8cf77-0136-4fcd-be78-19762c126a37\") " pod="openshift-cluster-node-tuning-operator/tuned-wqv8s" Apr 16 16:48:40.137126 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.137085 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" Apr 16 16:48:40.137126 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.137108 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/292362c1-e737-4b91-ac69-fab25c8f024f-device-dir\") pod \"aws-ebs-csi-driver-node-sgrsb\" (UID: \"292362c1-e737-4b91-ac69-fab25c8f024f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sgrsb" Apr 16 16:48:40.137422 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.137130 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6ca8cf77-0136-4fcd-be78-19762c126a37-sys\") pod \"tuned-wqv8s\" (UID: \"6ca8cf77-0136-4fcd-be78-19762c126a37\") " pod="openshift-cluster-node-tuning-operator/tuned-wqv8s" Apr 16 16:48:40.137422 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.137152 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6ca8cf77-0136-4fcd-be78-19762c126a37-lib-modules\") pod \"tuned-wqv8s\" (UID: \"6ca8cf77-0136-4fcd-be78-19762c126a37\") " pod="openshift-cluster-node-tuning-operator/tuned-wqv8s" Apr 16 16:48:40.137422 ip-10-0-143-10 
kubenswrapper[2574]: I0416 16:48:40.137174 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6ca8cf77-0136-4fcd-be78-19762c126a37-var-lib-kubelet\") pod \"tuned-wqv8s\" (UID: \"6ca8cf77-0136-4fcd-be78-19762c126a37\") " pod="openshift-cluster-node-tuning-operator/tuned-wqv8s" Apr 16 16:48:40.137422 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.137199 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/fabe57c2-0e2b-42e1-9322-9ea7c5a3f719-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-72gts\" (UID: \"fabe57c2-0e2b-42e1-9322-9ea7c5a3f719\") " pod="openshift-multus/multus-additional-cni-plugins-72gts" Apr 16 16:48:40.137422 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.137223 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx5hj\" (UniqueName: \"kubernetes.io/projected/fabe57c2-0e2b-42e1-9322-9ea7c5a3f719-kube-api-access-fx5hj\") pod \"multus-additional-cni-plugins-72gts\" (UID: \"fabe57c2-0e2b-42e1-9322-9ea7c5a3f719\") " pod="openshift-multus/multus-additional-cni-plugins-72gts" Apr 16 16:48:40.137422 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.137245 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c7749f8-7b64-4062-9bca-90c0826a9692-metrics-certs\") pod \"network-metrics-daemon-62nv6\" (UID: \"7c7749f8-7b64-4062-9bca-90c0826a9692\") " pod="openshift-multus/network-metrics-daemon-62nv6" Apr 16 16:48:40.137422 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.137268 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/a3db0e84-d73d-4e5b-a7c2-94290b442748-ovnkube-script-lib\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" Apr 16 16:48:40.137422 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.137292 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-run-systemd\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" Apr 16 16:48:40.137422 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.137314 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-var-lib-openvswitch\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" Apr 16 16:48:40.137422 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.137337 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6ca8cf77-0136-4fcd-be78-19762c126a37-etc-tuned\") pod \"tuned-wqv8s\" (UID: \"6ca8cf77-0136-4fcd-be78-19762c126a37\") " pod="openshift-cluster-node-tuning-operator/tuned-wqv8s" Apr 16 16:48:40.137422 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.137369 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fabe57c2-0e2b-42e1-9322-9ea7c5a3f719-cni-binary-copy\") pod \"multus-additional-cni-plugins-72gts\" (UID: \"fabe57c2-0e2b-42e1-9322-9ea7c5a3f719\") " pod="openshift-multus/multus-additional-cni-plugins-72gts" Apr 16 16:48:40.137422 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.137398 
2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fabe57c2-0e2b-42e1-9322-9ea7c5a3f719-tuning-conf-dir\") pod \"multus-additional-cni-plugins-72gts\" (UID: \"fabe57c2-0e2b-42e1-9322-9ea7c5a3f719\") " pod="openshift-multus/multus-additional-cni-plugins-72gts" Apr 16 16:48:40.137422 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.137423 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fabe57c2-0e2b-42e1-9322-9ea7c5a3f719-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-72gts\" (UID: \"fabe57c2-0e2b-42e1-9322-9ea7c5a3f719\") " pod="openshift-multus/multus-additional-cni-plugins-72gts" Apr 16 16:48:40.137962 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.137465 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt76h\" (UniqueName: \"kubernetes.io/projected/7c7749f8-7b64-4062-9bca-90c0826a9692-kube-api-access-vt76h\") pod \"network-metrics-daemon-62nv6\" (UID: \"7c7749f8-7b64-4062-9bca-90c0826a9692\") " pod="openshift-multus/network-metrics-daemon-62nv6" Apr 16 16:48:40.137962 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.137555 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6ca8cf77-0136-4fcd-be78-19762c126a37-etc-modprobe-d\") pod \"tuned-wqv8s\" (UID: \"6ca8cf77-0136-4fcd-be78-19762c126a37\") " pod="openshift-cluster-node-tuning-operator/tuned-wqv8s" Apr 16 16:48:40.137962 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.137584 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6ca8cf77-0136-4fcd-be78-19762c126a37-etc-sysctl-conf\") pod 
\"tuned-wqv8s\" (UID: \"6ca8cf77-0136-4fcd-be78-19762c126a37\") " pod="openshift-cluster-node-tuning-operator/tuned-wqv8s" Apr 16 16:48:40.137962 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.137610 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ca8cf77-0136-4fcd-be78-19762c126a37-host\") pod \"tuned-wqv8s\" (UID: \"6ca8cf77-0136-4fcd-be78-19762c126a37\") " pod="openshift-cluster-node-tuning-operator/tuned-wqv8s" Apr 16 16:48:40.137962 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.137638 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-run-openvswitch\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" Apr 16 16:48:40.137962 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.137682 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a3db0e84-d73d-4e5b-a7c2-94290b442748-ovnkube-config\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" Apr 16 16:48:40.137962 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.137711 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x44rc\" (UniqueName: \"kubernetes.io/projected/a3db0e84-d73d-4e5b-a7c2-94290b442748-kube-api-access-x44rc\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" Apr 16 16:48:40.137962 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.137736 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" 
(UniqueName: \"kubernetes.io/host-path/292362c1-e737-4b91-ac69-fab25c8f024f-socket-dir\") pod \"aws-ebs-csi-driver-node-sgrsb\" (UID: \"292362c1-e737-4b91-ac69-fab25c8f024f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sgrsb" Apr 16 16:48:40.137962 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.137760 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6ca8cf77-0136-4fcd-be78-19762c126a37-run\") pod \"tuned-wqv8s\" (UID: \"6ca8cf77-0136-4fcd-be78-19762c126a37\") " pod="openshift-cluster-node-tuning-operator/tuned-wqv8s" Apr 16 16:48:40.137962 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.137783 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-host-slash\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" Apr 16 16:48:40.137962 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.137803 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-etc-openvswitch\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" Apr 16 16:48:40.137962 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.137826 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-run-ovn\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" Apr 16 16:48:40.137962 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.137849 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-log-socket\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq"
Apr 16 16:48:40.137962 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.137872 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/292362c1-e737-4b91-ac69-fab25c8f024f-registration-dir\") pod \"aws-ebs-csi-driver-node-sgrsb\" (UID: \"292362c1-e737-4b91-ac69-fab25c8f024f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sgrsb"
Apr 16 16:48:40.137962 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.137895 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjhq8\" (UniqueName: \"kubernetes.io/projected/292362c1-e737-4b91-ac69-fab25c8f024f-kube-api-access-rjhq8\") pod \"aws-ebs-csi-driver-node-sgrsb\" (UID: \"292362c1-e737-4b91-ac69-fab25c8f024f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sgrsb"
Apr 16 16:48:40.137962 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.137916 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fabe57c2-0e2b-42e1-9322-9ea7c5a3f719-cnibin\") pod \"multus-additional-cni-plugins-72gts\" (UID: \"fabe57c2-0e2b-42e1-9322-9ea7c5a3f719\") " pod="openshift-multus/multus-additional-cni-plugins-72gts"
Apr 16 16:48:40.138630 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.137960 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/292362c1-e737-4b91-ac69-fab25c8f024f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-sgrsb\" (UID: \"292362c1-e737-4b91-ac69-fab25c8f024f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sgrsb"
Apr 16 16:48:40.138630 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.137980 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-systemd-units\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq"
Apr 16 16:48:40.138630 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.138027 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-node-log\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq"
Apr 16 16:48:40.138630 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.138057 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-host-run-ovn-kubernetes\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq"
Apr 16 16:48:40.138630 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.138081 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-host-cni-netd\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq"
Apr 16 16:48:40.138630 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.138104 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a3db0e84-d73d-4e5b-a7c2-94290b442748-env-overrides\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq"
Apr 16 16:48:40.138630 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.138127 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a3db0e84-d73d-4e5b-a7c2-94290b442748-ovn-node-metrics-cert\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq"
Apr 16 16:48:40.138630 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.138159 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/292362c1-e737-4b91-ac69-fab25c8f024f-sys-fs\") pod \"aws-ebs-csi-driver-node-sgrsb\" (UID: \"292362c1-e737-4b91-ac69-fab25c8f024f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sgrsb"
Apr 16 16:48:40.138630 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.138336 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 16:48:40.138630 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.138341 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-cnlvd\""
Apr 16 16:48:40.138630 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.138571 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 16:48:40.139169 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.138666 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-bmj4j"
Apr 16 16:48:40.140527 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.140509 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jndw7"
Apr 16 16:48:40.141542 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.141524 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 16:48:40.141542 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.141539 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 16:48:40.141706 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.141522 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-9m6xk\""
Apr 16 16:48:40.143415 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.143399 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-6pxz4\""
Apr 16 16:48:40.143643 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.143621 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 16:48:40.143643 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.143643 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 16:48:40.143807 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.143682 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 16:48:40.146632 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.146611 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-zjk6l"
Apr 16 16:48:40.146777 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.146704 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7dxf7"
Apr 16 16:48:40.149908 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.149867 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 16:48:40.149998 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.149983 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 16:48:40.150061 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.150014 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 16:48:40.150061 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.150028 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-m6m8m\""
Apr 16 16:48:40.150159 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.150031 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-kjxrk\""
Apr 16 16:48:40.150159 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.150121 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 16:48:40.163971 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.163946 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 16:43:39 +0000 UTC" deadline="2027-09-29 07:30:14.939511712 +0000 UTC"
Apr 16 16:48:40.164075 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.163974 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12734h41m34.775542111s"
Apr 16 16:48:40.229448 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.229424 2574 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 16:48:40.238558 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.238529 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-run-openvswitch\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq"
Apr 16 16:48:40.238712 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.238577 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a3db0e84-d73d-4e5b-a7c2-94290b442748-ovnkube-config\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq"
Apr 16 16:48:40.238712 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.238603 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/292362c1-e737-4b91-ac69-fab25c8f024f-socket-dir\") pod \"aws-ebs-csi-driver-node-sgrsb\" (UID: \"292362c1-e737-4b91-ac69-fab25c8f024f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sgrsb"
Apr 16 16:48:40.238712 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.238610 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-run-openvswitch\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq"
Apr 16 16:48:40.238712 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.238634 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ef20d3fb-2f3a-4af7-97e4-df2e2773f314-konnectivity-ca\") pod \"konnectivity-agent-89n6j\" (UID: \"ef20d3fb-2f3a-4af7-97e4-df2e2773f314\") " pod="kube-system/konnectivity-agent-89n6j"
Apr 16 16:48:40.238712 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.238678 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gcjh\" (UniqueName: \"kubernetes.io/projected/131ec79c-6a30-4f10-b4b5-c529479f0fe0-kube-api-access-5gcjh\") pod \"node-resolver-bmj4j\" (UID: \"131ec79c-6a30-4f10-b4b5-c529479f0fe0\") " pod="openshift-dns/node-resolver-bmj4j"
Apr 16 16:48:40.238712 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.238704 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c7b5620d-256f-4b38-ac42-4979da7007a4-etc-kubernetes\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7"
Apr 16 16:48:40.239016 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.238728 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-host-slash\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq"
Apr 16 16:48:40.239016 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.238749 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-run-ovn\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq"
Apr 16 16:48:40.239016 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.238771 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fabe57c2-0e2b-42e1-9322-9ea7c5a3f719-cnibin\") pod \"multus-additional-cni-plugins-72gts\" (UID: \"fabe57c2-0e2b-42e1-9322-9ea7c5a3f719\") " pod="openshift-multus/multus-additional-cni-plugins-72gts"
Apr 16 16:48:40.239016 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.238798 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjbv9\" (UniqueName: \"kubernetes.io/projected/c7b5620d-256f-4b38-ac42-4979da7007a4-kube-api-access-tjbv9\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7"
Apr 16 16:48:40.239016 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.238808 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/292362c1-e737-4b91-ac69-fab25c8f024f-socket-dir\") pod \"aws-ebs-csi-driver-node-sgrsb\" (UID: \"292362c1-e737-4b91-ac69-fab25c8f024f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sgrsb"
Apr 16 16:48:40.239016 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.238823 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4ws9\" (UniqueName: \"kubernetes.io/projected/75e2551e-500a-44ad-90a9-c6ee9b976f48-kube-api-access-k4ws9\") pod \"iptables-alerter-zjk6l\" (UID: \"75e2551e-500a-44ad-90a9-c6ee9b976f48\") " pod="openshift-network-operator/iptables-alerter-zjk6l"
Apr 16 16:48:40.239016 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.238849 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/292362c1-e737-4b91-ac69-fab25c8f024f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-sgrsb\" (UID: \"292362c1-e737-4b91-ac69-fab25c8f024f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sgrsb"
Apr 16 16:48:40.239016 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.238861 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-host-slash\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq"
Apr 16 16:48:40.239016 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.238889 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-node-log\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq"
Apr 16 16:48:40.239016 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.238920 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-host-run-ovn-kubernetes\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq"
Apr 16 16:48:40.239016 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.238964 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-run-ovn\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq"
Apr 16 16:48:40.239016 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.238981 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/292362c1-e737-4b91-ac69-fab25c8f024f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-sgrsb\" (UID: \"292362c1-e737-4b91-ac69-fab25c8f024f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sgrsb"
Apr 16 16:48:40.239016 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.238997 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fabe57c2-0e2b-42e1-9322-9ea7c5a3f719-cnibin\") pod \"multus-additional-cni-plugins-72gts\" (UID: \"fabe57c2-0e2b-42e1-9322-9ea7c5a3f719\") " pod="openshift-multus/multus-additional-cni-plugins-72gts"
Apr 16 16:48:40.239016 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.239013 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-node-log\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq"
Apr 16 16:48:40.239676 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.239030 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a3db0e84-d73d-4e5b-a7c2-94290b442748-ovn-node-metrics-cert\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq"
Apr 16 16:48:40.239676 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.239045 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-host-run-ovn-kubernetes\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq"
Apr 16 16:48:40.239676 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.239056 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6ca8cf77-0136-4fcd-be78-19762c126a37-etc-systemd\") pod \"tuned-wqv8s\" (UID: \"6ca8cf77-0136-4fcd-be78-19762c126a37\") " pod="openshift-cluster-node-tuning-operator/tuned-wqv8s"
Apr 16 16:48:40.239676 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.239093 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6ca8cf77-0136-4fcd-be78-19762c126a37-tmp\") pod \"tuned-wqv8s\" (UID: \"6ca8cf77-0136-4fcd-be78-19762c126a37\") " pod="openshift-cluster-node-tuning-operator/tuned-wqv8s"
Apr 16 16:48:40.239676 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.239117 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/131ec79c-6a30-4f10-b4b5-c529479f0fe0-tmp-dir\") pod \"node-resolver-bmj4j\" (UID: \"131ec79c-6a30-4f10-b4b5-c529479f0fe0\") " pod="openshift-dns/node-resolver-bmj4j"
Apr 16 16:48:40.239676 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.239139 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c7b5620d-256f-4b38-ac42-4979da7007a4-multus-socket-dir-parent\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7"
Apr 16 16:48:40.239676 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.239197 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-host-run-netns\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq"
Apr 16 16:48:40.239676 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.239220 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a3db0e84-d73d-4e5b-a7c2-94290b442748-ovnkube-config\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq"
Apr 16 16:48:40.239676 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.239221 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-host-cni-bin\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq"
Apr 16 16:48:40.239676 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.239267 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fabe57c2-0e2b-42e1-9322-9ea7c5a3f719-os-release\") pod \"multus-additional-cni-plugins-72gts\" (UID: \"fabe57c2-0e2b-42e1-9322-9ea7c5a3f719\") " pod="openshift-multus/multus-additional-cni-plugins-72gts"
Apr 16 16:48:40.239676 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.239295 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c7b5620d-256f-4b38-ac42-4979da7007a4-host-run-k8s-cni-cncf-io\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7"
Apr 16 16:48:40.239676 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.239329 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c7b5620d-256f-4b38-ac42-4979da7007a4-multus-daemon-config\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7"
Apr 16 16:48:40.239676 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.239387 2574 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 16:48:40.239676 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.239419 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/75e2551e-500a-44ad-90a9-c6ee9b976f48-host-slash\") pod \"iptables-alerter-zjk6l\" (UID: \"75e2551e-500a-44ad-90a9-c6ee9b976f48\") " pod="openshift-network-operator/iptables-alerter-zjk6l"
Apr 16 16:48:40.239676 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.239421 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-host-cni-bin\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq"
Apr 16 16:48:40.239676 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.239462 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-host-kubelet\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq"
Apr 16 16:48:40.239676 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.239489 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6ca8cf77-0136-4fcd-be78-19762c126a37-etc-sysconfig\") pod \"tuned-wqv8s\" (UID: \"6ca8cf77-0136-4fcd-be78-19762c126a37\") " pod="openshift-cluster-node-tuning-operator/tuned-wqv8s"
Apr 16 16:48:40.240439 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.239486 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fabe57c2-0e2b-42e1-9322-9ea7c5a3f719-os-release\") pod \"multus-additional-cni-plugins-72gts\" (UID: \"fabe57c2-0e2b-42e1-9322-9ea7c5a3f719\") " pod="openshift-multus/multus-additional-cni-plugins-72gts"
Apr 16 16:48:40.240439 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.239518 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-host-run-netns\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq"
Apr 16 16:48:40.240439 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.239557 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6ca8cf77-0136-4fcd-be78-19762c126a37-etc-sysctl-d\") pod \"tuned-wqv8s\" (UID: \"6ca8cf77-0136-4fcd-be78-19762c126a37\") " pod="openshift-cluster-node-tuning-operator/tuned-wqv8s"
Apr 16 16:48:40.240439 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.239560 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6ca8cf77-0136-4fcd-be78-19762c126a37-etc-systemd\") pod \"tuned-wqv8s\" (UID: \"6ca8cf77-0136-4fcd-be78-19762c126a37\") " pod="openshift-cluster-node-tuning-operator/tuned-wqv8s"
Apr 16 16:48:40.240439 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.239593 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c7b5620d-256f-4b38-ac42-4979da7007a4-multus-cni-dir\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7"
Apr 16 16:48:40.240439 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.239602 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6ca8cf77-0136-4fcd-be78-19762c126a37-etc-sysconfig\") pod \"tuned-wqv8s\" (UID: \"6ca8cf77-0136-4fcd-be78-19762c126a37\") " pod="openshift-cluster-node-tuning-operator/tuned-wqv8s"
Apr 16 16:48:40.240439 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.239713 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-host-kubelet\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq"
Apr 16 16:48:40.240439 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.239728 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6ca8cf77-0136-4fcd-be78-19762c126a37-etc-sysctl-d\") pod \"tuned-wqv8s\" (UID: \"6ca8cf77-0136-4fcd-be78-19762c126a37\") " pod="openshift-cluster-node-tuning-operator/tuned-wqv8s"
Apr 16 16:48:40.240439 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.239759 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/292362c1-e737-4b91-ac69-fab25c8f024f-device-dir\") pod \"aws-ebs-csi-driver-node-sgrsb\" (UID: \"292362c1-e737-4b91-ac69-fab25c8f024f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sgrsb"
Apr 16 16:48:40.240439 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.239804 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/292362c1-e737-4b91-ac69-fab25c8f024f-device-dir\") pod \"aws-ebs-csi-driver-node-sgrsb\" (UID: \"292362c1-e737-4b91-ac69-fab25c8f024f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sgrsb"
Apr 16 16:48:40.240439 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.239830 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6ca8cf77-0136-4fcd-be78-19762c126a37-sys\") pod \"tuned-wqv8s\" (UID: \"6ca8cf77-0136-4fcd-be78-19762c126a37\") " pod="openshift-cluster-node-tuning-operator/tuned-wqv8s"
Apr 16 16:48:40.240439 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.239857 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c7b5620d-256f-4b38-ac42-4979da7007a4-host-var-lib-cni-multus\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7"
Apr 16 16:48:40.240439 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.239882 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6ca8cf77-0136-4fcd-be78-19762c126a37-sys\") pod \"tuned-wqv8s\" (UID: \"6ca8cf77-0136-4fcd-be78-19762c126a37\") " pod="openshift-cluster-node-tuning-operator/tuned-wqv8s"
Apr 16 16:48:40.240439 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.239883 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/75e2551e-500a-44ad-90a9-c6ee9b976f48-iptables-alerter-script\") pod \"iptables-alerter-zjk6l\" (UID: \"75e2551e-500a-44ad-90a9-c6ee9b976f48\") " pod="openshift-network-operator/iptables-alerter-zjk6l"
Apr 16 16:48:40.240439 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.239920 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a3db0e84-d73d-4e5b-a7c2-94290b442748-ovnkube-script-lib\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq"
Apr 16 16:48:40.240439 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.239943 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-run-systemd\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq"
Apr 16 16:48:40.240439 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.239966 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6ca8cf77-0136-4fcd-be78-19762c126a37-etc-tuned\") pod \"tuned-wqv8s\" (UID: \"6ca8cf77-0136-4fcd-be78-19762c126a37\") " pod="openshift-cluster-node-tuning-operator/tuned-wqv8s"
Apr 16 16:48:40.241247 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.239996 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fabe57c2-0e2b-42e1-9322-9ea7c5a3f719-cni-binary-copy\") pod \"multus-additional-cni-plugins-72gts\" (UID: \"fabe57c2-0e2b-42e1-9322-9ea7c5a3f719\") " pod="openshift-multus/multus-additional-cni-plugins-72gts"
Apr 16 16:48:40.241247 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240016 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-run-systemd\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq"
Apr 16 16:48:40.241247 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240020 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fabe57c2-0e2b-42e1-9322-9ea7c5a3f719-tuning-conf-dir\") pod \"multus-additional-cni-plugins-72gts\" (UID: \"fabe57c2-0e2b-42e1-9322-9ea7c5a3f719\") " pod="openshift-multus/multus-additional-cni-plugins-72gts"
Apr 16 16:48:40.241247 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240049 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6ca8cf77-0136-4fcd-be78-19762c126a37-etc-modprobe-d\") pod \"tuned-wqv8s\" (UID: \"6ca8cf77-0136-4fcd-be78-19762c126a37\") " pod="openshift-cluster-node-tuning-operator/tuned-wqv8s"
Apr 16 16:48:40.241247 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240072 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6ca8cf77-0136-4fcd-be78-19762c126a37-etc-sysctl-conf\") pod \"tuned-wqv8s\" (UID: \"6ca8cf77-0136-4fcd-be78-19762c126a37\") " pod="openshift-cluster-node-tuning-operator/tuned-wqv8s"
Apr 16 16:48:40.241247 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240095 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x44rc\" (UniqueName: \"kubernetes.io/projected/a3db0e84-d73d-4e5b-a7c2-94290b442748-kube-api-access-x44rc\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq"
Apr 16 16:48:40.241247 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240116 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6ca8cf77-0136-4fcd-be78-19762c126a37-run\") pod \"tuned-wqv8s\" (UID: \"6ca8cf77-0136-4fcd-be78-19762c126a37\") " pod="openshift-cluster-node-tuning-operator/tuned-wqv8s"
Apr 16 16:48:40.241247 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240143 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ef20d3fb-2f3a-4af7-97e4-df2e2773f314-agent-certs\") pod \"konnectivity-agent-89n6j\" (UID: \"ef20d3fb-2f3a-4af7-97e4-df2e2773f314\") " pod="kube-system/konnectivity-agent-89n6j"
Apr 16 16:48:40.241247 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240164 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3a0b88db-3420-4832-821b-dc3272b66858-host\") pod \"node-ca-jndw7\" (UID: \"3a0b88db-3420-4832-821b-dc3272b66858\") " pod="openshift-image-registry/node-ca-jndw7"
Apr 16 16:48:40.241247 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240187 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-etc-openvswitch\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq"
Apr 16 16:48:40.241247 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240209 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-log-socket\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq"
Apr 16 16:48:40.241247 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240232 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/292362c1-e737-4b91-ac69-fab25c8f024f-registration-dir\") pod \"aws-ebs-csi-driver-node-sgrsb\" (UID: \"292362c1-e737-4b91-ac69-fab25c8f024f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sgrsb"
Apr 16 16:48:40.241247 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240257 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rjhq8\" (UniqueName: \"kubernetes.io/projected/292362c1-e737-4b91-ac69-fab25c8f024f-kube-api-access-rjhq8\") pod \"aws-ebs-csi-driver-node-sgrsb\" (UID: \"292362c1-e737-4b91-ac69-fab25c8f024f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sgrsb"
Apr 16 16:48:40.241247 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240283 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/131ec79c-6a30-4f10-b4b5-c529479f0fe0-hosts-file\") pod \"node-resolver-bmj4j\" (UID: \"131ec79c-6a30-4f10-b4b5-c529479f0fe0\") " pod="openshift-dns/node-resolver-bmj4j"
Apr 16 16:48:40.241247 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240307 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmdtc\" (UniqueName: \"kubernetes.io/projected/3a0b88db-3420-4832-821b-dc3272b66858-kube-api-access-rmdtc\") pod \"node-ca-jndw7\" (UID: \"3a0b88db-3420-4832-821b-dc3272b66858\") " pod="openshift-image-registry/node-ca-jndw7"
Apr 16 16:48:40.241247 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240332 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c7b5620d-256f-4b38-ac42-4979da7007a4-host-run-netns\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7"
Apr 16 16:48:40.241247 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240356 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-systemd-units\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq"
Apr 16 16:48:40.241994 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240381 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-host-cni-netd\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq"
Apr 16
16:48:40.241994 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240405 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a3db0e84-d73d-4e5b-a7c2-94290b442748-env-overrides\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" Apr 16 16:48:40.241994 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240412 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a3db0e84-d73d-4e5b-a7c2-94290b442748-ovnkube-script-lib\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" Apr 16 16:48:40.241994 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240432 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/292362c1-e737-4b91-ac69-fab25c8f024f-sys-fs\") pod \"aws-ebs-csi-driver-node-sgrsb\" (UID: \"292362c1-e737-4b91-ac69-fab25c8f024f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sgrsb" Apr 16 16:48:40.241994 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240459 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/fabe57c2-0e2b-42e1-9322-9ea7c5a3f719-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-72gts\" (UID: \"fabe57c2-0e2b-42e1-9322-9ea7c5a3f719\") " pod="openshift-multus/multus-additional-cni-plugins-72gts" Apr 16 16:48:40.241994 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240486 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fx5hj\" (UniqueName: \"kubernetes.io/projected/fabe57c2-0e2b-42e1-9322-9ea7c5a3f719-kube-api-access-fx5hj\") pod 
\"multus-additional-cni-plugins-72gts\" (UID: \"fabe57c2-0e2b-42e1-9322-9ea7c5a3f719\") " pod="openshift-multus/multus-additional-cni-plugins-72gts" Apr 16 16:48:40.241994 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240513 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c7749f8-7b64-4062-9bca-90c0826a9692-metrics-certs\") pod \"network-metrics-daemon-62nv6\" (UID: \"7c7749f8-7b64-4062-9bca-90c0826a9692\") " pod="openshift-multus/network-metrics-daemon-62nv6" Apr 16 16:48:40.241994 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240538 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c7b5620d-256f-4b38-ac42-4979da7007a4-host-var-lib-cni-bin\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.241994 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240564 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/292362c1-e737-4b91-ac69-fab25c8f024f-etc-selinux\") pod \"aws-ebs-csi-driver-node-sgrsb\" (UID: \"292362c1-e737-4b91-ac69-fab25c8f024f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sgrsb" Apr 16 16:48:40.241994 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240591 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fabe57c2-0e2b-42e1-9322-9ea7c5a3f719-system-cni-dir\") pod \"multus-additional-cni-plugins-72gts\" (UID: \"fabe57c2-0e2b-42e1-9322-9ea7c5a3f719\") " pod="openshift-multus/multus-additional-cni-plugins-72gts" Apr 16 16:48:40.241994 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240606 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" 
(UniqueName: \"kubernetes.io/host-path/292362c1-e737-4b91-ac69-fab25c8f024f-registration-dir\") pod \"aws-ebs-csi-driver-node-sgrsb\" (UID: \"292362c1-e737-4b91-ac69-fab25c8f024f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sgrsb" Apr 16 16:48:40.241994 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240616 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c7b5620d-256f-4b38-ac42-4979da7007a4-cnibin\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.241994 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240643 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c7b5620d-256f-4b38-ac42-4979da7007a4-hostroot\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.241994 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240690 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ca8cf77-0136-4fcd-be78-19762c126a37-etc-kubernetes\") pod \"tuned-wqv8s\" (UID: \"6ca8cf77-0136-4fcd-be78-19762c126a37\") " pod="openshift-cluster-node-tuning-operator/tuned-wqv8s" Apr 16 16:48:40.241994 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240702 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6ca8cf77-0136-4fcd-be78-19762c126a37-etc-modprobe-d\") pod \"tuned-wqv8s\" (UID: \"6ca8cf77-0136-4fcd-be78-19762c126a37\") " pod="openshift-cluster-node-tuning-operator/tuned-wqv8s" Apr 16 16:48:40.241994 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240719 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-wnzhs\" (UniqueName: \"kubernetes.io/projected/6ca8cf77-0136-4fcd-be78-19762c126a37-kube-api-access-wnzhs\") pod \"tuned-wqv8s\" (UID: \"6ca8cf77-0136-4fcd-be78-19762c126a37\") " pod="openshift-cluster-node-tuning-operator/tuned-wqv8s" Apr 16 16:48:40.241994 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240744 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3a0b88db-3420-4832-821b-dc3272b66858-serviceca\") pod \"node-ca-jndw7\" (UID: \"3a0b88db-3420-4832-821b-dc3272b66858\") " pod="openshift-image-registry/node-ca-jndw7" Apr 16 16:48:40.242740 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240769 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c7b5620d-256f-4b38-ac42-4979da7007a4-os-release\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.242740 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240793 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6ca8cf77-0136-4fcd-be78-19762c126a37-etc-sysctl-conf\") pod \"tuned-wqv8s\" (UID: \"6ca8cf77-0136-4fcd-be78-19762c126a37\") " pod="openshift-cluster-node-tuning-operator/tuned-wqv8s" Apr 16 16:48:40.242740 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240796 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" Apr 16 16:48:40.242740 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240826 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6ca8cf77-0136-4fcd-be78-19762c126a37-lib-modules\") pod \"tuned-wqv8s\" (UID: \"6ca8cf77-0136-4fcd-be78-19762c126a37\") " pod="openshift-cluster-node-tuning-operator/tuned-wqv8s" Apr 16 16:48:40.242740 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240851 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6ca8cf77-0136-4fcd-be78-19762c126a37-var-lib-kubelet\") pod \"tuned-wqv8s\" (UID: \"6ca8cf77-0136-4fcd-be78-19762c126a37\") " pod="openshift-cluster-node-tuning-operator/tuned-wqv8s" Apr 16 16:48:40.242740 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240882 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c7b5620d-256f-4b38-ac42-4979da7007a4-host-run-multus-certs\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.242740 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240912 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-var-lib-openvswitch\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" Apr 16 16:48:40.242740 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240941 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fabe57c2-0e2b-42e1-9322-9ea7c5a3f719-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-72gts\" (UID: \"fabe57c2-0e2b-42e1-9322-9ea7c5a3f719\") " pod="openshift-multus/multus-additional-cni-plugins-72gts" Apr 16 
16:48:40.242740 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240968 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vt76h\" (UniqueName: \"kubernetes.io/projected/7c7749f8-7b64-4062-9bca-90c0826a9692-kube-api-access-vt76h\") pod \"network-metrics-daemon-62nv6\" (UID: \"7c7749f8-7b64-4062-9bca-90c0826a9692\") " pod="openshift-multus/network-metrics-daemon-62nv6" Apr 16 16:48:40.242740 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.240996 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c7b5620d-256f-4b38-ac42-4979da7007a4-multus-conf-dir\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.242740 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.241032 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsw8x\" (UniqueName: \"kubernetes.io/projected/0b5fcafd-70a7-4e76-ba7a-022cfee37811-kube-api-access-hsw8x\") pod \"network-check-target-cqm4l\" (UID: \"0b5fcafd-70a7-4e76-ba7a-022cfee37811\") " pod="openshift-network-diagnostics/network-check-target-cqm4l" Apr 16 16:48:40.242740 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.241044 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fabe57c2-0e2b-42e1-9322-9ea7c5a3f719-cni-binary-copy\") pod \"multus-additional-cni-plugins-72gts\" (UID: \"fabe57c2-0e2b-42e1-9322-9ea7c5a3f719\") " pod="openshift-multus/multus-additional-cni-plugins-72gts" Apr 16 16:48:40.242740 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.241060 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ca8cf77-0136-4fcd-be78-19762c126a37-host\") pod \"tuned-wqv8s\" (UID: 
\"6ca8cf77-0136-4fcd-be78-19762c126a37\") " pod="openshift-cluster-node-tuning-operator/tuned-wqv8s" Apr 16 16:48:40.242740 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.241091 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6ca8cf77-0136-4fcd-be78-19762c126a37-run\") pod \"tuned-wqv8s\" (UID: \"6ca8cf77-0136-4fcd-be78-19762c126a37\") " pod="openshift-cluster-node-tuning-operator/tuned-wqv8s" Apr 16 16:48:40.242740 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.241104 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c7b5620d-256f-4b38-ac42-4979da7007a4-system-cni-dir\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.242740 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.241140 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c7b5620d-256f-4b38-ac42-4979da7007a4-cni-binary-copy\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.242740 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.241193 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-log-socket\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" Apr 16 16:48:40.243389 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.241201 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6ca8cf77-0136-4fcd-be78-19762c126a37-var-lib-kubelet\") pod \"tuned-wqv8s\" (UID: \"6ca8cf77-0136-4fcd-be78-19762c126a37\") 
" pod="openshift-cluster-node-tuning-operator/tuned-wqv8s" Apr 16 16:48:40.243389 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.241237 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ca8cf77-0136-4fcd-be78-19762c126a37-etc-kubernetes\") pod \"tuned-wqv8s\" (UID: \"6ca8cf77-0136-4fcd-be78-19762c126a37\") " pod="openshift-cluster-node-tuning-operator/tuned-wqv8s" Apr 16 16:48:40.243389 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.241248 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-var-lib-openvswitch\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" Apr 16 16:48:40.243389 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.241317 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-systemd-units\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" Apr 16 16:48:40.243389 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.241358 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-host-cni-netd\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" Apr 16 16:48:40.243389 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.241429 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c7b5620d-256f-4b38-ac42-4979da7007a4-host-var-lib-kubelet\") pod \"multus-7dxf7\" (UID: 
\"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.243389 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.241503 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" Apr 16 16:48:40.243389 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.241545 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3db0e84-d73d-4e5b-a7c2-94290b442748-etc-openvswitch\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" Apr 16 16:48:40.243389 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.241620 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6ca8cf77-0136-4fcd-be78-19762c126a37-lib-modules\") pod \"tuned-wqv8s\" (UID: \"6ca8cf77-0136-4fcd-be78-19762c126a37\") " pod="openshift-cluster-node-tuning-operator/tuned-wqv8s" Apr 16 16:48:40.243389 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.241694 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fabe57c2-0e2b-42e1-9322-9ea7c5a3f719-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-72gts\" (UID: \"fabe57c2-0e2b-42e1-9322-9ea7c5a3f719\") " pod="openshift-multus/multus-additional-cni-plugins-72gts" Apr 16 16:48:40.243389 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.241707 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/a3db0e84-d73d-4e5b-a7c2-94290b442748-env-overrides\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" Apr 16 16:48:40.243389 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.241786 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/292362c1-e737-4b91-ac69-fab25c8f024f-sys-fs\") pod \"aws-ebs-csi-driver-node-sgrsb\" (UID: \"292362c1-e737-4b91-ac69-fab25c8f024f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sgrsb" Apr 16 16:48:40.243389 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:40.241808 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:48:40.243389 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.241875 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fabe57c2-0e2b-42e1-9322-9ea7c5a3f719-system-cni-dir\") pod \"multus-additional-cni-plugins-72gts\" (UID: \"fabe57c2-0e2b-42e1-9322-9ea7c5a3f719\") " pod="openshift-multus/multus-additional-cni-plugins-72gts" Apr 16 16:48:40.243389 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.241966 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ca8cf77-0136-4fcd-be78-19762c126a37-host\") pod \"tuned-wqv8s\" (UID: \"6ca8cf77-0136-4fcd-be78-19762c126a37\") " pod="openshift-cluster-node-tuning-operator/tuned-wqv8s" Apr 16 16:48:40.243389 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:40.241999 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c7749f8-7b64-4062-9bca-90c0826a9692-metrics-certs podName:7c7749f8-7b64-4062-9bca-90c0826a9692 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:48:40.741975494 +0000 UTC m=+3.025001068 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7c7749f8-7b64-4062-9bca-90c0826a9692-metrics-certs") pod "network-metrics-daemon-62nv6" (UID: "7c7749f8-7b64-4062-9bca-90c0826a9692") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:48:40.243389 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.242016 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/292362c1-e737-4b91-ac69-fab25c8f024f-etc-selinux\") pod \"aws-ebs-csi-driver-node-sgrsb\" (UID: \"292362c1-e737-4b91-ac69-fab25c8f024f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sgrsb" Apr 16 16:48:40.244141 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.242258 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fabe57c2-0e2b-42e1-9322-9ea7c5a3f719-tuning-conf-dir\") pod \"multus-additional-cni-plugins-72gts\" (UID: \"fabe57c2-0e2b-42e1-9322-9ea7c5a3f719\") " pod="openshift-multus/multus-additional-cni-plugins-72gts" Apr 16 16:48:40.244141 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.242318 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/fabe57c2-0e2b-42e1-9322-9ea7c5a3f719-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-72gts\" (UID: \"fabe57c2-0e2b-42e1-9322-9ea7c5a3f719\") " pod="openshift-multus/multus-additional-cni-plugins-72gts" Apr 16 16:48:40.244141 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.243210 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6ca8cf77-0136-4fcd-be78-19762c126a37-tmp\") pod \"tuned-wqv8s\" (UID: 
\"6ca8cf77-0136-4fcd-be78-19762c126a37\") " pod="openshift-cluster-node-tuning-operator/tuned-wqv8s" Apr 16 16:48:40.244141 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.243566 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a3db0e84-d73d-4e5b-a7c2-94290b442748-ovn-node-metrics-cert\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" Apr 16 16:48:40.245362 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.245317 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6ca8cf77-0136-4fcd-be78-19762c126a37-etc-tuned\") pod \"tuned-wqv8s\" (UID: \"6ca8cf77-0136-4fcd-be78-19762c126a37\") " pod="openshift-cluster-node-tuning-operator/tuned-wqv8s" Apr 16 16:48:40.250626 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.250455 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnzhs\" (UniqueName: \"kubernetes.io/projected/6ca8cf77-0136-4fcd-be78-19762c126a37-kube-api-access-wnzhs\") pod \"tuned-wqv8s\" (UID: \"6ca8cf77-0136-4fcd-be78-19762c126a37\") " pod="openshift-cluster-node-tuning-operator/tuned-wqv8s" Apr 16 16:48:40.250626 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.250553 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjhq8\" (UniqueName: \"kubernetes.io/projected/292362c1-e737-4b91-ac69-fab25c8f024f-kube-api-access-rjhq8\") pod \"aws-ebs-csi-driver-node-sgrsb\" (UID: \"292362c1-e737-4b91-ac69-fab25c8f024f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sgrsb" Apr 16 16:48:40.251023 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.251004 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x44rc\" (UniqueName: 
\"kubernetes.io/projected/a3db0e84-d73d-4e5b-a7c2-94290b442748-kube-api-access-x44rc\") pod \"ovnkube-node-2fwqq\" (UID: \"a3db0e84-d73d-4e5b-a7c2-94290b442748\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" Apr 16 16:48:40.252054 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.252033 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx5hj\" (UniqueName: \"kubernetes.io/projected/fabe57c2-0e2b-42e1-9322-9ea7c5a3f719-kube-api-access-fx5hj\") pod \"multus-additional-cni-plugins-72gts\" (UID: \"fabe57c2-0e2b-42e1-9322-9ea7c5a3f719\") " pod="openshift-multus/multus-additional-cni-plugins-72gts" Apr 16 16:48:40.252541 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.252520 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt76h\" (UniqueName: \"kubernetes.io/projected/7c7749f8-7b64-4062-9bca-90c0826a9692-kube-api-access-vt76h\") pod \"network-metrics-daemon-62nv6\" (UID: \"7c7749f8-7b64-4062-9bca-90c0826a9692\") " pod="openshift-multus/network-metrics-daemon-62nv6" Apr 16 16:48:40.295199 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.295107 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:48:40.341700 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.341671 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c7b5620d-256f-4b38-ac42-4979da7007a4-multus-cni-dir\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.341827 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.341710 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c7b5620d-256f-4b38-ac42-4979da7007a4-host-var-lib-cni-multus\") pod \"multus-7dxf7\" (UID: 
\"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.341827 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.341723 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c7b5620d-256f-4b38-ac42-4979da7007a4-multus-cni-dir\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.341827 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.341727 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/75e2551e-500a-44ad-90a9-c6ee9b976f48-iptables-alerter-script\") pod \"iptables-alerter-zjk6l\" (UID: \"75e2551e-500a-44ad-90a9-c6ee9b976f48\") " pod="openshift-network-operator/iptables-alerter-zjk6l" Apr 16 16:48:40.341827 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.341772 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ef20d3fb-2f3a-4af7-97e4-df2e2773f314-agent-certs\") pod \"konnectivity-agent-89n6j\" (UID: \"ef20d3fb-2f3a-4af7-97e4-df2e2773f314\") " pod="kube-system/konnectivity-agent-89n6j" Apr 16 16:48:40.341827 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.341789 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3a0b88db-3420-4832-821b-dc3272b66858-host\") pod \"node-ca-jndw7\" (UID: \"3a0b88db-3420-4832-821b-dc3272b66858\") " pod="openshift-image-registry/node-ca-jndw7" Apr 16 16:48:40.341827 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.341809 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/131ec79c-6a30-4f10-b4b5-c529479f0fe0-hosts-file\") pod \"node-resolver-bmj4j\" (UID: \"131ec79c-6a30-4f10-b4b5-c529479f0fe0\") " 
pod="openshift-dns/node-resolver-bmj4j" Apr 16 16:48:40.341827 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.341807 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c7b5620d-256f-4b38-ac42-4979da7007a4-host-var-lib-cni-multus\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.341827 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.341824 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rmdtc\" (UniqueName: \"kubernetes.io/projected/3a0b88db-3420-4832-821b-dc3272b66858-kube-api-access-rmdtc\") pod \"node-ca-jndw7\" (UID: \"3a0b88db-3420-4832-821b-dc3272b66858\") " pod="openshift-image-registry/node-ca-jndw7" Apr 16 16:48:40.342182 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.341841 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c7b5620d-256f-4b38-ac42-4979da7007a4-host-run-netns\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.342182 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.341865 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3a0b88db-3420-4832-821b-dc3272b66858-host\") pod \"node-ca-jndw7\" (UID: \"3a0b88db-3420-4832-821b-dc3272b66858\") " pod="openshift-image-registry/node-ca-jndw7" Apr 16 16:48:40.342182 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.341875 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c7b5620d-256f-4b38-ac42-4979da7007a4-host-var-lib-cni-bin\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 
16:48:40.342182 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.341875 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/131ec79c-6a30-4f10-b4b5-c529479f0fe0-hosts-file\") pod \"node-resolver-bmj4j\" (UID: \"131ec79c-6a30-4f10-b4b5-c529479f0fe0\") " pod="openshift-dns/node-resolver-bmj4j" Apr 16 16:48:40.342182 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.341906 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c7b5620d-256f-4b38-ac42-4979da7007a4-host-var-lib-cni-bin\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.342182 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.341916 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c7b5620d-256f-4b38-ac42-4979da7007a4-cnibin\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.342182 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.341942 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c7b5620d-256f-4b38-ac42-4979da7007a4-hostroot\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.342182 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.341953 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c7b5620d-256f-4b38-ac42-4979da7007a4-cnibin\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.342182 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.341917 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c7b5620d-256f-4b38-ac42-4979da7007a4-host-run-netns\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.342182 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.341969 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3a0b88db-3420-4832-821b-dc3272b66858-serviceca\") pod \"node-ca-jndw7\" (UID: \"3a0b88db-3420-4832-821b-dc3272b66858\") " pod="openshift-image-registry/node-ca-jndw7" Apr 16 16:48:40.342182 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.341994 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c7b5620d-256f-4b38-ac42-4979da7007a4-os-release\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.342182 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.342021 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c7b5620d-256f-4b38-ac42-4979da7007a4-host-run-multus-certs\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.342182 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.341997 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c7b5620d-256f-4b38-ac42-4979da7007a4-hostroot\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.342182 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.342048 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/c7b5620d-256f-4b38-ac42-4979da7007a4-multus-conf-dir\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.342182 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.342092 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c7b5620d-256f-4b38-ac42-4979da7007a4-host-run-multus-certs\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.342182 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.342097 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c7b5620d-256f-4b38-ac42-4979da7007a4-multus-conf-dir\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.342182 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.342112 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c7b5620d-256f-4b38-ac42-4979da7007a4-os-release\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.342182 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.342124 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hsw8x\" (UniqueName: \"kubernetes.io/projected/0b5fcafd-70a7-4e76-ba7a-022cfee37811-kube-api-access-hsw8x\") pod \"network-check-target-cqm4l\" (UID: \"0b5fcafd-70a7-4e76-ba7a-022cfee37811\") " pod="openshift-network-diagnostics/network-check-target-cqm4l" Apr 16 16:48:40.342182 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.342152 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/c7b5620d-256f-4b38-ac42-4979da7007a4-system-cni-dir\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.343119 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.342176 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c7b5620d-256f-4b38-ac42-4979da7007a4-cni-binary-copy\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.343119 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.342194 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/75e2551e-500a-44ad-90a9-c6ee9b976f48-iptables-alerter-script\") pod \"iptables-alerter-zjk6l\" (UID: \"75e2551e-500a-44ad-90a9-c6ee9b976f48\") " pod="openshift-network-operator/iptables-alerter-zjk6l" Apr 16 16:48:40.343119 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.342201 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c7b5620d-256f-4b38-ac42-4979da7007a4-host-var-lib-kubelet\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.343119 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.342228 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ef20d3fb-2f3a-4af7-97e4-df2e2773f314-konnectivity-ca\") pod \"konnectivity-agent-89n6j\" (UID: \"ef20d3fb-2f3a-4af7-97e4-df2e2773f314\") " pod="kube-system/konnectivity-agent-89n6j" Apr 16 16:48:40.343119 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.342240 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/c7b5620d-256f-4b38-ac42-4979da7007a4-system-cni-dir\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.343119 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.342254 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5gcjh\" (UniqueName: \"kubernetes.io/projected/131ec79c-6a30-4f10-b4b5-c529479f0fe0-kube-api-access-5gcjh\") pod \"node-resolver-bmj4j\" (UID: \"131ec79c-6a30-4f10-b4b5-c529479f0fe0\") " pod="openshift-dns/node-resolver-bmj4j" Apr 16 16:48:40.343119 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.342279 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c7b5620d-256f-4b38-ac42-4979da7007a4-etc-kubernetes\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.343119 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.342299 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c7b5620d-256f-4b38-ac42-4979da7007a4-host-var-lib-kubelet\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.343119 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.342305 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tjbv9\" (UniqueName: \"kubernetes.io/projected/c7b5620d-256f-4b38-ac42-4979da7007a4-kube-api-access-tjbv9\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.343119 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.342352 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/c7b5620d-256f-4b38-ac42-4979da7007a4-etc-kubernetes\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.343119 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.342439 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4ws9\" (UniqueName: \"kubernetes.io/projected/75e2551e-500a-44ad-90a9-c6ee9b976f48-kube-api-access-k4ws9\") pod \"iptables-alerter-zjk6l\" (UID: \"75e2551e-500a-44ad-90a9-c6ee9b976f48\") " pod="openshift-network-operator/iptables-alerter-zjk6l" Apr 16 16:48:40.343119 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.342477 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/131ec79c-6a30-4f10-b4b5-c529479f0fe0-tmp-dir\") pod \"node-resolver-bmj4j\" (UID: \"131ec79c-6a30-4f10-b4b5-c529479f0fe0\") " pod="openshift-dns/node-resolver-bmj4j" Apr 16 16:48:40.343119 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.342502 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c7b5620d-256f-4b38-ac42-4979da7007a4-multus-socket-dir-parent\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.343119 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.342530 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c7b5620d-256f-4b38-ac42-4979da7007a4-host-run-k8s-cni-cncf-io\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.343119 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.342529 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/3a0b88db-3420-4832-821b-dc3272b66858-serviceca\") pod \"node-ca-jndw7\" (UID: \"3a0b88db-3420-4832-821b-dc3272b66858\") " pod="openshift-image-registry/node-ca-jndw7" Apr 16 16:48:40.343119 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.342553 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c7b5620d-256f-4b38-ac42-4979da7007a4-multus-daemon-config\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.343119 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.342578 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/75e2551e-500a-44ad-90a9-c6ee9b976f48-host-slash\") pod \"iptables-alerter-zjk6l\" (UID: \"75e2551e-500a-44ad-90a9-c6ee9b976f48\") " pod="openshift-network-operator/iptables-alerter-zjk6l" Apr 16 16:48:40.343119 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.342594 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c7b5620d-256f-4b38-ac42-4979da7007a4-multus-socket-dir-parent\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.344111 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.342635 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/75e2551e-500a-44ad-90a9-c6ee9b976f48-host-slash\") pod \"iptables-alerter-zjk6l\" (UID: \"75e2551e-500a-44ad-90a9-c6ee9b976f48\") " pod="openshift-network-operator/iptables-alerter-zjk6l" Apr 16 16:48:40.344111 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.342721 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/c7b5620d-256f-4b38-ac42-4979da7007a4-host-run-k8s-cni-cncf-io\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.344111 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.342772 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c7b5620d-256f-4b38-ac42-4979da7007a4-cni-binary-copy\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.344111 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.342772 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ef20d3fb-2f3a-4af7-97e4-df2e2773f314-konnectivity-ca\") pod \"konnectivity-agent-89n6j\" (UID: \"ef20d3fb-2f3a-4af7-97e4-df2e2773f314\") " pod="kube-system/konnectivity-agent-89n6j" Apr 16 16:48:40.344111 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.342809 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/131ec79c-6a30-4f10-b4b5-c529479f0fe0-tmp-dir\") pod \"node-resolver-bmj4j\" (UID: \"131ec79c-6a30-4f10-b4b5-c529479f0fe0\") " pod="openshift-dns/node-resolver-bmj4j" Apr 16 16:48:40.344111 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.342972 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c7b5620d-256f-4b38-ac42-4979da7007a4-multus-daemon-config\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.344449 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.344434 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ef20d3fb-2f3a-4af7-97e4-df2e2773f314-agent-certs\") pod 
\"konnectivity-agent-89n6j\" (UID: \"ef20d3fb-2f3a-4af7-97e4-df2e2773f314\") " pod="kube-system/konnectivity-agent-89n6j" Apr 16 16:48:40.351679 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:40.351636 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:48:40.351679 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:40.351676 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:48:40.351844 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:40.351690 2574 projected.go:194] Error preparing data for projected volume kube-api-access-hsw8x for pod openshift-network-diagnostics/network-check-target-cqm4l: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:48:40.351844 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:40.351759 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0b5fcafd-70a7-4e76-ba7a-022cfee37811-kube-api-access-hsw8x podName:0b5fcafd-70a7-4e76-ba7a-022cfee37811 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:40.851740607 +0000 UTC m=+3.134766167 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-hsw8x" (UniqueName: "kubernetes.io/projected/0b5fcafd-70a7-4e76-ba7a-022cfee37811-kube-api-access-hsw8x") pod "network-check-target-cqm4l" (UID: "0b5fcafd-70a7-4e76-ba7a-022cfee37811") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:48:40.353857 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.353835 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjbv9\" (UniqueName: \"kubernetes.io/projected/c7b5620d-256f-4b38-ac42-4979da7007a4-kube-api-access-tjbv9\") pod \"multus-7dxf7\" (UID: \"c7b5620d-256f-4b38-ac42-4979da7007a4\") " pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.354058 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.354034 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4ws9\" (UniqueName: \"kubernetes.io/projected/75e2551e-500a-44ad-90a9-c6ee9b976f48-kube-api-access-k4ws9\") pod \"iptables-alerter-zjk6l\" (UID: \"75e2551e-500a-44ad-90a9-c6ee9b976f48\") " pod="openshift-network-operator/iptables-alerter-zjk6l" Apr 16 16:48:40.354596 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.354579 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gcjh\" (UniqueName: \"kubernetes.io/projected/131ec79c-6a30-4f10-b4b5-c529479f0fe0-kube-api-access-5gcjh\") pod \"node-resolver-bmj4j\" (UID: \"131ec79c-6a30-4f10-b4b5-c529479f0fe0\") " pod="openshift-dns/node-resolver-bmj4j" Apr 16 16:48:40.354809 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.354795 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmdtc\" (UniqueName: \"kubernetes.io/projected/3a0b88db-3420-4832-821b-dc3272b66858-kube-api-access-rmdtc\") pod \"node-ca-jndw7\" (UID: \"3a0b88db-3420-4832-821b-dc3272b66858\") " 
pod="openshift-image-registry/node-ca-jndw7" Apr 16 16:48:40.434400 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.434367 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" Apr 16 16:48:40.442166 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.442143 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sgrsb" Apr 16 16:48:40.451148 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.451123 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-wqv8s" Apr 16 16:48:40.456728 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.456708 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-72gts" Apr 16 16:48:40.463274 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.463255 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-89n6j" Apr 16 16:48:40.470795 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.470777 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-bmj4j" Apr 16 16:48:40.478313 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.478296 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jndw7" Apr 16 16:48:40.485800 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.485784 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-zjk6l" Apr 16 16:48:40.491339 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.491323 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-7dxf7" Apr 16 16:48:40.744942 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.744854 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c7749f8-7b64-4062-9bca-90c0826a9692-metrics-certs\") pod \"network-metrics-daemon-62nv6\" (UID: \"7c7749f8-7b64-4062-9bca-90c0826a9692\") " pod="openshift-multus/network-metrics-daemon-62nv6" Apr 16 16:48:40.745087 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:40.745008 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:48:40.745137 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:40.745093 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c7749f8-7b64-4062-9bca-90c0826a9692-metrics-certs podName:7c7749f8-7b64-4062-9bca-90c0826a9692 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:41.745076957 +0000 UTC m=+4.028102511 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7c7749f8-7b64-4062-9bca-90c0826a9692-metrics-certs") pod "network-metrics-daemon-62nv6" (UID: "7c7749f8-7b64-4062-9bca-90c0826a9692") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:48:40.885686 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:40.885640 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75e2551e_500a_44ad_90a9_c6ee9b976f48.slice/crio-1be28b27f33ffc993435b86c5ee17609958bc96772355de984fdebffe98490d0 WatchSource:0}: Error finding container 1be28b27f33ffc993435b86c5ee17609958bc96772355de984fdebffe98490d0: Status 404 returned error can't find the container with id 1be28b27f33ffc993435b86c5ee17609958bc96772355de984fdebffe98490d0 Apr 16 16:48:40.886942 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:40.886919 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod292362c1_e737_4b91_ac69_fab25c8f024f.slice/crio-1533c72148a5d7c64ca646c5d9c9316aa2f426a80dec00e6333c3b343d78b6e2 WatchSource:0}: Error finding container 1533c72148a5d7c64ca646c5d9c9316aa2f426a80dec00e6333c3b343d78b6e2: Status 404 returned error can't find the container with id 1533c72148a5d7c64ca646c5d9c9316aa2f426a80dec00e6333c3b343d78b6e2 Apr 16 16:48:40.887864 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:40.887824 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7b5620d_256f_4b38_ac42_4979da7007a4.slice/crio-11f026f2ef9c385562abc0abb6be7a32538b6bd40e5ebdfb0a03d9927222ad92 WatchSource:0}: Error finding container 11f026f2ef9c385562abc0abb6be7a32538b6bd40e5ebdfb0a03d9927222ad92: Status 404 returned error can't find the container with id 11f026f2ef9c385562abc0abb6be7a32538b6bd40e5ebdfb0a03d9927222ad92 Apr 16 16:48:40.891256 
ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:40.891237 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ca8cf77_0136_4fcd_be78_19762c126a37.slice/crio-c93002a9594e15b1e7ca990774b5efecd18ec9eec09a317c7c21166c257e309d WatchSource:0}: Error finding container c93002a9594e15b1e7ca990774b5efecd18ec9eec09a317c7c21166c257e309d: Status 404 returned error can't find the container with id c93002a9594e15b1e7ca990774b5efecd18ec9eec09a317c7c21166c257e309d Apr 16 16:48:40.895627 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:40.895603 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfabe57c2_0e2b_42e1_9322_9ea7c5a3f719.slice/crio-cc9cf98ff5c5c552acd54f17363eb50e518ef9ae0bded5baa665c3f2b1c93f1f WatchSource:0}: Error finding container cc9cf98ff5c5c552acd54f17363eb50e518ef9ae0bded5baa665c3f2b1c93f1f: Status 404 returned error can't find the container with id cc9cf98ff5c5c552acd54f17363eb50e518ef9ae0bded5baa665c3f2b1c93f1f Apr 16 16:48:40.896213 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:40.896181 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a0b88db_3420_4832_821b_dc3272b66858.slice/crio-e463f5d77814b4c13b062bf3576ab81a8280b9c15a635cf5badb370a7104c5ae WatchSource:0}: Error finding container e463f5d77814b4c13b062bf3576ab81a8280b9c15a635cf5badb370a7104c5ae: Status 404 returned error can't find the container with id e463f5d77814b4c13b062bf3576ab81a8280b9c15a635cf5badb370a7104c5ae Apr 16 16:48:40.897104 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:40.897081 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod131ec79c_6a30_4f10_b4b5_c529479f0fe0.slice/crio-5d0bab0f7c5ac958daa3e2610ba212a65a101503a6ca87ce08b506ed59f32825 WatchSource:0}: Error 
finding container 5d0bab0f7c5ac958daa3e2610ba212a65a101503a6ca87ce08b506ed59f32825: Status 404 returned error can't find the container with id 5d0bab0f7c5ac958daa3e2610ba212a65a101503a6ca87ce08b506ed59f32825 Apr 16 16:48:40.919759 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:40.919738 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3db0e84_d73d_4e5b_a7c2_94290b442748.slice/crio-77da82d4b4bdee5a7e75bbf82c33afdc4280ca6806101946bf42dbf6ad9ceb11 WatchSource:0}: Error finding container 77da82d4b4bdee5a7e75bbf82c33afdc4280ca6806101946bf42dbf6ad9ceb11: Status 404 returned error can't find the container with id 77da82d4b4bdee5a7e75bbf82c33afdc4280ca6806101946bf42dbf6ad9ceb11 Apr 16 16:48:40.920719 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:48:40.920700 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef20d3fb_2f3a_4af7_97e4_df2e2773f314.slice/crio-2ad3e1a130911eb9f1df3cb842987a70add8029e1fa6be2c49c270639d15c735 WatchSource:0}: Error finding container 2ad3e1a130911eb9f1df3cb842987a70add8029e1fa6be2c49c270639d15c735: Status 404 returned error can't find the container with id 2ad3e1a130911eb9f1df3cb842987a70add8029e1fa6be2c49c270639d15c735 Apr 16 16:48:40.946356 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:40.946336 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hsw8x\" (UniqueName: \"kubernetes.io/projected/0b5fcafd-70a7-4e76-ba7a-022cfee37811-kube-api-access-hsw8x\") pod \"network-check-target-cqm4l\" (UID: \"0b5fcafd-70a7-4e76-ba7a-022cfee37811\") " pod="openshift-network-diagnostics/network-check-target-cqm4l" Apr 16 16:48:40.946456 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:40.946445 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered Apr 16 16:48:40.946492 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:40.946459 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:48:40.946492 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:40.946468 2574 projected.go:194] Error preparing data for projected volume kube-api-access-hsw8x for pod openshift-network-diagnostics/network-check-target-cqm4l: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:48:40.946575 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:40.946506 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0b5fcafd-70a7-4e76-ba7a-022cfee37811-kube-api-access-hsw8x podName:0b5fcafd-70a7-4e76-ba7a-022cfee37811 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:41.946493901 +0000 UTC m=+4.229519456 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-hsw8x" (UniqueName: "kubernetes.io/projected/0b5fcafd-70a7-4e76-ba7a-022cfee37811-kube-api-access-hsw8x") pod "network-check-target-cqm4l" (UID: "0b5fcafd-70a7-4e76-ba7a-022cfee37811") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:48:41.164409 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:41.164365 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 16:43:39 +0000 UTC" deadline="2027-09-25 01:20:14.498447512 +0000 UTC" Apr 16 16:48:41.164409 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:41.164403 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12632h31m33.33404722s" Apr 16 16:48:41.280085 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:41.280000 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7dxf7" event={"ID":"c7b5620d-256f-4b38-ac42-4979da7007a4","Type":"ContainerStarted","Data":"11f026f2ef9c385562abc0abb6be7a32538b6bd40e5ebdfb0a03d9927222ad92"} Apr 16 16:48:41.282097 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:41.282065 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-zjk6l" event={"ID":"75e2551e-500a-44ad-90a9-c6ee9b976f48","Type":"ContainerStarted","Data":"1be28b27f33ffc993435b86c5ee17609958bc96772355de984fdebffe98490d0"} Apr 16 16:48:41.284600 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:41.284571 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-10.ec2.internal" event={"ID":"f1bc04e409baab07763d6ca236ceaf1a","Type":"ContainerStarted","Data":"8c570e5832a2741d2cb34bf902bf210f26eb130b5de8587c2f182037a6567728"} Apr 16 16:48:41.286970 ip-10-0-143-10 kubenswrapper[2574]: 
I0416 16:48:41.286928 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-89n6j" event={"ID":"ef20d3fb-2f3a-4af7-97e4-df2e2773f314","Type":"ContainerStarted","Data":"2ad3e1a130911eb9f1df3cb842987a70add8029e1fa6be2c49c270639d15c735"}
Apr 16 16:48:41.291682 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:41.291619 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" event={"ID":"a3db0e84-d73d-4e5b-a7c2-94290b442748","Type":"ContainerStarted","Data":"77da82d4b4bdee5a7e75bbf82c33afdc4280ca6806101946bf42dbf6ad9ceb11"}
Apr 16 16:48:41.294094 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:41.293681 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bmj4j" event={"ID":"131ec79c-6a30-4f10-b4b5-c529479f0fe0","Type":"ContainerStarted","Data":"5d0bab0f7c5ac958daa3e2610ba212a65a101503a6ca87ce08b506ed59f32825"}
Apr 16 16:48:41.295244 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:41.295218 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-wqv8s" event={"ID":"6ca8cf77-0136-4fcd-be78-19762c126a37","Type":"ContainerStarted","Data":"c93002a9594e15b1e7ca990774b5efecd18ec9eec09a317c7c21166c257e309d"}
Apr 16 16:48:41.297148 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:41.296970 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-72gts" event={"ID":"fabe57c2-0e2b-42e1-9322-9ea7c5a3f719","Type":"ContainerStarted","Data":"cc9cf98ff5c5c552acd54f17363eb50e518ef9ae0bded5baa665c3f2b1c93f1f"}
Apr 16 16:48:41.301140 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:41.301115 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sgrsb" event={"ID":"292362c1-e737-4b91-ac69-fab25c8f024f","Type":"ContainerStarted","Data":"1533c72148a5d7c64ca646c5d9c9316aa2f426a80dec00e6333c3b343d78b6e2"}
Apr 16 16:48:41.304539 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:41.304515 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jndw7" event={"ID":"3a0b88db-3420-4832-821b-dc3272b66858","Type":"ContainerStarted","Data":"e463f5d77814b4c13b062bf3576ab81a8280b9c15a635cf5badb370a7104c5ae"}
Apr 16 16:48:41.752145 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:41.752107 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c7749f8-7b64-4062-9bca-90c0826a9692-metrics-certs\") pod \"network-metrics-daemon-62nv6\" (UID: \"7c7749f8-7b64-4062-9bca-90c0826a9692\") " pod="openshift-multus/network-metrics-daemon-62nv6"
Apr 16 16:48:41.752354 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:41.752337 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:48:41.752421 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:41.752405 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c7749f8-7b64-4062-9bca-90c0826a9692-metrics-certs podName:7c7749f8-7b64-4062-9bca-90c0826a9692 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:43.752387678 +0000 UTC m=+6.035413238 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7c7749f8-7b64-4062-9bca-90c0826a9692-metrics-certs") pod "network-metrics-daemon-62nv6" (UID: "7c7749f8-7b64-4062-9bca-90c0826a9692") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:48:41.954571 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:41.953884 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hsw8x\" (UniqueName: \"kubernetes.io/projected/0b5fcafd-70a7-4e76-ba7a-022cfee37811-kube-api-access-hsw8x\") pod \"network-check-target-cqm4l\" (UID: \"0b5fcafd-70a7-4e76-ba7a-022cfee37811\") " pod="openshift-network-diagnostics/network-check-target-cqm4l"
Apr 16 16:48:41.954571 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:41.954078 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 16:48:41.954571 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:41.954098 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 16:48:41.954571 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:41.954111 2574 projected.go:194] Error preparing data for projected volume kube-api-access-hsw8x for pod openshift-network-diagnostics/network-check-target-cqm4l: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:48:41.954571 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:41.954168 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0b5fcafd-70a7-4e76-ba7a-022cfee37811-kube-api-access-hsw8x podName:0b5fcafd-70a7-4e76-ba7a-022cfee37811 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:43.954149792 +0000 UTC m=+6.237175351 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-hsw8x" (UniqueName: "kubernetes.io/projected/0b5fcafd-70a7-4e76-ba7a-022cfee37811-kube-api-access-hsw8x") pod "network-check-target-cqm4l" (UID: "0b5fcafd-70a7-4e76-ba7a-022cfee37811") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:48:42.271406 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:42.270815 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62nv6"
Apr 16 16:48:42.271406 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:42.270950 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62nv6" podUID="7c7749f8-7b64-4062-9bca-90c0826a9692"
Apr 16 16:48:42.272156 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:42.271762 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cqm4l"
Apr 16 16:48:42.272156 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:42.271857 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cqm4l" podUID="0b5fcafd-70a7-4e76-ba7a-022cfee37811"
Apr 16 16:48:42.320667 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:42.320609 2574 generic.go:358] "Generic (PLEG): container finished" podID="1e831993780eeb4b1e48fbb6fdbf0f02" containerID="262f429843fa1a7380314cf6d9a5387f4d637129d2284f59b9655d886c15103e" exitCode=0
Apr 16 16:48:42.321469 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:42.321445 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-10.ec2.internal" event={"ID":"1e831993780eeb4b1e48fbb6fdbf0f02","Type":"ContainerDied","Data":"262f429843fa1a7380314cf6d9a5387f4d637129d2284f59b9655d886c15103e"}
Apr 16 16:48:42.338726 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:42.338678 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-10.ec2.internal" podStartSLOduration=3.338646388 podStartE2EDuration="3.338646388s" podCreationTimestamp="2026-04-16 16:48:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:48:41.301253117 +0000 UTC m=+3.584278695" watchObservedRunningTime="2026-04-16 16:48:42.338646388 +0000 UTC m=+4.621671966"
Apr 16 16:48:43.336086 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:43.336052 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-10.ec2.internal" event={"ID":"1e831993780eeb4b1e48fbb6fdbf0f02","Type":"ContainerStarted","Data":"5c7d67d2392ca35b84be0b2f6e54e622aaaceef33763570439a9aabfcadd96fc"}
Apr 16 16:48:43.767909 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:43.767872 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c7749f8-7b64-4062-9bca-90c0826a9692-metrics-certs\") pod \"network-metrics-daemon-62nv6\" (UID: \"7c7749f8-7b64-4062-9bca-90c0826a9692\") " pod="openshift-multus/network-metrics-daemon-62nv6"
Apr 16 16:48:43.768083 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:43.768063 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:48:43.768142 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:43.768126 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c7749f8-7b64-4062-9bca-90c0826a9692-metrics-certs podName:7c7749f8-7b64-4062-9bca-90c0826a9692 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:47.76810876 +0000 UTC m=+10.051134328 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7c7749f8-7b64-4062-9bca-90c0826a9692-metrics-certs") pod "network-metrics-daemon-62nv6" (UID: "7c7749f8-7b64-4062-9bca-90c0826a9692") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:48:43.969566 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:43.969497 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hsw8x\" (UniqueName: \"kubernetes.io/projected/0b5fcafd-70a7-4e76-ba7a-022cfee37811-kube-api-access-hsw8x\") pod \"network-check-target-cqm4l\" (UID: \"0b5fcafd-70a7-4e76-ba7a-022cfee37811\") " pod="openshift-network-diagnostics/network-check-target-cqm4l"
Apr 16 16:48:43.969790 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:43.969668 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 16:48:43.969790 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:43.969694 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 16:48:43.969790 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:43.969707 2574 projected.go:194] Error preparing data for projected volume kube-api-access-hsw8x for pod openshift-network-diagnostics/network-check-target-cqm4l: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:48:43.969790 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:43.969773 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0b5fcafd-70a7-4e76-ba7a-022cfee37811-kube-api-access-hsw8x podName:0b5fcafd-70a7-4e76-ba7a-022cfee37811 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:47.969754538 +0000 UTC m=+10.252780096 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-hsw8x" (UniqueName: "kubernetes.io/projected/0b5fcafd-70a7-4e76-ba7a-022cfee37811-kube-api-access-hsw8x") pod "network-check-target-cqm4l" (UID: "0b5fcafd-70a7-4e76-ba7a-022cfee37811") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:48:44.271428 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:44.270803 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62nv6"
Apr 16 16:48:44.271428 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:44.270856 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cqm4l"
Apr 16 16:48:44.271428 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:44.270941 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62nv6" podUID="7c7749f8-7b64-4062-9bca-90c0826a9692"
Apr 16 16:48:44.271428 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:44.271375 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cqm4l" podUID="0b5fcafd-70a7-4e76-ba7a-022cfee37811"
Apr 16 16:48:46.271681 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:46.270102 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62nv6"
Apr 16 16:48:46.271681 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:46.270239 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62nv6" podUID="7c7749f8-7b64-4062-9bca-90c0826a9692"
Apr 16 16:48:46.271681 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:46.270864 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cqm4l"
Apr 16 16:48:46.271681 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:46.270981 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cqm4l" podUID="0b5fcafd-70a7-4e76-ba7a-022cfee37811"
Apr 16 16:48:47.800159 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:47.800125 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c7749f8-7b64-4062-9bca-90c0826a9692-metrics-certs\") pod \"network-metrics-daemon-62nv6\" (UID: \"7c7749f8-7b64-4062-9bca-90c0826a9692\") " pod="openshift-multus/network-metrics-daemon-62nv6"
Apr 16 16:48:47.800639 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:47.800263 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:48:47.800639 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:47.800321 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c7749f8-7b64-4062-9bca-90c0826a9692-metrics-certs podName:7c7749f8-7b64-4062-9bca-90c0826a9692 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:55.80030672 +0000 UTC m=+18.083332276 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7c7749f8-7b64-4062-9bca-90c0826a9692-metrics-certs") pod "network-metrics-daemon-62nv6" (UID: "7c7749f8-7b64-4062-9bca-90c0826a9692") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:48:48.002788 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:48.002603 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hsw8x\" (UniqueName: \"kubernetes.io/projected/0b5fcafd-70a7-4e76-ba7a-022cfee37811-kube-api-access-hsw8x\") pod \"network-check-target-cqm4l\" (UID: \"0b5fcafd-70a7-4e76-ba7a-022cfee37811\") " pod="openshift-network-diagnostics/network-check-target-cqm4l"
Apr 16 16:48:48.002972 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:48.002821 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 16:48:48.002972 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:48.002858 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 16:48:48.002972 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:48.002871 2574 projected.go:194] Error preparing data for projected volume kube-api-access-hsw8x for pod openshift-network-diagnostics/network-check-target-cqm4l: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:48:48.002972 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:48.002936 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0b5fcafd-70a7-4e76-ba7a-022cfee37811-kube-api-access-hsw8x podName:0b5fcafd-70a7-4e76-ba7a-022cfee37811 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:56.002915122 +0000 UTC m=+18.285940701 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-hsw8x" (UniqueName: "kubernetes.io/projected/0b5fcafd-70a7-4e76-ba7a-022cfee37811-kube-api-access-hsw8x") pod "network-check-target-cqm4l" (UID: "0b5fcafd-70a7-4e76-ba7a-022cfee37811") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:48:48.273801 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:48.272455 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62nv6"
Apr 16 16:48:48.274384 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:48.274361 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cqm4l"
Apr 16 16:48:48.274514 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:48.274484 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cqm4l" podUID="0b5fcafd-70a7-4e76-ba7a-022cfee37811"
Apr 16 16:48:48.276770 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:48.274679 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62nv6" podUID="7c7749f8-7b64-4062-9bca-90c0826a9692"
Apr 16 16:48:50.270291 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:50.270247 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cqm4l"
Apr 16 16:48:50.270291 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:50.270288 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62nv6"
Apr 16 16:48:50.270865 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:50.270375 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cqm4l" podUID="0b5fcafd-70a7-4e76-ba7a-022cfee37811"
Apr 16 16:48:50.270865 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:50.270514 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62nv6" podUID="7c7749f8-7b64-4062-9bca-90c0826a9692"
Apr 16 16:48:52.270510 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:52.270476 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cqm4l"
Apr 16 16:48:52.270986 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:52.270587 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cqm4l" podUID="0b5fcafd-70a7-4e76-ba7a-022cfee37811"
Apr 16 16:48:52.270986 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:52.270641 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62nv6"
Apr 16 16:48:52.270986 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:52.270737 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62nv6" podUID="7c7749f8-7b64-4062-9bca-90c0826a9692"
Apr 16 16:48:54.270862 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:54.270821 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62nv6"
Apr 16 16:48:54.271326 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:54.270826 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cqm4l"
Apr 16 16:48:54.271326 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:54.270968 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62nv6" podUID="7c7749f8-7b64-4062-9bca-90c0826a9692"
Apr 16 16:48:54.271326 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:54.271045 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cqm4l" podUID="0b5fcafd-70a7-4e76-ba7a-022cfee37811"
Apr 16 16:48:55.860895 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:55.860851 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c7749f8-7b64-4062-9bca-90c0826a9692-metrics-certs\") pod \"network-metrics-daemon-62nv6\" (UID: \"7c7749f8-7b64-4062-9bca-90c0826a9692\") " pod="openshift-multus/network-metrics-daemon-62nv6"
Apr 16 16:48:55.861395 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:55.860993 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:48:55.861395 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:55.861070 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c7749f8-7b64-4062-9bca-90c0826a9692-metrics-certs podName:7c7749f8-7b64-4062-9bca-90c0826a9692 nodeName:}" failed. No retries permitted until 2026-04-16 16:49:11.861049171 +0000 UTC m=+34.144074726 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7c7749f8-7b64-4062-9bca-90c0826a9692-metrics-certs") pod "network-metrics-daemon-62nv6" (UID: "7c7749f8-7b64-4062-9bca-90c0826a9692") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:48:56.061793 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:56.061743 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hsw8x\" (UniqueName: \"kubernetes.io/projected/0b5fcafd-70a7-4e76-ba7a-022cfee37811-kube-api-access-hsw8x\") pod \"network-check-target-cqm4l\" (UID: \"0b5fcafd-70a7-4e76-ba7a-022cfee37811\") " pod="openshift-network-diagnostics/network-check-target-cqm4l"
Apr 16 16:48:56.061966 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:56.061937 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 16:48:56.061966 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:56.061958 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 16:48:56.062067 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:56.061969 2574 projected.go:194] Error preparing data for projected volume kube-api-access-hsw8x for pod openshift-network-diagnostics/network-check-target-cqm4l: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:48:56.062067 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:56.062028 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0b5fcafd-70a7-4e76-ba7a-022cfee37811-kube-api-access-hsw8x podName:0b5fcafd-70a7-4e76-ba7a-022cfee37811 nodeName:}" failed. No retries permitted until 2026-04-16 16:49:12.062009781 +0000 UTC m=+34.345035348 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-hsw8x" (UniqueName: "kubernetes.io/projected/0b5fcafd-70a7-4e76-ba7a-022cfee37811-kube-api-access-hsw8x") pod "network-check-target-cqm4l" (UID: "0b5fcafd-70a7-4e76-ba7a-022cfee37811") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:48:56.270413 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:56.270338 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62nv6"
Apr 16 16:48:56.270413 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:56.270378 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cqm4l"
Apr 16 16:48:56.270623 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:56.270472 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62nv6" podUID="7c7749f8-7b64-4062-9bca-90c0826a9692"
Apr 16 16:48:56.270623 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:56.270609 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cqm4l" podUID="0b5fcafd-70a7-4e76-ba7a-022cfee37811"
Apr 16 16:48:58.271148 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:58.270952 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62nv6"
Apr 16 16:48:58.271738 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:58.271002 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cqm4l"
Apr 16 16:48:58.271738 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:58.271253 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62nv6" podUID="7c7749f8-7b64-4062-9bca-90c0826a9692"
Apr 16 16:48:58.271738 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:48:58.271282 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cqm4l" podUID="0b5fcafd-70a7-4e76-ba7a-022cfee37811"
Apr 16 16:48:58.360645 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:58.360619 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jndw7" event={"ID":"3a0b88db-3420-4832-821b-dc3272b66858","Type":"ContainerStarted","Data":"eb373f0836a6318534688dc102c6393660c7f8cc46e92c2094b8ed7e3ecbc65e"}
Apr 16 16:48:58.362077 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:58.361980 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7dxf7" event={"ID":"c7b5620d-256f-4b38-ac42-4979da7007a4","Type":"ContainerStarted","Data":"ae719246fd9a3f948bc3a1b702648ca2973c9a62eae9b84ac93b6f4cac2e6c96"}
Apr 16 16:48:58.363388 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:58.363365 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-89n6j" event={"ID":"ef20d3fb-2f3a-4af7-97e4-df2e2773f314","Type":"ContainerStarted","Data":"76c20b67dc5b0f1e43a414476b9daff11d62316558666d18fdd8e7218c01721e"}
Apr 16 16:48:58.365987 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:58.365963 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2fwqq_a3db0e84-d73d-4e5b-a7c2-94290b442748/ovn-acl-logging/0.log"
Apr 16 16:48:58.366315 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:58.366297 2574 generic.go:358] "Generic (PLEG): container finished" podID="a3db0e84-d73d-4e5b-a7c2-94290b442748" containerID="7dd82b30d50efd69da661190708464893dbf7315d2227178cb0409d69e078791" exitCode=1
Apr 16 16:48:58.366393 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:58.366357 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" event={"ID":"a3db0e84-d73d-4e5b-a7c2-94290b442748","Type":"ContainerStarted","Data":"1591f41a99a531b4e5c1aa0316310202b975052468ee667657859a2b6cbb7d50"}
Apr 16 16:48:58.366393 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:58.366383 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" event={"ID":"a3db0e84-d73d-4e5b-a7c2-94290b442748","Type":"ContainerStarted","Data":"e1008a3423c43d6636acefb9b31d4294bb93e1d06986847c57139c8a9a3e7665"}
Apr 16 16:48:58.366478 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:58.366396 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" event={"ID":"a3db0e84-d73d-4e5b-a7c2-94290b442748","Type":"ContainerStarted","Data":"d42d02e41f4183df8d2744b71b7bc4dc7ed41d8d54886dbaf90cca7171db444d"}
Apr 16 16:48:58.366478 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:58.366409 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" event={"ID":"a3db0e84-d73d-4e5b-a7c2-94290b442748","Type":"ContainerDied","Data":"7dd82b30d50efd69da661190708464893dbf7315d2227178cb0409d69e078791"}
Apr 16 16:48:58.366478 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:58.366424 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" event={"ID":"a3db0e84-d73d-4e5b-a7c2-94290b442748","Type":"ContainerStarted","Data":"90d6e7d7d4e79d83171c0efe3f2e5e6d75b598dbbb1eca5652c39231ef6b35ef"}
Apr 16 16:48:58.367569 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:58.367539 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bmj4j" event={"ID":"131ec79c-6a30-4f10-b4b5-c529479f0fe0","Type":"ContainerStarted","Data":"d3b48612d3fa4b56e35d148d4b0889a2f2174a85ff45d48beac329be723758c0"}
Apr 16 16:48:58.368708 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:58.368682 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-wqv8s" event={"ID":"6ca8cf77-0136-4fcd-be78-19762c126a37","Type":"ContainerStarted","Data":"d9f871139cf72881368f80fb55ffa988a1d3031f9d26c3dd3ad2db1e3d59946f"}
Apr 16 16:48:58.369886 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:58.369863 2574 generic.go:358] "Generic (PLEG): container finished" podID="fabe57c2-0e2b-42e1-9322-9ea7c5a3f719" containerID="07ec8d3cebd4a06e42a796cf85e3529ae2c7bdb7c6f2180a6e2fab40ee4743c0" exitCode=0
Apr 16 16:48:58.369977 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:58.369916 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-72gts" event={"ID":"fabe57c2-0e2b-42e1-9322-9ea7c5a3f719","Type":"ContainerDied","Data":"07ec8d3cebd4a06e42a796cf85e3529ae2c7bdb7c6f2180a6e2fab40ee4743c0"}
Apr 16 16:48:58.371246 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:58.371225 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sgrsb" event={"ID":"292362c1-e737-4b91-ac69-fab25c8f024f","Type":"ContainerStarted","Data":"adf1e54a24027a2e29ec7e88d7e0393577dfdf2de0a21ab7a1e2eb405a9769b2"}
Apr 16 16:48:58.375458 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:58.375424 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-jndw7" podStartSLOduration=3.461249561 podStartE2EDuration="20.375413405s" podCreationTimestamp="2026-04-16 16:48:38 +0000 UTC" firstStartedPulling="2026-04-16 16:48:40.918243708 +0000 UTC m=+3.201269262" lastFinishedPulling="2026-04-16 16:48:57.832407534 +0000 UTC m=+20.115433106" observedRunningTime="2026-04-16 16:48:58.374993472 +0000 UTC m=+20.658019049" watchObservedRunningTime="2026-04-16 16:48:58.375413405 +0000 UTC m=+20.658438981"
Apr 16 16:48:58.375552 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:58.375487 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-10.ec2.internal" podStartSLOduration=19.375483893 podStartE2EDuration="19.375483893s" podCreationTimestamp="2026-04-16 16:48:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:48:43.353969644 +0000 UTC m=+5.636995222" watchObservedRunningTime="2026-04-16 16:48:58.375483893 +0000 UTC m=+20.658509468"
Apr 16 16:48:58.405671 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:58.405609 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-89n6j" podStartSLOduration=11.566839674 podStartE2EDuration="20.405590891s" podCreationTimestamp="2026-04-16 16:48:38 +0000 UTC" firstStartedPulling="2026-04-16 16:48:40.923181729 +0000 UTC m=+3.206207289" lastFinishedPulling="2026-04-16 16:48:49.761932948 +0000 UTC m=+12.044958506" observedRunningTime="2026-04-16 16:48:58.389402074 +0000 UTC m=+20.672427651" watchObservedRunningTime="2026-04-16 16:48:58.405590891 +0000 UTC m=+20.688616467"
Apr 16 16:48:58.405875 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:58.405845 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-wqv8s" podStartSLOduration=3.458518728 podStartE2EDuration="20.405836616s" podCreationTimestamp="2026-04-16 16:48:38 +0000 UTC" firstStartedPulling="2026-04-16 16:48:40.894554331 +0000 UTC m=+3.177579886" lastFinishedPulling="2026-04-16 16:48:57.841872203 +0000 UTC m=+20.124897774" observedRunningTime="2026-04-16 16:48:58.405583745 +0000 UTC m=+20.688609431" watchObservedRunningTime="2026-04-16 16:48:58.405836616 +0000 UTC m=+20.688862196"
Apr 16 16:48:58.420355 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:58.420301 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-bmj4j" podStartSLOduration=3.506216744 podStartE2EDuration="20.420282295s" podCreationTimestamp="2026-04-16 16:48:38 +0000 UTC" firstStartedPulling="2026-04-16 16:48:40.918331723 +0000 UTC m=+3.201357281" lastFinishedPulling="2026-04-16 16:48:57.832397271 +0000 UTC m=+20.115422832" observedRunningTime="2026-04-16 16:48:58.419990135 +0000 UTC m=+20.703015724" watchObservedRunningTime="2026-04-16 16:48:58.420282295 +0000 UTC m=+20.703307935"
Apr 16 16:48:58.437515 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:58.437470 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-7dxf7" podStartSLOduration=3.449099878 podStartE2EDuration="20.437452917s" podCreationTimestamp="2026-04-16 16:48:38 +0000 UTC" firstStartedPulling="2026-04-16 16:48:40.889951713 +0000 UTC m=+3.172977274" lastFinishedPulling="2026-04-16 16:48:57.878304743 +0000 UTC m=+20.161330313" observedRunningTime="2026-04-16 16:48:58.436996769 +0000 UTC m=+20.720022350" watchObservedRunningTime="2026-04-16 16:48:58.437452917 +0000 UTC m=+20.720478495"
Apr 16 16:48:59.375080 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:59.374849 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-zjk6l" event={"ID":"75e2551e-500a-44ad-90a9-c6ee9b976f48","Type":"ContainerStarted","Data":"fc985a132164fe8c13bc25ce326a935f0107eb5736a8f7aa21e947909929dac3"}
Apr 16 16:48:59.379001 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:59.378934 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2fwqq_a3db0e84-d73d-4e5b-a7c2-94290b442748/ovn-acl-logging/0.log"
Apr 16 16:48:59.379958 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:59.379891 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" event={"ID":"a3db0e84-d73d-4e5b-a7c2-94290b442748","Type":"ContainerStarted","Data":"28f17cf6c1315d57ab9dee9e2278ccbfdfa668b5bc90c30ac5c4f46fc6c21634"}
Apr 16 16:48:59.468265 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:48:59.468240 2574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 16:49:00.066968 ip-10-0-143-10
kubenswrapper[2574]: I0416 16:49:00.066932 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-89n6j" Apr 16 16:49:00.067677 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:00.067636 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-89n6j" Apr 16 16:49:00.083266 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:00.083217 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-zjk6l" podStartSLOduration=5.138656873 podStartE2EDuration="22.083199687s" podCreationTimestamp="2026-04-16 16:48:38 +0000 UTC" firstStartedPulling="2026-04-16 16:48:40.887791668 +0000 UTC m=+3.170817226" lastFinishedPulling="2026-04-16 16:48:57.832334472 +0000 UTC m=+20.115360040" observedRunningTime="2026-04-16 16:48:59.392093036 +0000 UTC m=+21.675118612" watchObservedRunningTime="2026-04-16 16:49:00.083199687 +0000 UTC m=+22.366225265" Apr 16 16:49:00.203411 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:00.203294 2574 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T16:48:59.468258945Z","UUID":"19b7fcb4-16d9-4234-be27-b174fb9afd46","Handler":null,"Name":"","Endpoint":""} Apr 16 16:49:00.205841 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:00.205814 2574 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 16:49:00.205841 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:00.205843 2574 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 16:49:00.270037 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:00.270007 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cqm4l" Apr 16 16:49:00.270205 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:00.270012 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62nv6" Apr 16 16:49:00.270205 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:49:00.270129 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cqm4l" podUID="0b5fcafd-70a7-4e76-ba7a-022cfee37811" Apr 16 16:49:00.270319 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:49:00.270224 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-62nv6" podUID="7c7749f8-7b64-4062-9bca-90c0826a9692" Apr 16 16:49:00.384056 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:00.383953 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sgrsb" event={"ID":"292362c1-e737-4b91-ac69-fab25c8f024f","Type":"ContainerStarted","Data":"41ac84a34b11a8e2dca6924419a7104109b0693fc98c66fdcf3c199632f56a25"} Apr 16 16:49:00.384586 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:00.384382 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-89n6j" Apr 16 16:49:00.384804 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:00.384781 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-89n6j" Apr 16 16:49:01.388491 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:01.388464 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2fwqq_a3db0e84-d73d-4e5b-a7c2-94290b442748/ovn-acl-logging/0.log" Apr 16 16:49:01.389088 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:01.388901 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" event={"ID":"a3db0e84-d73d-4e5b-a7c2-94290b442748","Type":"ContainerStarted","Data":"46e6f988f8c5e9a03b3b68555edc7de7259dcb2143614aed05749216ecb8db9b"} Apr 16 16:49:01.391031 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:01.391001 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sgrsb" event={"ID":"292362c1-e737-4b91-ac69-fab25c8f024f","Type":"ContainerStarted","Data":"54ff0178d3c30ea26f748806bdc3069bfe1a76022c11af0c7a6aa58a0a6d3bab"} Apr 16 16:49:01.411151 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:01.411100 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sgrsb" podStartSLOduration=3.335821094 podStartE2EDuration="23.41108564s" podCreationTimestamp="2026-04-16 16:48:38 +0000 UTC" firstStartedPulling="2026-04-16 16:48:40.889323419 +0000 UTC m=+3.172348973" lastFinishedPulling="2026-04-16 16:49:00.964587957 +0000 UTC m=+23.247613519" observedRunningTime="2026-04-16 16:49:01.410601339 +0000 UTC m=+23.693626916" watchObservedRunningTime="2026-04-16 16:49:01.41108564 +0000 UTC m=+23.694111218" Apr 16 16:49:02.270417 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:02.270382 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62nv6" Apr 16 16:49:02.270604 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:02.270428 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cqm4l" Apr 16 16:49:02.270604 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:49:02.270522 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62nv6" podUID="7c7749f8-7b64-4062-9bca-90c0826a9692" Apr 16 16:49:02.270724 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:49:02.270645 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cqm4l" podUID="0b5fcafd-70a7-4e76-ba7a-022cfee37811" Apr 16 16:49:03.397222 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:03.397063 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2fwqq_a3db0e84-d73d-4e5b-a7c2-94290b442748/ovn-acl-logging/0.log" Apr 16 16:49:03.397760 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:03.397507 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" event={"ID":"a3db0e84-d73d-4e5b-a7c2-94290b442748","Type":"ContainerStarted","Data":"757e7cefec333c88ea307ed2f8abc88dfa3df0f96770ec5fd6a630f7f0b47cc0"} Apr 16 16:49:03.397827 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:03.397794 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" Apr 16 16:49:03.397827 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:03.397818 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" Apr 16 16:49:03.398003 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:03.397985 2574 scope.go:117] "RemoveContainer" containerID="7dd82b30d50efd69da661190708464893dbf7315d2227178cb0409d69e078791" Apr 16 16:49:03.412270 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:03.412251 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" Apr 16 16:49:04.270704 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:04.270615 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62nv6" Apr 16 16:49:04.270704 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:04.270615 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cqm4l" Apr 16 16:49:04.270864 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:49:04.270730 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62nv6" podUID="7c7749f8-7b64-4062-9bca-90c0826a9692" Apr 16 16:49:04.270899 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:49:04.270861 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cqm4l" podUID="0b5fcafd-70a7-4e76-ba7a-022cfee37811" Apr 16 16:49:04.402125 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:04.402100 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2fwqq_a3db0e84-d73d-4e5b-a7c2-94290b442748/ovn-acl-logging/0.log" Apr 16 16:49:04.402506 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:04.402442 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" event={"ID":"a3db0e84-d73d-4e5b-a7c2-94290b442748","Type":"ContainerStarted","Data":"88b425356d8f2a93377db97bfe8ecd9aaf54775b13f4757e6d2cc21ebc4f6e87"} Apr 16 16:49:04.402795 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:04.402771 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" Apr 16 16:49:04.404243 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:04.404220 2574 generic.go:358] "Generic (PLEG): container finished" 
podID="fabe57c2-0e2b-42e1-9322-9ea7c5a3f719" containerID="19a5cd48fe9b9d58f1963d410b0f994f525ce6fa5b20ce5d93cdaecc0c6ee616" exitCode=0 Apr 16 16:49:04.404333 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:04.404267 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-72gts" event={"ID":"fabe57c2-0e2b-42e1-9322-9ea7c5a3f719","Type":"ContainerDied","Data":"19a5cd48fe9b9d58f1963d410b0f994f525ce6fa5b20ce5d93cdaecc0c6ee616"} Apr 16 16:49:04.417398 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:04.417378 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" Apr 16 16:49:04.433533 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:04.433497 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" podStartSLOduration=9.457181332 podStartE2EDuration="26.433484719s" podCreationTimestamp="2026-04-16 16:48:38 +0000 UTC" firstStartedPulling="2026-04-16 16:48:40.923120928 +0000 UTC m=+3.206146500" lastFinishedPulling="2026-04-16 16:48:57.899424318 +0000 UTC m=+20.182449887" observedRunningTime="2026-04-16 16:49:04.431451969 +0000 UTC m=+26.714477546" watchObservedRunningTime="2026-04-16 16:49:04.433484719 +0000 UTC m=+26.716510295" Apr 16 16:49:06.270849 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:06.270766 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cqm4l" Apr 16 16:49:06.271249 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:49:06.270863 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cqm4l" podUID="0b5fcafd-70a7-4e76-ba7a-022cfee37811" Apr 16 16:49:06.271249 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:06.270923 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62nv6" Apr 16 16:49:06.271249 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:49:06.271015 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62nv6" podUID="7c7749f8-7b64-4062-9bca-90c0826a9692" Apr 16 16:49:06.408668 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:06.408618 2574 generic.go:358] "Generic (PLEG): container finished" podID="fabe57c2-0e2b-42e1-9322-9ea7c5a3f719" containerID="a4e7ca1ce9313ffb3e7451e601f9d265ed61235dd6ec714e17633d4aeba1498e" exitCode=0 Apr 16 16:49:06.408853 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:06.408695 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-72gts" event={"ID":"fabe57c2-0e2b-42e1-9322-9ea7c5a3f719","Type":"ContainerDied","Data":"a4e7ca1ce9313ffb3e7451e601f9d265ed61235dd6ec714e17633d4aeba1498e"} Apr 16 16:49:08.273061 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:08.272830 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cqm4l" Apr 16 16:49:08.273482 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:49:08.273177 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cqm4l" podUID="0b5fcafd-70a7-4e76-ba7a-022cfee37811" Apr 16 16:49:08.273482 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:08.273402 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62nv6" Apr 16 16:49:08.273672 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:49:08.273614 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62nv6" podUID="7c7749f8-7b64-4062-9bca-90c0826a9692" Apr 16 16:49:08.413592 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:08.413560 2574 generic.go:358] "Generic (PLEG): container finished" podID="fabe57c2-0e2b-42e1-9322-9ea7c5a3f719" containerID="7b6031675fd808669eb1826ffcef9c302a576ae7f2aaad2d7246ab9c9a38769f" exitCode=0 Apr 16 16:49:08.413767 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:08.413605 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-72gts" event={"ID":"fabe57c2-0e2b-42e1-9322-9ea7c5a3f719","Type":"ContainerDied","Data":"7b6031675fd808669eb1826ffcef9c302a576ae7f2aaad2d7246ab9c9a38769f"} Apr 16 16:49:10.270828 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:10.270785 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-62nv6" Apr 16 16:49:10.271286 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:49:10.270976 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62nv6" podUID="7c7749f8-7b64-4062-9bca-90c0826a9692" Apr 16 16:49:10.271286 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:10.271036 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cqm4l" Apr 16 16:49:10.271286 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:49:10.271109 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cqm4l" podUID="0b5fcafd-70a7-4e76-ba7a-022cfee37811" Apr 16 16:49:10.986094 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:10.986054 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-62nv6"] Apr 16 16:49:10.986264 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:10.986179 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-62nv6" Apr 16 16:49:10.986326 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:49:10.986297 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62nv6" podUID="7c7749f8-7b64-4062-9bca-90c0826a9692" Apr 16 16:49:10.989755 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:10.989687 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-cqm4l"] Apr 16 16:49:10.989894 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:10.989778 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cqm4l" Apr 16 16:49:10.989894 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:49:10.989870 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cqm4l" podUID="0b5fcafd-70a7-4e76-ba7a-022cfee37811" Apr 16 16:49:11.876809 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:11.876774 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c7749f8-7b64-4062-9bca-90c0826a9692-metrics-certs\") pod \"network-metrics-daemon-62nv6\" (UID: \"7c7749f8-7b64-4062-9bca-90c0826a9692\") " pod="openshift-multus/network-metrics-daemon-62nv6" Apr 16 16:49:11.877476 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:49:11.876942 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:49:11.877476 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:49:11.877030 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c7749f8-7b64-4062-9bca-90c0826a9692-metrics-certs podName:7c7749f8-7b64-4062-9bca-90c0826a9692 nodeName:}" failed. No retries permitted until 2026-04-16 16:49:43.877001554 +0000 UTC m=+66.160027124 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7c7749f8-7b64-4062-9bca-90c0826a9692-metrics-certs") pod "network-metrics-daemon-62nv6" (UID: "7c7749f8-7b64-4062-9bca-90c0826a9692") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:49:12.078437 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:12.078399 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hsw8x\" (UniqueName: \"kubernetes.io/projected/0b5fcafd-70a7-4e76-ba7a-022cfee37811-kube-api-access-hsw8x\") pod \"network-check-target-cqm4l\" (UID: \"0b5fcafd-70a7-4e76-ba7a-022cfee37811\") " pod="openshift-network-diagnostics/network-check-target-cqm4l" Apr 16 16:49:12.078614 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:49:12.078554 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:49:12.078614 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:49:12.078569 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:49:12.078614 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:49:12.078581 2574 projected.go:194] Error preparing data for projected volume kube-api-access-hsw8x for pod openshift-network-diagnostics/network-check-target-cqm4l: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:49:12.078792 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:49:12.078644 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0b5fcafd-70a7-4e76-ba7a-022cfee37811-kube-api-access-hsw8x podName:0b5fcafd-70a7-4e76-ba7a-022cfee37811 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:49:44.07862411 +0000 UTC m=+66.361649668 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-hsw8x" (UniqueName: "kubernetes.io/projected/0b5fcafd-70a7-4e76-ba7a-022cfee37811-kube-api-access-hsw8x") pod "network-check-target-cqm4l" (UID: "0b5fcafd-70a7-4e76-ba7a-022cfee37811") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:49:12.270606 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:12.270532 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62nv6" Apr 16 16:49:12.270785 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:49:12.270692 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62nv6" podUID="7c7749f8-7b64-4062-9bca-90c0826a9692" Apr 16 16:49:13.270211 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:13.270181 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cqm4l" Apr 16 16:49:13.270639 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:49:13.270301 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cqm4l" podUID="0b5fcafd-70a7-4e76-ba7a-022cfee37811" Apr 16 16:49:14.270867 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:14.270689 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62nv6" Apr 16 16:49:14.271220 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:49:14.270950 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-62nv6" podUID="7c7749f8-7b64-4062-9bca-90c0826a9692" Apr 16 16:49:15.269797 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:15.269766 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cqm4l" Apr 16 16:49:15.269948 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:49:15.269862 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cqm4l" podUID="0b5fcafd-70a7-4e76-ba7a-022cfee37811"
Apr 16 16:49:15.429250 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:15.429215 2574 generic.go:358] "Generic (PLEG): container finished" podID="fabe57c2-0e2b-42e1-9322-9ea7c5a3f719" containerID="0a4fbfa5ec843f9978511098cb37144815d495409f882934c2247c8b3397de90" exitCode=0
Apr 16 16:49:15.429607 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:15.429278 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-72gts" event={"ID":"fabe57c2-0e2b-42e1-9322-9ea7c5a3f719","Type":"ContainerDied","Data":"0a4fbfa5ec843f9978511098cb37144815d495409f882934c2247c8b3397de90"}
Apr 16 16:49:16.106543 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.106517 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-10.ec2.internal" event="NodeReady"
Apr 16 16:49:16.106727 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.106687 2574 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 16:49:16.161752 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.161721 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-j89fm"]
Apr 16 16:49:16.175176 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.175149 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-jrwq2"]
Apr 16 16:49:16.175354 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.175335 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-j89fm"
Apr 16 16:49:16.179591 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.179567 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 16:49:16.179714 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.179605 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 16:49:16.179779 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.179729 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fwjqf\""
Apr 16 16:49:16.190113 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.190093 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jrwq2"]
Apr 16 16:49:16.190113 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.190115 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-j89fm"]
Apr 16 16:49:16.190261 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.190125 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-t9n8c"]
Apr 16 16:49:16.190261 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.190227 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jrwq2"
Apr 16 16:49:16.193264 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.193210 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 16:49:16.193264 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.193211 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 16:49:16.193264 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.193218 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 16:49:16.193615 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.193591 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-gwwfm\""
Apr 16 16:49:16.206803 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.206784 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-t9n8c"]
Apr 16 16:49:16.206899 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.206889 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-t9n8c"
Apr 16 16:49:16.210010 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.209988 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 16:49:16.210104 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.210020 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-jjdf6\""
Apr 16 16:49:16.210169 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.210119 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 16:49:16.210416 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.210394 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 16:49:16.210523 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.210472 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 16:49:16.269943 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.269919 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62nv6"
Apr 16 16:49:16.272715 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.272695 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-v7sx7\""
Apr 16 16:49:16.272833 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.272755 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 16:49:16.306865 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.306836 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dg6p\" (UniqueName: \"kubernetes.io/projected/6b60b842-0962-4590-bd23-8e0739622ebd-kube-api-access-7dg6p\") pod \"insights-runtime-extractor-t9n8c\" (UID: \"6b60b842-0962-4590-bd23-8e0739622ebd\") " pod="openshift-insights/insights-runtime-extractor-t9n8c"
Apr 16 16:49:16.306958 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.306868 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75a19bee-ffb4-449a-b349-f0422cebce13-config-volume\") pod \"dns-default-j89fm\" (UID: \"75a19bee-ffb4-449a-b349-f0422cebce13\") " pod="openshift-dns/dns-default-j89fm"
Apr 16 16:49:16.306958 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.306902 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/28d3ee6e-fb67-4adb-a55e-7fcafca89c2b-cert\") pod \"ingress-canary-jrwq2\" (UID: \"28d3ee6e-fb67-4adb-a55e-7fcafca89c2b\") " pod="openshift-ingress-canary/ingress-canary-jrwq2"
Apr 16 16:49:16.306958 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.306941 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6b60b842-0962-4590-bd23-8e0739622ebd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-t9n8c\" (UID: \"6b60b842-0962-4590-bd23-8e0739622ebd\") " pod="openshift-insights/insights-runtime-extractor-t9n8c"
Apr 16 16:49:16.307065 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.306959 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r9n6\" (UniqueName: \"kubernetes.io/projected/28d3ee6e-fb67-4adb-a55e-7fcafca89c2b-kube-api-access-9r9n6\") pod \"ingress-canary-jrwq2\" (UID: \"28d3ee6e-fb67-4adb-a55e-7fcafca89c2b\") " pod="openshift-ingress-canary/ingress-canary-jrwq2"
Apr 16 16:49:16.307065 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.306978 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6b60b842-0962-4590-bd23-8e0739622ebd-crio-socket\") pod \"insights-runtime-extractor-t9n8c\" (UID: \"6b60b842-0962-4590-bd23-8e0739622ebd\") " pod="openshift-insights/insights-runtime-extractor-t9n8c"
Apr 16 16:49:16.307065 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.307014 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6b60b842-0962-4590-bd23-8e0739622ebd-data-volume\") pod \"insights-runtime-extractor-t9n8c\" (UID: \"6b60b842-0962-4590-bd23-8e0739622ebd\") " pod="openshift-insights/insights-runtime-extractor-t9n8c"
Apr 16 16:49:16.307065 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.307037 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vh9t\" (UniqueName: \"kubernetes.io/projected/75a19bee-ffb4-449a-b349-f0422cebce13-kube-api-access-7vh9t\") pod \"dns-default-j89fm\" (UID: \"75a19bee-ffb4-449a-b349-f0422cebce13\") " pod="openshift-dns/dns-default-j89fm"
Apr 16 16:49:16.307183 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.307087 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/75a19bee-ffb4-449a-b349-f0422cebce13-metrics-tls\") pod \"dns-default-j89fm\" (UID: \"75a19bee-ffb4-449a-b349-f0422cebce13\") " pod="openshift-dns/dns-default-j89fm"
Apr 16 16:49:16.307183 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.307111 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6b60b842-0962-4590-bd23-8e0739622ebd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-t9n8c\" (UID: \"6b60b842-0962-4590-bd23-8e0739622ebd\") " pod="openshift-insights/insights-runtime-extractor-t9n8c"
Apr 16 16:49:16.307183 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.307132 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/75a19bee-ffb4-449a-b349-f0422cebce13-tmp-dir\") pod \"dns-default-j89fm\" (UID: \"75a19bee-ffb4-449a-b349-f0422cebce13\") " pod="openshift-dns/dns-default-j89fm"
Apr 16 16:49:16.408190 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.408110 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6b60b842-0962-4590-bd23-8e0739622ebd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-t9n8c\" (UID: \"6b60b842-0962-4590-bd23-8e0739622ebd\") " pod="openshift-insights/insights-runtime-extractor-t9n8c"
Apr 16 16:49:16.408190 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.408159 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/75a19bee-ffb4-449a-b349-f0422cebce13-tmp-dir\") pod \"dns-default-j89fm\" (UID: \"75a19bee-ffb4-449a-b349-f0422cebce13\") " pod="openshift-dns/dns-default-j89fm"
Apr 16 16:49:16.408190 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.408190 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7dg6p\" (UniqueName: \"kubernetes.io/projected/6b60b842-0962-4590-bd23-8e0739622ebd-kube-api-access-7dg6p\") pod \"insights-runtime-extractor-t9n8c\" (UID: \"6b60b842-0962-4590-bd23-8e0739622ebd\") " pod="openshift-insights/insights-runtime-extractor-t9n8c"
Apr 16 16:49:16.408392 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.408211 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75a19bee-ffb4-449a-b349-f0422cebce13-config-volume\") pod \"dns-default-j89fm\" (UID: \"75a19bee-ffb4-449a-b349-f0422cebce13\") " pod="openshift-dns/dns-default-j89fm"
Apr 16 16:49:16.408392 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.408247 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/28d3ee6e-fb67-4adb-a55e-7fcafca89c2b-cert\") pod \"ingress-canary-jrwq2\" (UID: \"28d3ee6e-fb67-4adb-a55e-7fcafca89c2b\") " pod="openshift-ingress-canary/ingress-canary-jrwq2"
Apr 16 16:49:16.408392 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.408362 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6b60b842-0962-4590-bd23-8e0739622ebd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-t9n8c\" (UID: \"6b60b842-0962-4590-bd23-8e0739622ebd\") " pod="openshift-insights/insights-runtime-extractor-t9n8c"
Apr 16 16:49:16.408527 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.408407 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9r9n6\" (UniqueName: \"kubernetes.io/projected/28d3ee6e-fb67-4adb-a55e-7fcafca89c2b-kube-api-access-9r9n6\") pod \"ingress-canary-jrwq2\" (UID: \"28d3ee6e-fb67-4adb-a55e-7fcafca89c2b\") " pod="openshift-ingress-canary/ingress-canary-jrwq2"
Apr 16 16:49:16.408527 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.408441 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6b60b842-0962-4590-bd23-8e0739622ebd-crio-socket\") pod \"insights-runtime-extractor-t9n8c\" (UID: \"6b60b842-0962-4590-bd23-8e0739622ebd\") " pod="openshift-insights/insights-runtime-extractor-t9n8c"
Apr 16 16:49:16.408527 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.408467 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6b60b842-0962-4590-bd23-8e0739622ebd-data-volume\") pod \"insights-runtime-extractor-t9n8c\" (UID: \"6b60b842-0962-4590-bd23-8e0739622ebd\") " pod="openshift-insights/insights-runtime-extractor-t9n8c"
Apr 16 16:49:16.408527 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.408496 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7vh9t\" (UniqueName: \"kubernetes.io/projected/75a19bee-ffb4-449a-b349-f0422cebce13-kube-api-access-7vh9t\") pod \"dns-default-j89fm\" (UID: \"75a19bee-ffb4-449a-b349-f0422cebce13\") " pod="openshift-dns/dns-default-j89fm"
Apr 16 16:49:16.408860 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.408568 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/75a19bee-ffb4-449a-b349-f0422cebce13-metrics-tls\") pod \"dns-default-j89fm\" (UID: \"75a19bee-ffb4-449a-b349-f0422cebce13\") " pod="openshift-dns/dns-default-j89fm"
Apr 16 16:49:16.408860 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.408611 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/75a19bee-ffb4-449a-b349-f0422cebce13-tmp-dir\") pod \"dns-default-j89fm\" (UID: \"75a19bee-ffb4-449a-b349-f0422cebce13\") " pod="openshift-dns/dns-default-j89fm"
Apr 16 16:49:16.408860 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.408780 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6b60b842-0962-4590-bd23-8e0739622ebd-data-volume\") pod \"insights-runtime-extractor-t9n8c\" (UID: \"6b60b842-0962-4590-bd23-8e0739622ebd\") " pod="openshift-insights/insights-runtime-extractor-t9n8c"
Apr 16 16:49:16.408860 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.408811 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6b60b842-0962-4590-bd23-8e0739622ebd-crio-socket\") pod \"insights-runtime-extractor-t9n8c\" (UID: \"6b60b842-0962-4590-bd23-8e0739622ebd\") " pod="openshift-insights/insights-runtime-extractor-t9n8c"
Apr 16 16:49:16.408860 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.408846 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6b60b842-0962-4590-bd23-8e0739622ebd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-t9n8c\" (UID: \"6b60b842-0962-4590-bd23-8e0739622ebd\") " pod="openshift-insights/insights-runtime-extractor-t9n8c"
Apr 16 16:49:16.409054 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.408846 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75a19bee-ffb4-449a-b349-f0422cebce13-config-volume\") pod \"dns-default-j89fm\" (UID: \"75a19bee-ffb4-449a-b349-f0422cebce13\") " pod="openshift-dns/dns-default-j89fm"
Apr 16 16:49:16.412087 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.412067 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6b60b842-0962-4590-bd23-8e0739622ebd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-t9n8c\" (UID: \"6b60b842-0962-4590-bd23-8e0739622ebd\") " pod="openshift-insights/insights-runtime-extractor-t9n8c"
Apr 16 16:49:16.412225 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.412208 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/28d3ee6e-fb67-4adb-a55e-7fcafca89c2b-cert\") pod \"ingress-canary-jrwq2\" (UID: \"28d3ee6e-fb67-4adb-a55e-7fcafca89c2b\") " pod="openshift-ingress-canary/ingress-canary-jrwq2"
Apr 16 16:49:16.412283 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.412256 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/75a19bee-ffb4-449a-b349-f0422cebce13-metrics-tls\") pod \"dns-default-j89fm\" (UID: \"75a19bee-ffb4-449a-b349-f0422cebce13\") " pod="openshift-dns/dns-default-j89fm"
Apr 16 16:49:16.417434 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.417414 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dg6p\" (UniqueName: \"kubernetes.io/projected/6b60b842-0962-4590-bd23-8e0739622ebd-kube-api-access-7dg6p\") pod \"insights-runtime-extractor-t9n8c\" (UID: \"6b60b842-0962-4590-bd23-8e0739622ebd\") " pod="openshift-insights/insights-runtime-extractor-t9n8c"
Apr 16 16:49:16.418743 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.418721 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vh9t\" (UniqueName: \"kubernetes.io/projected/75a19bee-ffb4-449a-b349-f0422cebce13-kube-api-access-7vh9t\") pod \"dns-default-j89fm\" (UID: \"75a19bee-ffb4-449a-b349-f0422cebce13\") " pod="openshift-dns/dns-default-j89fm"
Apr 16 16:49:16.419422 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.419407 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r9n6\" (UniqueName: \"kubernetes.io/projected/28d3ee6e-fb67-4adb-a55e-7fcafca89c2b-kube-api-access-9r9n6\") pod \"ingress-canary-jrwq2\" (UID: \"28d3ee6e-fb67-4adb-a55e-7fcafca89c2b\") " pod="openshift-ingress-canary/ingress-canary-jrwq2"
Apr 16 16:49:16.433926 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.433906 2574 generic.go:358] "Generic (PLEG): container finished" podID="fabe57c2-0e2b-42e1-9322-9ea7c5a3f719" containerID="c3f194a56650e9da622cd08709495fb46dd9762a5dff98eb65665e4e095be5b5" exitCode=0
Apr 16 16:49:16.434199 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.433955 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-72gts" event={"ID":"fabe57c2-0e2b-42e1-9322-9ea7c5a3f719","Type":"ContainerDied","Data":"c3f194a56650e9da622cd08709495fb46dd9762a5dff98eb65665e4e095be5b5"}
Apr 16 16:49:16.485684 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.485642 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-j89fm"
Apr 16 16:49:16.499390 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.499264 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jrwq2"
Apr 16 16:49:16.515730 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.515706 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-t9n8c"
Apr 16 16:49:16.645557 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.645501 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jrwq2"]
Apr 16 16:49:16.646147 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:49:16.646121 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28d3ee6e_fb67_4adb_a55e_7fcafca89c2b.slice/crio-dc76259ac03a15350fe9b17541fabcd54f524919f64c5510eb57bc1b9a3f8e70 WatchSource:0}: Error finding container dc76259ac03a15350fe9b17541fabcd54f524919f64c5510eb57bc1b9a3f8e70: Status 404 returned error can't find the container with id dc76259ac03a15350fe9b17541fabcd54f524919f64c5510eb57bc1b9a3f8e70
Apr 16 16:49:16.654324 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.654300 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-j89fm"]
Apr 16 16:49:16.658032 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:49:16.658006 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75a19bee_ffb4_449a_b349_f0422cebce13.slice/crio-53939f71999940538ecc2243c437f393f604ce420e0555d2991ad41813e5f231 WatchSource:0}: Error finding container 53939f71999940538ecc2243c437f393f604ce420e0555d2991ad41813e5f231: Status 404 returned error can't find the container with id 53939f71999940538ecc2243c437f393f604ce420e0555d2991ad41813e5f231
Apr 16 16:49:16.669638 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:16.669616 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-t9n8c"]
Apr 16 16:49:16.671934 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:49:16.671911 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b60b842_0962_4590_bd23_8e0739622ebd.slice/crio-78ba6e234e4569577d2a1a35119d10ee801f4a1f8c059688120ec3d9aba01ff9 WatchSource:0}: Error finding container 78ba6e234e4569577d2a1a35119d10ee801f4a1f8c059688120ec3d9aba01ff9: Status 404 returned error can't find the container with id 78ba6e234e4569577d2a1a35119d10ee801f4a1f8c059688120ec3d9aba01ff9
Apr 16 16:49:17.270451 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:17.270341 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cqm4l"
Apr 16 16:49:17.274158 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:17.273960 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-mkc4m\""
Apr 16 16:49:17.274158 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:17.274056 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 16:49:17.274347 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:17.274191 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 16:49:17.440313 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:17.440253 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-72gts" event={"ID":"fabe57c2-0e2b-42e1-9322-9ea7c5a3f719","Type":"ContainerStarted","Data":"4b0290ac00c640960d9fad8cee101381c884abcaca4b67deb68fc567caaf63a0"}
Apr 16 16:49:17.442523 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:17.442411 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-t9n8c" event={"ID":"6b60b842-0962-4590-bd23-8e0739622ebd","Type":"ContainerStarted","Data":"6a83089e8a9657c198f2fa91107ee0a8f99d2099cdf42ca5794954595581b442"}
Apr 16 16:49:17.442523 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:17.442482 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-t9n8c" event={"ID":"6b60b842-0962-4590-bd23-8e0739622ebd","Type":"ContainerStarted","Data":"3e8c605f8584254015795bc75623a65befe05c33f62a161fe894b8be83282d7c"}
Apr 16 16:49:17.442523 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:17.442503 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-t9n8c" event={"ID":"6b60b842-0962-4590-bd23-8e0739622ebd","Type":"ContainerStarted","Data":"78ba6e234e4569577d2a1a35119d10ee801f4a1f8c059688120ec3d9aba01ff9"}
Apr 16 16:49:17.443959 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:17.443933 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jrwq2" event={"ID":"28d3ee6e-fb67-4adb-a55e-7fcafca89c2b","Type":"ContainerStarted","Data":"dc76259ac03a15350fe9b17541fabcd54f524919f64c5510eb57bc1b9a3f8e70"}
Apr 16 16:49:17.445100 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:17.445076 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-j89fm" event={"ID":"75a19bee-ffb4-449a-b349-f0422cebce13","Type":"ContainerStarted","Data":"53939f71999940538ecc2243c437f393f604ce420e0555d2991ad41813e5f231"}
Apr 16 16:49:17.463529 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:17.463091 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-72gts" podStartSLOduration=6.050959499 podStartE2EDuration="39.463076972s" podCreationTimestamp="2026-04-16 16:48:38 +0000 UTC" firstStartedPulling="2026-04-16 16:48:40.918255446 +0000 UTC m=+3.201281002" lastFinishedPulling="2026-04-16 16:49:14.330372921 +0000 UTC m=+36.613398475" observedRunningTime="2026-04-16 16:49:17.46167308 +0000 UTC m=+39.744698650" watchObservedRunningTime="2026-04-16 16:49:17.463076972 +0000 UTC m=+39.746102552"
Apr 16 16:49:17.958285 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:17.958248 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6b7857647d-xnzkf"]
Apr 16 16:49:17.961191 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:17.961162 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b7857647d-xnzkf"
Apr 16 16:49:17.964293 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:17.964257 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 16 16:49:17.965809 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:17.965650 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 16 16:49:17.965809 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:17.965679 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 16 16:49:17.965809 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:17.965725 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 16 16:49:17.966026 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:17.966001 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 16 16:49:17.966235 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:17.966124 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 16 16:49:17.966235 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:17.966160 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 16 16:49:17.966235 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:17.966181 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-sdzgj\""
Apr 16 16:49:17.972969 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:17.972945 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b7857647d-xnzkf"]
Apr 16 16:49:17.973073 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:17.973064 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 16 16:49:18.121184 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:18.121153 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ecad2307-e422-4189-8656-d67ce0f7448b-oauth-serving-cert\") pod \"console-6b7857647d-xnzkf\" (UID: \"ecad2307-e422-4189-8656-d67ce0f7448b\") " pod="openshift-console/console-6b7857647d-xnzkf"
Apr 16 16:49:18.121353 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:18.121197 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ecad2307-e422-4189-8656-d67ce0f7448b-service-ca\") pod \"console-6b7857647d-xnzkf\" (UID: \"ecad2307-e422-4189-8656-d67ce0f7448b\") " pod="openshift-console/console-6b7857647d-xnzkf"
Apr 16 16:49:18.121353 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:18.121278 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prnsz\" (UniqueName: \"kubernetes.io/projected/ecad2307-e422-4189-8656-d67ce0f7448b-kube-api-access-prnsz\") pod \"console-6b7857647d-xnzkf\" (UID: \"ecad2307-e422-4189-8656-d67ce0f7448b\") " pod="openshift-console/console-6b7857647d-xnzkf"
Apr 16 16:49:18.121353 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:18.121322 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ecad2307-e422-4189-8656-d67ce0f7448b-console-config\") pod \"console-6b7857647d-xnzkf\" (UID: \"ecad2307-e422-4189-8656-d67ce0f7448b\") " pod="openshift-console/console-6b7857647d-xnzkf"
Apr 16 16:49:18.121353 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:18.121349 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecad2307-e422-4189-8656-d67ce0f7448b-trusted-ca-bundle\") pod \"console-6b7857647d-xnzkf\" (UID: \"ecad2307-e422-4189-8656-d67ce0f7448b\") " pod="openshift-console/console-6b7857647d-xnzkf"
Apr 16 16:49:18.121572 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:18.121464 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ecad2307-e422-4189-8656-d67ce0f7448b-console-serving-cert\") pod \"console-6b7857647d-xnzkf\" (UID: \"ecad2307-e422-4189-8656-d67ce0f7448b\") " pod="openshift-console/console-6b7857647d-xnzkf"
Apr 16 16:49:18.121572 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:18.121497 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ecad2307-e422-4189-8656-d67ce0f7448b-console-oauth-config\") pod \"console-6b7857647d-xnzkf\" (UID: \"ecad2307-e422-4189-8656-d67ce0f7448b\") " pod="openshift-console/console-6b7857647d-xnzkf"
Apr 16 16:49:18.222142 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:18.222054 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ecad2307-e422-4189-8656-d67ce0f7448b-oauth-serving-cert\") pod \"console-6b7857647d-xnzkf\" (UID: \"ecad2307-e422-4189-8656-d67ce0f7448b\") " pod="openshift-console/console-6b7857647d-xnzkf"
Apr 16 16:49:18.222142 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:18.222088 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ecad2307-e422-4189-8656-d67ce0f7448b-service-ca\") pod \"console-6b7857647d-xnzkf\" (UID: \"ecad2307-e422-4189-8656-d67ce0f7448b\") " pod="openshift-console/console-6b7857647d-xnzkf"
Apr 16 16:49:18.222142 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:18.222123 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-prnsz\" (UniqueName: \"kubernetes.io/projected/ecad2307-e422-4189-8656-d67ce0f7448b-kube-api-access-prnsz\") pod \"console-6b7857647d-xnzkf\" (UID: \"ecad2307-e422-4189-8656-d67ce0f7448b\") " pod="openshift-console/console-6b7857647d-xnzkf"
Apr 16 16:49:18.222406 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:18.222151 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ecad2307-e422-4189-8656-d67ce0f7448b-console-config\") pod \"console-6b7857647d-xnzkf\" (UID: \"ecad2307-e422-4189-8656-d67ce0f7448b\") " pod="openshift-console/console-6b7857647d-xnzkf"
Apr 16 16:49:18.222406 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:18.222179 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecad2307-e422-4189-8656-d67ce0f7448b-trusted-ca-bundle\") pod \"console-6b7857647d-xnzkf\" (UID: \"ecad2307-e422-4189-8656-d67ce0f7448b\") " pod="openshift-console/console-6b7857647d-xnzkf"
Apr 16 16:49:18.222406 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:18.222264 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ecad2307-e422-4189-8656-d67ce0f7448b-console-serving-cert\") pod \"console-6b7857647d-xnzkf\" (UID: \"ecad2307-e422-4189-8656-d67ce0f7448b\") " pod="openshift-console/console-6b7857647d-xnzkf"
Apr 16 16:49:18.222406 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:18.222287 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ecad2307-e422-4189-8656-d67ce0f7448b-console-oauth-config\") pod \"console-6b7857647d-xnzkf\" (UID: \"ecad2307-e422-4189-8656-d67ce0f7448b\") " pod="openshift-console/console-6b7857647d-xnzkf"
Apr 16 16:49:18.222945 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:18.222917 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ecad2307-e422-4189-8656-d67ce0f7448b-oauth-serving-cert\") pod \"console-6b7857647d-xnzkf\" (UID: \"ecad2307-e422-4189-8656-d67ce0f7448b\") " pod="openshift-console/console-6b7857647d-xnzkf"
Apr 16 16:49:18.223642 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:18.223263 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ecad2307-e422-4189-8656-d67ce0f7448b-console-config\") pod \"console-6b7857647d-xnzkf\" (UID: \"ecad2307-e422-4189-8656-d67ce0f7448b\") " pod="openshift-console/console-6b7857647d-xnzkf"
Apr 16 16:49:18.223642 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:18.223486 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ecad2307-e422-4189-8656-d67ce0f7448b-service-ca\") pod \"console-6b7857647d-xnzkf\" (UID: \"ecad2307-e422-4189-8656-d67ce0f7448b\") " pod="openshift-console/console-6b7857647d-xnzkf"
Apr 16 16:49:18.224047 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:18.224002 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecad2307-e422-4189-8656-d67ce0f7448b-trusted-ca-bundle\") pod \"console-6b7857647d-xnzkf\" (UID: \"ecad2307-e422-4189-8656-d67ce0f7448b\") " pod="openshift-console/console-6b7857647d-xnzkf"
Apr 16 16:49:18.225166 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:18.225129 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ecad2307-e422-4189-8656-d67ce0f7448b-console-oauth-config\") pod \"console-6b7857647d-xnzkf\" (UID: \"ecad2307-e422-4189-8656-d67ce0f7448b\") " pod="openshift-console/console-6b7857647d-xnzkf"
Apr 16 16:49:18.226070 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:18.226051 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ecad2307-e422-4189-8656-d67ce0f7448b-console-serving-cert\") pod \"console-6b7857647d-xnzkf\" (UID: \"ecad2307-e422-4189-8656-d67ce0f7448b\") " pod="openshift-console/console-6b7857647d-xnzkf"
Apr 16 16:49:18.231176 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:18.231155 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-prnsz\" (UniqueName: \"kubernetes.io/projected/ecad2307-e422-4189-8656-d67ce0f7448b-kube-api-access-prnsz\") pod \"console-6b7857647d-xnzkf\" (UID: \"ecad2307-e422-4189-8656-d67ce0f7448b\") " pod="openshift-console/console-6b7857647d-xnzkf"
Apr 16 16:49:18.280582 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:18.280548 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b7857647d-xnzkf"
Apr 16 16:49:19.049150 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:19.048996 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b7857647d-xnzkf"]
Apr 16 16:49:19.052361 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:49:19.052338 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecad2307_e422_4189_8656_d67ce0f7448b.slice/crio-109b91111ab712631d423c54e483e39c4bd5748b69e8b48706eacae7d962bf3c WatchSource:0}: Error finding container 109b91111ab712631d423c54e483e39c4bd5748b69e8b48706eacae7d962bf3c: Status 404 returned error can't find the container with id 109b91111ab712631d423c54e483e39c4bd5748b69e8b48706eacae7d962bf3c
Apr 16 16:49:19.454370 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:19.454340 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-j89fm" event={"ID":"75a19bee-ffb4-449a-b349-f0422cebce13","Type":"ContainerStarted","Data":"56be24630760fd3b910d5186100fe80970fd1cf878495ead5756d24495eaa95b"}
Apr 16 16:49:19.454370 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:19.454372 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-j89fm" event={"ID":"75a19bee-ffb4-449a-b349-f0422cebce13","Type":"ContainerStarted","Data":"22392a224b036e1c79b7f4535e4112668e35ca36b3ffd222e88b3bc597143513"}
Apr 16 16:49:19.454572 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:19.454472 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-j89fm"
Apr 16 16:49:19.455464 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:19.455440 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b7857647d-xnzkf"
event={"ID":"ecad2307-e422-4189-8656-d67ce0f7448b","Type":"ContainerStarted","Data":"109b91111ab712631d423c54e483e39c4bd5748b69e8b48706eacae7d962bf3c"} Apr 16 16:49:19.457087 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:19.457067 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-t9n8c" event={"ID":"6b60b842-0962-4590-bd23-8e0739622ebd","Type":"ContainerStarted","Data":"a52b17950f164527f339d9de294ef2520eee9db36cbf9f57e556a51f423439ed"} Apr 16 16:49:19.458186 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:19.458166 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jrwq2" event={"ID":"28d3ee6e-fb67-4adb-a55e-7fcafca89c2b","Type":"ContainerStarted","Data":"a7a04373bb82cbd08e090506957d27c0175078f02197326a9a10d7f38874a010"} Apr 16 16:49:19.474437 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:19.474400 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-j89fm" podStartSLOduration=1.21761641 podStartE2EDuration="3.474390228s" podCreationTimestamp="2026-04-16 16:49:16 +0000 UTC" firstStartedPulling="2026-04-16 16:49:16.659639926 +0000 UTC m=+38.942665481" lastFinishedPulling="2026-04-16 16:49:18.916413744 +0000 UTC m=+41.199439299" observedRunningTime="2026-04-16 16:49:19.474262356 +0000 UTC m=+41.757287934" watchObservedRunningTime="2026-04-16 16:49:19.474390228 +0000 UTC m=+41.757415806" Apr 16 16:49:19.493191 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:19.493152 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-t9n8c" podStartSLOduration=0.967032526 podStartE2EDuration="3.493140923s" podCreationTimestamp="2026-04-16 16:49:16 +0000 UTC" firstStartedPulling="2026-04-16 16:49:16.73972139 +0000 UTC m=+39.022746946" lastFinishedPulling="2026-04-16 16:49:19.265829788 +0000 UTC m=+41.548855343" observedRunningTime="2026-04-16 
16:49:19.492316142 +0000 UTC m=+41.775341718" watchObservedRunningTime="2026-04-16 16:49:19.493140923 +0000 UTC m=+41.776166501" Apr 16 16:49:19.508011 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:19.507972 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-jrwq2" podStartSLOduration=1.233834819 podStartE2EDuration="3.507962073s" podCreationTimestamp="2026-04-16 16:49:16 +0000 UTC" firstStartedPulling="2026-04-16 16:49:16.647714937 +0000 UTC m=+38.930740491" lastFinishedPulling="2026-04-16 16:49:18.921842177 +0000 UTC m=+41.204867745" observedRunningTime="2026-04-16 16:49:19.507610562 +0000 UTC m=+41.790636136" watchObservedRunningTime="2026-04-16 16:49:19.507962073 +0000 UTC m=+41.790987649" Apr 16 16:49:20.010567 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.010532 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-z9p6c"] Apr 16 16:49:20.026619 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.026590 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-z9p6c"] Apr 16 16:49:20.026787 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.026735 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-z9p6c" Apr 16 16:49:20.031529 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.031495 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 16 16:49:20.031751 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.031550 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-rkvq5\"" Apr 16 16:49:20.031751 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.031720 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-4db9b"] Apr 16 16:49:20.031895 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.031868 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 16:49:20.032096 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.032073 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 16 16:49:20.032353 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.032334 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 16:49:20.033214 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.033024 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 16:49:20.046927 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.046765 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-7zrbv"] Apr 16 16:49:20.047883 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.047290 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-4db9b" Apr 16 16:49:20.050285 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.050230 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 16 16:49:20.050285 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.050230 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-zpc6w\"" Apr 16 16:49:20.051037 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.050298 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 16 16:49:20.051037 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.050897 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 16 16:49:20.065485 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.065467 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-4db9b"] Apr 16 16:49:20.065602 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.065589 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-7zrbv" Apr 16 16:49:20.068420 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.068401 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 16:49:20.068420 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.068416 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-lxqh8\"" Apr 16 16:49:20.068566 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.068455 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 16:49:20.068734 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.068717 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 16:49:20.136580 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.136516 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvzd5\" (UniqueName: \"kubernetes.io/projected/fcc69899-6610-4ce7-803b-eaaaaa23fab8-kube-api-access-dvzd5\") pod \"openshift-state-metrics-5669946b84-z9p6c\" (UID: \"fcc69899-6610-4ce7-803b-eaaaaa23fab8\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-z9p6c" Apr 16 16:49:20.136767 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.136599 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/6484e384-9c84-470e-869d-55dd04279464-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-4db9b\" (UID: \"6484e384-9c84-470e-869d-55dd04279464\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-4db9b" Apr 16 16:49:20.136767 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.136641 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6db3b019-e54c-4966-83dd-cc7619e3f221-sys\") pod \"node-exporter-7zrbv\" (UID: \"6db3b019-e54c-4966-83dd-cc7619e3f221\") " pod="openshift-monitoring/node-exporter-7zrbv" Apr 16 16:49:20.136767 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.136681 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6db3b019-e54c-4966-83dd-cc7619e3f221-node-exporter-wtmp\") pod \"node-exporter-7zrbv\" (UID: \"6db3b019-e54c-4966-83dd-cc7619e3f221\") " pod="openshift-monitoring/node-exporter-7zrbv" Apr 16 16:49:20.136767 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.136707 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6484e384-9c84-470e-869d-55dd04279464-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-4db9b\" (UID: \"6484e384-9c84-470e-869d-55dd04279464\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-4db9b" Apr 16 16:49:20.136767 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.136740 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fcc69899-6610-4ce7-803b-eaaaaa23fab8-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-z9p6c\" (UID: \"fcc69899-6610-4ce7-803b-eaaaaa23fab8\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-z9p6c" Apr 16 16:49:20.136767 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.136765 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwjgk\" (UniqueName: \"kubernetes.io/projected/6484e384-9c84-470e-869d-55dd04279464-kube-api-access-lwjgk\") pod 
\"kube-state-metrics-7479c89684-4db9b\" (UID: \"6484e384-9c84-470e-869d-55dd04279464\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-4db9b" Apr 16 16:49:20.137037 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.136818 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6db3b019-e54c-4966-83dd-cc7619e3f221-node-exporter-accelerators-collector-config\") pod \"node-exporter-7zrbv\" (UID: \"6db3b019-e54c-4966-83dd-cc7619e3f221\") " pod="openshift-monitoring/node-exporter-7zrbv" Apr 16 16:49:20.137037 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.136857 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6db3b019-e54c-4966-83dd-cc7619e3f221-metrics-client-ca\") pod \"node-exporter-7zrbv\" (UID: \"6db3b019-e54c-4966-83dd-cc7619e3f221\") " pod="openshift-monitoring/node-exporter-7zrbv" Apr 16 16:49:20.137037 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.136920 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6db3b019-e54c-4966-83dd-cc7619e3f221-node-exporter-textfile\") pod \"node-exporter-7zrbv\" (UID: \"6db3b019-e54c-4966-83dd-cc7619e3f221\") " pod="openshift-monitoring/node-exporter-7zrbv" Apr 16 16:49:20.137037 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.136955 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6db3b019-e54c-4966-83dd-cc7619e3f221-node-exporter-tls\") pod \"node-exporter-7zrbv\" (UID: \"6db3b019-e54c-4966-83dd-cc7619e3f221\") " pod="openshift-monitoring/node-exporter-7zrbv" Apr 16 16:49:20.137037 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.136990 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6db3b019-e54c-4966-83dd-cc7619e3f221-root\") pod \"node-exporter-7zrbv\" (UID: \"6db3b019-e54c-4966-83dd-cc7619e3f221\") " pod="openshift-monitoring/node-exporter-7zrbv" Apr 16 16:49:20.137037 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.137019 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fcc69899-6610-4ce7-803b-eaaaaa23fab8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-z9p6c\" (UID: \"fcc69899-6610-4ce7-803b-eaaaaa23fab8\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-z9p6c" Apr 16 16:49:20.137290 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.137054 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fcc69899-6610-4ce7-803b-eaaaaa23fab8-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-z9p6c\" (UID: \"fcc69899-6610-4ce7-803b-eaaaaa23fab8\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-z9p6c" Apr 16 16:49:20.137290 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.137086 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/6484e384-9c84-470e-869d-55dd04279464-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-4db9b\" (UID: \"6484e384-9c84-470e-869d-55dd04279464\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-4db9b" Apr 16 16:49:20.137290 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.137151 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6db3b019-e54c-4966-83dd-cc7619e3f221-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7zrbv\" (UID: \"6db3b019-e54c-4966-83dd-cc7619e3f221\") " pod="openshift-monitoring/node-exporter-7zrbv" Apr 16 16:49:20.137290 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.137185 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/6484e384-9c84-470e-869d-55dd04279464-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-4db9b\" (UID: \"6484e384-9c84-470e-869d-55dd04279464\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-4db9b" Apr 16 16:49:20.137290 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.137247 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6484e384-9c84-470e-869d-55dd04279464-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-4db9b\" (UID: \"6484e384-9c84-470e-869d-55dd04279464\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-4db9b" Apr 16 16:49:20.137445 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.137298 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkc55\" (UniqueName: \"kubernetes.io/projected/6db3b019-e54c-4966-83dd-cc7619e3f221-kube-api-access-nkc55\") pod \"node-exporter-7zrbv\" (UID: \"6db3b019-e54c-4966-83dd-cc7619e3f221\") " pod="openshift-monitoring/node-exporter-7zrbv" Apr 16 16:49:20.238506 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.238473 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/6db3b019-e54c-4966-83dd-cc7619e3f221-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7zrbv\" (UID: \"6db3b019-e54c-4966-83dd-cc7619e3f221\") " pod="openshift-monitoring/node-exporter-7zrbv" Apr 16 16:49:20.238746 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.238518 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/6484e384-9c84-470e-869d-55dd04279464-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-4db9b\" (UID: \"6484e384-9c84-470e-869d-55dd04279464\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-4db9b" Apr 16 16:49:20.238746 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.238570 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6484e384-9c84-470e-869d-55dd04279464-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-4db9b\" (UID: \"6484e384-9c84-470e-869d-55dd04279464\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-4db9b" Apr 16 16:49:20.238746 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.238611 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nkc55\" (UniqueName: \"kubernetes.io/projected/6db3b019-e54c-4966-83dd-cc7619e3f221-kube-api-access-nkc55\") pod \"node-exporter-7zrbv\" (UID: \"6db3b019-e54c-4966-83dd-cc7619e3f221\") " pod="openshift-monitoring/node-exporter-7zrbv" Apr 16 16:49:20.238746 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.238645 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dvzd5\" (UniqueName: \"kubernetes.io/projected/fcc69899-6610-4ce7-803b-eaaaaa23fab8-kube-api-access-dvzd5\") pod \"openshift-state-metrics-5669946b84-z9p6c\" (UID: \"fcc69899-6610-4ce7-803b-eaaaaa23fab8\") " 
pod="openshift-monitoring/openshift-state-metrics-5669946b84-z9p6c" Apr 16 16:49:20.238746 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.238694 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/6484e384-9c84-470e-869d-55dd04279464-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-4db9b\" (UID: \"6484e384-9c84-470e-869d-55dd04279464\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-4db9b" Apr 16 16:49:20.238746 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.238736 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6db3b019-e54c-4966-83dd-cc7619e3f221-sys\") pod \"node-exporter-7zrbv\" (UID: \"6db3b019-e54c-4966-83dd-cc7619e3f221\") " pod="openshift-monitoring/node-exporter-7zrbv" Apr 16 16:49:20.240751 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.238760 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6db3b019-e54c-4966-83dd-cc7619e3f221-node-exporter-wtmp\") pod \"node-exporter-7zrbv\" (UID: \"6db3b019-e54c-4966-83dd-cc7619e3f221\") " pod="openshift-monitoring/node-exporter-7zrbv" Apr 16 16:49:20.240751 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.238785 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6484e384-9c84-470e-869d-55dd04279464-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-4db9b\" (UID: \"6484e384-9c84-470e-869d-55dd04279464\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-4db9b" Apr 16 16:49:20.240751 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.238820 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/fcc69899-6610-4ce7-803b-eaaaaa23fab8-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-z9p6c\" (UID: \"fcc69899-6610-4ce7-803b-eaaaaa23fab8\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-z9p6c" Apr 16 16:49:20.240751 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.238854 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lwjgk\" (UniqueName: \"kubernetes.io/projected/6484e384-9c84-470e-869d-55dd04279464-kube-api-access-lwjgk\") pod \"kube-state-metrics-7479c89684-4db9b\" (UID: \"6484e384-9c84-470e-869d-55dd04279464\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-4db9b" Apr 16 16:49:20.240751 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.238891 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6db3b019-e54c-4966-83dd-cc7619e3f221-node-exporter-accelerators-collector-config\") pod \"node-exporter-7zrbv\" (UID: \"6db3b019-e54c-4966-83dd-cc7619e3f221\") " pod="openshift-monitoring/node-exporter-7zrbv" Apr 16 16:49:20.240751 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.238915 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6db3b019-e54c-4966-83dd-cc7619e3f221-metrics-client-ca\") pod \"node-exporter-7zrbv\" (UID: \"6db3b019-e54c-4966-83dd-cc7619e3f221\") " pod="openshift-monitoring/node-exporter-7zrbv" Apr 16 16:49:20.240751 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.238945 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6db3b019-e54c-4966-83dd-cc7619e3f221-node-exporter-textfile\") pod \"node-exporter-7zrbv\" (UID: \"6db3b019-e54c-4966-83dd-cc7619e3f221\") " pod="openshift-monitoring/node-exporter-7zrbv" Apr 16 16:49:20.240751 
ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.238979 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6db3b019-e54c-4966-83dd-cc7619e3f221-node-exporter-tls\") pod \"node-exporter-7zrbv\" (UID: \"6db3b019-e54c-4966-83dd-cc7619e3f221\") " pod="openshift-monitoring/node-exporter-7zrbv" Apr 16 16:49:20.240751 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.239016 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6db3b019-e54c-4966-83dd-cc7619e3f221-root\") pod \"node-exporter-7zrbv\" (UID: \"6db3b019-e54c-4966-83dd-cc7619e3f221\") " pod="openshift-monitoring/node-exporter-7zrbv" Apr 16 16:49:20.240751 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.239040 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fcc69899-6610-4ce7-803b-eaaaaa23fab8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-z9p6c\" (UID: \"fcc69899-6610-4ce7-803b-eaaaaa23fab8\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-z9p6c" Apr 16 16:49:20.240751 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.239066 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fcc69899-6610-4ce7-803b-eaaaaa23fab8-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-z9p6c\" (UID: \"fcc69899-6610-4ce7-803b-eaaaaa23fab8\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-z9p6c" Apr 16 16:49:20.240751 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.239094 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: 
\"kubernetes.io/configmap/6484e384-9c84-470e-869d-55dd04279464-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-4db9b\" (UID: \"6484e384-9c84-470e-869d-55dd04279464\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-4db9b" Apr 16 16:49:20.240751 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.239211 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6db3b019-e54c-4966-83dd-cc7619e3f221-sys\") pod \"node-exporter-7zrbv\" (UID: \"6db3b019-e54c-4966-83dd-cc7619e3f221\") " pod="openshift-monitoring/node-exporter-7zrbv" Apr 16 16:49:20.240751 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:49:20.239296 2574 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 16:49:20.240751 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:49:20.239357 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6db3b019-e54c-4966-83dd-cc7619e3f221-node-exporter-tls podName:6db3b019-e54c-4966-83dd-cc7619e3f221 nodeName:}" failed. No retries permitted until 2026-04-16 16:49:20.739336542 +0000 UTC m=+43.022362097 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/6db3b019-e54c-4966-83dd-cc7619e3f221-node-exporter-tls") pod "node-exporter-7zrbv" (UID: "6db3b019-e54c-4966-83dd-cc7619e3f221") : secret "node-exporter-tls" not found Apr 16 16:49:20.240751 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:49:20.239699 2574 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 16 16:49:20.241725 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.239753 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/6484e384-9c84-470e-869d-55dd04279464-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-4db9b\" (UID: \"6484e384-9c84-470e-869d-55dd04279464\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-4db9b" Apr 16 16:49:20.241725 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.239798 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6484e384-9c84-470e-869d-55dd04279464-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-4db9b\" (UID: \"6484e384-9c84-470e-869d-55dd04279464\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-4db9b" Apr 16 16:49:20.241725 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.239945 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6db3b019-e54c-4966-83dd-cc7619e3f221-node-exporter-wtmp\") pod \"node-exporter-7zrbv\" (UID: \"6db3b019-e54c-4966-83dd-cc7619e3f221\") " pod="openshift-monitoring/node-exporter-7zrbv" Apr 16 16:49:20.241725 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.239953 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/6db3b019-e54c-4966-83dd-cc7619e3f221-node-exporter-accelerators-collector-config\") pod \"node-exporter-7zrbv\" (UID: \"6db3b019-e54c-4966-83dd-cc7619e3f221\") " pod="openshift-monitoring/node-exporter-7zrbv" Apr 16 16:49:20.241725 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.240108 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6db3b019-e54c-4966-83dd-cc7619e3f221-root\") pod \"node-exporter-7zrbv\" (UID: \"6db3b019-e54c-4966-83dd-cc7619e3f221\") " pod="openshift-monitoring/node-exporter-7zrbv" Apr 16 16:49:20.241725 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.240304 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fcc69899-6610-4ce7-803b-eaaaaa23fab8-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-z9p6c\" (UID: \"fcc69899-6610-4ce7-803b-eaaaaa23fab8\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-z9p6c" Apr 16 16:49:20.241725 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.240741 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6db3b019-e54c-4966-83dd-cc7619e3f221-metrics-client-ca\") pod \"node-exporter-7zrbv\" (UID: \"6db3b019-e54c-4966-83dd-cc7619e3f221\") " pod="openshift-monitoring/node-exporter-7zrbv" Apr 16 16:49:20.241725 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.240845 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/6484e384-9c84-470e-869d-55dd04279464-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-4db9b\" (UID: \"6484e384-9c84-470e-869d-55dd04279464\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-4db9b" Apr 16 16:49:20.241725 ip-10-0-143-10 kubenswrapper[2574]: 
E0416 16:49:20.240957 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcc69899-6610-4ce7-803b-eaaaaa23fab8-openshift-state-metrics-tls podName:fcc69899-6610-4ce7-803b-eaaaaa23fab8 nodeName:}" failed. No retries permitted until 2026-04-16 16:49:20.739800572 +0000 UTC m=+43.022826145 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/fcc69899-6610-4ce7-803b-eaaaaa23fab8-openshift-state-metrics-tls") pod "openshift-state-metrics-5669946b84-z9p6c" (UID: "fcc69899-6610-4ce7-803b-eaaaaa23fab8") : secret "openshift-state-metrics-tls" not found Apr 16 16:49:20.241725 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.241074 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6db3b019-e54c-4966-83dd-cc7619e3f221-node-exporter-textfile\") pod \"node-exporter-7zrbv\" (UID: \"6db3b019-e54c-4966-83dd-cc7619e3f221\") " pod="openshift-monitoring/node-exporter-7zrbv" Apr 16 16:49:20.241725 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.241466 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/6484e384-9c84-470e-869d-55dd04279464-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-4db9b\" (UID: \"6484e384-9c84-470e-869d-55dd04279464\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-4db9b" Apr 16 16:49:20.242350 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.242306 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6db3b019-e54c-4966-83dd-cc7619e3f221-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7zrbv\" (UID: \"6db3b019-e54c-4966-83dd-cc7619e3f221\") " pod="openshift-monitoring/node-exporter-7zrbv" Apr 16 16:49:20.242626 ip-10-0-143-10 
kubenswrapper[2574]: I0416 16:49:20.242608 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fcc69899-6610-4ce7-803b-eaaaaa23fab8-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-z9p6c\" (UID: \"fcc69899-6610-4ce7-803b-eaaaaa23fab8\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-z9p6c" Apr 16 16:49:20.243017 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.242990 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6484e384-9c84-470e-869d-55dd04279464-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-4db9b\" (UID: \"6484e384-9c84-470e-869d-55dd04279464\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-4db9b" Apr 16 16:49:20.254367 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.254346 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvzd5\" (UniqueName: \"kubernetes.io/projected/fcc69899-6610-4ce7-803b-eaaaaa23fab8-kube-api-access-dvzd5\") pod \"openshift-state-metrics-5669946b84-z9p6c\" (UID: \"fcc69899-6610-4ce7-803b-eaaaaa23fab8\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-z9p6c" Apr 16 16:49:20.254456 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.254370 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwjgk\" (UniqueName: \"kubernetes.io/projected/6484e384-9c84-470e-869d-55dd04279464-kube-api-access-lwjgk\") pod \"kube-state-metrics-7479c89684-4db9b\" (UID: \"6484e384-9c84-470e-869d-55dd04279464\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-4db9b" Apr 16 16:49:20.256405 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.256383 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkc55\" 
(UniqueName: \"kubernetes.io/projected/6db3b019-e54c-4966-83dd-cc7619e3f221-kube-api-access-nkc55\") pod \"node-exporter-7zrbv\" (UID: \"6db3b019-e54c-4966-83dd-cc7619e3f221\") " pod="openshift-monitoring/node-exporter-7zrbv" Apr 16 16:49:20.357203 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.357159 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-4db9b" Apr 16 16:49:20.488450 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.488418 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-4db9b"] Apr 16 16:49:20.491683 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:49:20.491645 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6484e384_9c84_470e_869d_55dd04279464.slice/crio-63390d43d25750c97e69028931e9410f5cf639f4d4fa8ecbe019b15e0b97195d WatchSource:0}: Error finding container 63390d43d25750c97e69028931e9410f5cf639f4d4fa8ecbe019b15e0b97195d: Status 404 returned error can't find the container with id 63390d43d25750c97e69028931e9410f5cf639f4d4fa8ecbe019b15e0b97195d Apr 16 16:49:20.744703 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.744602 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6db3b019-e54c-4966-83dd-cc7619e3f221-node-exporter-tls\") pod \"node-exporter-7zrbv\" (UID: \"6db3b019-e54c-4966-83dd-cc7619e3f221\") " pod="openshift-monitoring/node-exporter-7zrbv" Apr 16 16:49:20.744703 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.744646 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fcc69899-6610-4ce7-803b-eaaaaa23fab8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-z9p6c\" (UID: 
\"fcc69899-6610-4ce7-803b-eaaaaa23fab8\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-z9p6c" Apr 16 16:49:20.746871 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.746846 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6db3b019-e54c-4966-83dd-cc7619e3f221-node-exporter-tls\") pod \"node-exporter-7zrbv\" (UID: \"6db3b019-e54c-4966-83dd-cc7619e3f221\") " pod="openshift-monitoring/node-exporter-7zrbv" Apr 16 16:49:20.746994 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.746977 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fcc69899-6610-4ce7-803b-eaaaaa23fab8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-z9p6c\" (UID: \"fcc69899-6610-4ce7-803b-eaaaaa23fab8\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-z9p6c" Apr 16 16:49:20.938907 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.938867 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-z9p6c" Apr 16 16:49:20.975869 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:20.975840 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-7zrbv" Apr 16 16:49:21.085735 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.085679 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 16:49:21.110276 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.110247 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 16:49:21.110475 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.110454 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:49:21.114137 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.114115 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 16:49:21.114262 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.114177 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 16:49:21.114461 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.114441 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-rd9s4\"" Apr 16 16:49:21.114541 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.114476 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 16:49:21.114608 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.114561 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 16:49:21.114721 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.114689 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 16:49:21.114797 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.114693 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 16:49:21.114900 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.114848 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 16:49:21.115009 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.114966 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 16:49:21.115236 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.115220 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 16:49:21.247763 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.247727 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2baf6227-4885-49ec-a971-4601fc362ec0-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:49:21.247939 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.247783 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-web-config\") pod \"alertmanager-main-0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:49:21.247939 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.247866 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:49:21.247939 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.247901 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: 
\"2baf6227-4885-49ec-a971-4601fc362ec0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:49:21.247939 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.247931 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2baf6227-4885-49ec-a971-4601fc362ec0-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:49:21.248107 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.248003 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2baf6227-4885-49ec-a971-4601fc362ec0-config-out\") pod \"alertmanager-main-0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:49:21.248107 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.248031 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-config-volume\") pod \"alertmanager-main-0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:49:21.248107 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.248064 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2baf6227-4885-49ec-a971-4601fc362ec0-tls-assets\") pod \"alertmanager-main-0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:49:21.248222 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.248119 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqssm\" (UniqueName: 
\"kubernetes.io/projected/2baf6227-4885-49ec-a971-4601fc362ec0-kube-api-access-nqssm\") pod \"alertmanager-main-0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:49:21.248222 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.248151 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:49:21.248222 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.248176 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2baf6227-4885-49ec-a971-4601fc362ec0-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:49:21.248222 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.248216 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:49:21.248408 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.248257 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 
16:49:21.349030 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.348934 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:49:21.349030 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.348988 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2baf6227-4885-49ec-a971-4601fc362ec0-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:49:21.349248 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.349050 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:49:21.349248 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.349101 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:49:21.349248 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.349129 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2baf6227-4885-49ec-a971-4601fc362ec0-alertmanager-trusted-ca-bundle\") pod 
\"alertmanager-main-0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:49:21.349248 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.349158 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-web-config\") pod \"alertmanager-main-0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:49:21.349248 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.349179 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:49:21.349248 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.349198 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:49:21.349248 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.349225 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2baf6227-4885-49ec-a971-4601fc362ec0-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:49:21.349559 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.349275 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/2baf6227-4885-49ec-a971-4601fc362ec0-config-out\") pod \"alertmanager-main-0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:49:21.349559 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.349302 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-config-volume\") pod \"alertmanager-main-0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:49:21.349559 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.349332 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2baf6227-4885-49ec-a971-4601fc362ec0-tls-assets\") pod \"alertmanager-main-0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:49:21.349559 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.349364 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nqssm\" (UniqueName: \"kubernetes.io/projected/2baf6227-4885-49ec-a971-4601fc362ec0-kube-api-access-nqssm\") pod \"alertmanager-main-0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:49:21.350306 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:49:21.349856 2574 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 16 16:49:21.350306 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:49:21.349932 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-secret-alertmanager-main-tls podName:2baf6227-4885-49ec-a971-4601fc362ec0 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:49:21.849913485 +0000 UTC m=+44.132939048 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "2baf6227-4885-49ec-a971-4601fc362ec0") : secret "alertmanager-main-tls" not found Apr 16 16:49:21.350306 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.349852 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2baf6227-4885-49ec-a971-4601fc362ec0-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:49:21.350919 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.350895 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2baf6227-4885-49ec-a971-4601fc362ec0-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:49:21.351148 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.351120 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2baf6227-4885-49ec-a971-4601fc362ec0-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:49:21.352906 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.352871 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:49:21.353294 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.353268 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:49:21.353526 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.353459 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-config-volume\") pod \"alertmanager-main-0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:49:21.353526 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.353467 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2baf6227-4885-49ec-a971-4601fc362ec0-tls-assets\") pod \"alertmanager-main-0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:49:21.353827 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.353808 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2baf6227-4885-49ec-a971-4601fc362ec0-config-out\") pod \"alertmanager-main-0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:49:21.354152 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.354116 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: 
\"2baf6227-4885-49ec-a971-4601fc362ec0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:49:21.354224 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.354152 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-web-config\") pod \"alertmanager-main-0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:49:21.354984 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.354958 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:49:21.359551 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.359531 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqssm\" (UniqueName: \"kubernetes.io/projected/2baf6227-4885-49ec-a971-4601fc362ec0-kube-api-access-nqssm\") pod \"alertmanager-main-0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:49:21.467617 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.467576 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-4db9b" event={"ID":"6484e384-9c84-470e-869d-55dd04279464","Type":"ContainerStarted","Data":"63390d43d25750c97e69028931e9410f5cf639f4d4fa8ecbe019b15e0b97195d"} Apr 16 16:49:21.854802 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.854765 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: 
\"2baf6227-4885-49ec-a971-4601fc362ec0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:49:21.857117 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.857088 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:49:21.884106 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:49:21.884062 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6db3b019_e54c_4966_83dd_cc7619e3f221.slice/crio-65da040fbf59de12a5011b7ee349164070e68badab5b53b4eba06b0534d60ab8 WatchSource:0}: Error finding container 65da040fbf59de12a5011b7ee349164070e68badab5b53b4eba06b0534d60ab8: Status 404 returned error can't find the container with id 65da040fbf59de12a5011b7ee349164070e68badab5b53b4eba06b0534d60ab8 Apr 16 16:49:21.990070 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:21.990042 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-z9p6c"] Apr 16 16:49:21.993401 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:49:21.993339 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcc69899_6610_4ce7_803b_eaaaaa23fab8.slice/crio-3729da04d7b227db831e61044e7c3834bbe66cbefeca5876885d1d39cc1d7b95 WatchSource:0}: Error finding container 3729da04d7b227db831e61044e7c3834bbe66cbefeca5876885d1d39cc1d7b95: Status 404 returned error can't find the container with id 3729da04d7b227db831e61044e7c3834bbe66cbefeca5876885d1d39cc1d7b95 Apr 16 16:49:22.021724 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.021694 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:49:22.094345 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.094316 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-848496fcb-44nmq"] Apr 16 16:49:22.103126 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.103103 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-848496fcb-44nmq" Apr 16 16:49:22.106269 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.106207 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-p5pwz\"" Apr 16 16:49:22.106269 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.106217 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 16 16:49:22.106269 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.106232 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 16 16:49:22.106471 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.106279 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 16 16:49:22.106569 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.106554 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 16 16:49:22.106619 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.106574 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-1cr01q00apn02\"" Apr 16 16:49:22.107631 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.107611 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 16 16:49:22.110091 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.110068 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-848496fcb-44nmq"] Apr 16 16:49:22.258341 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.258259 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5720f2ab-ce1b-4958-97fb-6899db6301a3-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-848496fcb-44nmq\" (UID: \"5720f2ab-ce1b-4958-97fb-6899db6301a3\") " pod="openshift-monitoring/thanos-querier-848496fcb-44nmq" Apr 16 16:49:22.258341 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.258330 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5720f2ab-ce1b-4958-97fb-6899db6301a3-metrics-client-ca\") pod \"thanos-querier-848496fcb-44nmq\" (UID: \"5720f2ab-ce1b-4958-97fb-6899db6301a3\") " pod="openshift-monitoring/thanos-querier-848496fcb-44nmq" Apr 16 16:49:22.258534 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.258418 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5720f2ab-ce1b-4958-97fb-6899db6301a3-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-848496fcb-44nmq\" (UID: \"5720f2ab-ce1b-4958-97fb-6899db6301a3\") " pod="openshift-monitoring/thanos-querier-848496fcb-44nmq" Apr 16 16:49:22.258534 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.258465 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: 
\"kubernetes.io/secret/5720f2ab-ce1b-4958-97fb-6899db6301a3-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-848496fcb-44nmq\" (UID: \"5720f2ab-ce1b-4958-97fb-6899db6301a3\") " pod="openshift-monitoring/thanos-querier-848496fcb-44nmq" Apr 16 16:49:22.258534 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.258500 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/5720f2ab-ce1b-4958-97fb-6899db6301a3-secret-thanos-querier-tls\") pod \"thanos-querier-848496fcb-44nmq\" (UID: \"5720f2ab-ce1b-4958-97fb-6899db6301a3\") " pod="openshift-monitoring/thanos-querier-848496fcb-44nmq" Apr 16 16:49:22.258534 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.258528 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v2fl\" (UniqueName: \"kubernetes.io/projected/5720f2ab-ce1b-4958-97fb-6899db6301a3-kube-api-access-9v2fl\") pod \"thanos-querier-848496fcb-44nmq\" (UID: \"5720f2ab-ce1b-4958-97fb-6899db6301a3\") " pod="openshift-monitoring/thanos-querier-848496fcb-44nmq" Apr 16 16:49:22.258769 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.258565 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/5720f2ab-ce1b-4958-97fb-6899db6301a3-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-848496fcb-44nmq\" (UID: \"5720f2ab-ce1b-4958-97fb-6899db6301a3\") " pod="openshift-monitoring/thanos-querier-848496fcb-44nmq" Apr 16 16:49:22.258769 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.258699 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5720f2ab-ce1b-4958-97fb-6899db6301a3-secret-grpc-tls\") pod \"thanos-querier-848496fcb-44nmq\" 
(UID: \"5720f2ab-ce1b-4958-97fb-6899db6301a3\") " pod="openshift-monitoring/thanos-querier-848496fcb-44nmq" Apr 16 16:49:22.267840 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.267804 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 16:49:22.273068 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:49:22.273029 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2baf6227_4885_49ec_a971_4601fc362ec0.slice/crio-3bf999ee219a35d06892991d09352423b7f52583123d619e28b7d14697fd195b WatchSource:0}: Error finding container 3bf999ee219a35d06892991d09352423b7f52583123d619e28b7d14697fd195b: Status 404 returned error can't find the container with id 3bf999ee219a35d06892991d09352423b7f52583123d619e28b7d14697fd195b Apr 16 16:49:22.359931 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.359730 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5720f2ab-ce1b-4958-97fb-6899db6301a3-secret-grpc-tls\") pod \"thanos-querier-848496fcb-44nmq\" (UID: \"5720f2ab-ce1b-4958-97fb-6899db6301a3\") " pod="openshift-monitoring/thanos-querier-848496fcb-44nmq" Apr 16 16:49:22.359931 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.359816 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5720f2ab-ce1b-4958-97fb-6899db6301a3-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-848496fcb-44nmq\" (UID: \"5720f2ab-ce1b-4958-97fb-6899db6301a3\") " pod="openshift-monitoring/thanos-querier-848496fcb-44nmq" Apr 16 16:49:22.359931 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.359874 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/5720f2ab-ce1b-4958-97fb-6899db6301a3-metrics-client-ca\") pod \"thanos-querier-848496fcb-44nmq\" (UID: \"5720f2ab-ce1b-4958-97fb-6899db6301a3\") " pod="openshift-monitoring/thanos-querier-848496fcb-44nmq" Apr 16 16:49:22.359931 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.359917 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5720f2ab-ce1b-4958-97fb-6899db6301a3-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-848496fcb-44nmq\" (UID: \"5720f2ab-ce1b-4958-97fb-6899db6301a3\") " pod="openshift-monitoring/thanos-querier-848496fcb-44nmq" Apr 16 16:49:22.360197 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.359950 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/5720f2ab-ce1b-4958-97fb-6899db6301a3-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-848496fcb-44nmq\" (UID: \"5720f2ab-ce1b-4958-97fb-6899db6301a3\") " pod="openshift-monitoring/thanos-querier-848496fcb-44nmq" Apr 16 16:49:22.360197 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.359985 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/5720f2ab-ce1b-4958-97fb-6899db6301a3-secret-thanos-querier-tls\") pod \"thanos-querier-848496fcb-44nmq\" (UID: \"5720f2ab-ce1b-4958-97fb-6899db6301a3\") " pod="openshift-monitoring/thanos-querier-848496fcb-44nmq" Apr 16 16:49:22.360197 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.360009 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9v2fl\" (UniqueName: \"kubernetes.io/projected/5720f2ab-ce1b-4958-97fb-6899db6301a3-kube-api-access-9v2fl\") pod \"thanos-querier-848496fcb-44nmq\" (UID: \"5720f2ab-ce1b-4958-97fb-6899db6301a3\") " 
pod="openshift-monitoring/thanos-querier-848496fcb-44nmq" Apr 16 16:49:22.360197 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.360047 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/5720f2ab-ce1b-4958-97fb-6899db6301a3-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-848496fcb-44nmq\" (UID: \"5720f2ab-ce1b-4958-97fb-6899db6301a3\") " pod="openshift-monitoring/thanos-querier-848496fcb-44nmq" Apr 16 16:49:22.361508 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.361454 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5720f2ab-ce1b-4958-97fb-6899db6301a3-metrics-client-ca\") pod \"thanos-querier-848496fcb-44nmq\" (UID: \"5720f2ab-ce1b-4958-97fb-6899db6301a3\") " pod="openshift-monitoring/thanos-querier-848496fcb-44nmq" Apr 16 16:49:22.363179 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.362996 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5720f2ab-ce1b-4958-97fb-6899db6301a3-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-848496fcb-44nmq\" (UID: \"5720f2ab-ce1b-4958-97fb-6899db6301a3\") " pod="openshift-monitoring/thanos-querier-848496fcb-44nmq" Apr 16 16:49:22.363894 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.363846 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/5720f2ab-ce1b-4958-97fb-6899db6301a3-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-848496fcb-44nmq\" (UID: \"5720f2ab-ce1b-4958-97fb-6899db6301a3\") " pod="openshift-monitoring/thanos-querier-848496fcb-44nmq" Apr 16 16:49:22.364005 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.363895 2574 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5720f2ab-ce1b-4958-97fb-6899db6301a3-secret-grpc-tls\") pod \"thanos-querier-848496fcb-44nmq\" (UID: \"5720f2ab-ce1b-4958-97fb-6899db6301a3\") " pod="openshift-monitoring/thanos-querier-848496fcb-44nmq" Apr 16 16:49:22.365279 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.365254 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/5720f2ab-ce1b-4958-97fb-6899db6301a3-secret-thanos-querier-tls\") pod \"thanos-querier-848496fcb-44nmq\" (UID: \"5720f2ab-ce1b-4958-97fb-6899db6301a3\") " pod="openshift-monitoring/thanos-querier-848496fcb-44nmq" Apr 16 16:49:22.365861 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.365816 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/5720f2ab-ce1b-4958-97fb-6899db6301a3-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-848496fcb-44nmq\" (UID: \"5720f2ab-ce1b-4958-97fb-6899db6301a3\") " pod="openshift-monitoring/thanos-querier-848496fcb-44nmq" Apr 16 16:49:22.365968 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.365904 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5720f2ab-ce1b-4958-97fb-6899db6301a3-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-848496fcb-44nmq\" (UID: \"5720f2ab-ce1b-4958-97fb-6899db6301a3\") " pod="openshift-monitoring/thanos-querier-848496fcb-44nmq" Apr 16 16:49:22.369548 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.369528 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v2fl\" (UniqueName: \"kubernetes.io/projected/5720f2ab-ce1b-4958-97fb-6899db6301a3-kube-api-access-9v2fl\") pod \"thanos-querier-848496fcb-44nmq\" (UID: 
\"5720f2ab-ce1b-4958-97fb-6899db6301a3\") " pod="openshift-monitoring/thanos-querier-848496fcb-44nmq" Apr 16 16:49:22.413909 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.413794 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-848496fcb-44nmq" Apr 16 16:49:22.474634 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.473868 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b7857647d-xnzkf" event={"ID":"ecad2307-e422-4189-8656-d67ce0f7448b","Type":"ContainerStarted","Data":"b6269a23874ba7b693fe913d83c5293ce8cc43b6e262f4987f3acb25e7a9d8ab"} Apr 16 16:49:22.478106 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.478034 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-z9p6c" event={"ID":"fcc69899-6610-4ce7-803b-eaaaaa23fab8","Type":"ContainerStarted","Data":"c03b4544156dc011a49ee155678d06cb942ec0e241b30707f67e98725c2db6a6"} Apr 16 16:49:22.478106 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.478072 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-z9p6c" event={"ID":"fcc69899-6610-4ce7-803b-eaaaaa23fab8","Type":"ContainerStarted","Data":"5d6c53d45c8e872fcadf85400ad9b7009d4eea05f9a23b2daff1838686dca4c3"} Apr 16 16:49:22.478106 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.478085 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-z9p6c" event={"ID":"fcc69899-6610-4ce7-803b-eaaaaa23fab8","Type":"ContainerStarted","Data":"3729da04d7b227db831e61044e7c3834bbe66cbefeca5876885d1d39cc1d7b95"} Apr 16 16:49:22.479275 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.479213 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7zrbv" 
event={"ID":"6db3b019-e54c-4966-83dd-cc7619e3f221","Type":"ContainerStarted","Data":"65da040fbf59de12a5011b7ee349164070e68badab5b53b4eba06b0534d60ab8"} Apr 16 16:49:22.482239 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.482203 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2baf6227-4885-49ec-a971-4601fc362ec0","Type":"ContainerStarted","Data":"3bf999ee219a35d06892991d09352423b7f52583123d619e28b7d14697fd195b"} Apr 16 16:49:22.493300 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.491973 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6b7857647d-xnzkf" podStartSLOduration=2.403582303 podStartE2EDuration="5.491956746s" podCreationTimestamp="2026-04-16 16:49:17 +0000 UTC" firstStartedPulling="2026-04-16 16:49:19.054600106 +0000 UTC m=+41.337625662" lastFinishedPulling="2026-04-16 16:49:22.142974538 +0000 UTC m=+44.426000105" observedRunningTime="2026-04-16 16:49:22.491363318 +0000 UTC m=+44.774388899" watchObservedRunningTime="2026-04-16 16:49:22.491956746 +0000 UTC m=+44.774982314" Apr 16 16:49:22.561269 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:22.561232 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-848496fcb-44nmq"] Apr 16 16:49:22.563954 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:49:22.563918 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5720f2ab_ce1b_4958_97fb_6899db6301a3.slice/crio-8c82a17e400fd533088fa905094a5a39101fd7a5c68233dd905826a5cc605082 WatchSource:0}: Error finding container 8c82a17e400fd533088fa905094a5a39101fd7a5c68233dd905826a5cc605082: Status 404 returned error can't find the container with id 8c82a17e400fd533088fa905094a5a39101fd7a5c68233dd905826a5cc605082 Apr 16 16:49:23.487012 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:23.486963 2574 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-monitoring/thanos-querier-848496fcb-44nmq" event={"ID":"5720f2ab-ce1b-4958-97fb-6899db6301a3","Type":"ContainerStarted","Data":"8c82a17e400fd533088fa905094a5a39101fd7a5c68233dd905826a5cc605082"} Apr 16 16:49:24.517688 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:24.517642 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-79468b9bf5-qh26g"] Apr 16 16:49:24.542616 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:24.542584 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-79468b9bf5-qh26g"] Apr 16 16:49:24.542776 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:24.542640 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-79468b9bf5-qh26g" Apr 16 16:49:24.545909 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:24.545888 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 16 16:49:24.547429 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:24.547400 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 16:49:24.547429 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:24.547425 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 16 16:49:24.547606 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:24.547441 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 16 16:49:24.547606 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:24.547473 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-ebdt160f7etlg\"" Apr 16 16:49:24.547761 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:24.547743 2574 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-fhkxr\"" Apr 16 16:49:24.683353 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:24.683321 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba3481af-decf-4c56-b396-527c1e009a30-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-79468b9bf5-qh26g\" (UID: \"ba3481af-decf-4c56-b396-527c1e009a30\") " pod="openshift-monitoring/metrics-server-79468b9bf5-qh26g" Apr 16 16:49:24.683511 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:24.683401 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/ba3481af-decf-4c56-b396-527c1e009a30-audit-log\") pod \"metrics-server-79468b9bf5-qh26g\" (UID: \"ba3481af-decf-4c56-b396-527c1e009a30\") " pod="openshift-monitoring/metrics-server-79468b9bf5-qh26g" Apr 16 16:49:24.683511 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:24.683430 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkm4q\" (UniqueName: \"kubernetes.io/projected/ba3481af-decf-4c56-b396-527c1e009a30-kube-api-access-qkm4q\") pod \"metrics-server-79468b9bf5-qh26g\" (UID: \"ba3481af-decf-4c56-b396-527c1e009a30\") " pod="openshift-monitoring/metrics-server-79468b9bf5-qh26g" Apr 16 16:49:24.683511 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:24.683455 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/ba3481af-decf-4c56-b396-527c1e009a30-metrics-server-audit-profiles\") pod \"metrics-server-79468b9bf5-qh26g\" (UID: \"ba3481af-decf-4c56-b396-527c1e009a30\") " pod="openshift-monitoring/metrics-server-79468b9bf5-qh26g" Apr 16 
16:49:24.683620 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:24.683508 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba3481af-decf-4c56-b396-527c1e009a30-client-ca-bundle\") pod \"metrics-server-79468b9bf5-qh26g\" (UID: \"ba3481af-decf-4c56-b396-527c1e009a30\") " pod="openshift-monitoring/metrics-server-79468b9bf5-qh26g" Apr 16 16:49:24.683620 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:24.683558 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/ba3481af-decf-4c56-b396-527c1e009a30-secret-metrics-server-tls\") pod \"metrics-server-79468b9bf5-qh26g\" (UID: \"ba3481af-decf-4c56-b396-527c1e009a30\") " pod="openshift-monitoring/metrics-server-79468b9bf5-qh26g" Apr 16 16:49:24.683720 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:24.683675 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/ba3481af-decf-4c56-b396-527c1e009a30-secret-metrics-server-client-certs\") pod \"metrics-server-79468b9bf5-qh26g\" (UID: \"ba3481af-decf-4c56-b396-527c1e009a30\") " pod="openshift-monitoring/metrics-server-79468b9bf5-qh26g" Apr 16 16:49:24.785119 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:24.784798 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba3481af-decf-4c56-b396-527c1e009a30-client-ca-bundle\") pod \"metrics-server-79468b9bf5-qh26g\" (UID: \"ba3481af-decf-4c56-b396-527c1e009a30\") " pod="openshift-monitoring/metrics-server-79468b9bf5-qh26g" Apr 16 16:49:24.785119 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:24.784852 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" 
(UniqueName: \"kubernetes.io/secret/ba3481af-decf-4c56-b396-527c1e009a30-secret-metrics-server-tls\") pod \"metrics-server-79468b9bf5-qh26g\" (UID: \"ba3481af-decf-4c56-b396-527c1e009a30\") " pod="openshift-monitoring/metrics-server-79468b9bf5-qh26g" Apr 16 16:49:24.785119 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:24.784919 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/ba3481af-decf-4c56-b396-527c1e009a30-secret-metrics-server-client-certs\") pod \"metrics-server-79468b9bf5-qh26g\" (UID: \"ba3481af-decf-4c56-b396-527c1e009a30\") " pod="openshift-monitoring/metrics-server-79468b9bf5-qh26g" Apr 16 16:49:24.785119 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:24.784985 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba3481af-decf-4c56-b396-527c1e009a30-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-79468b9bf5-qh26g\" (UID: \"ba3481af-decf-4c56-b396-527c1e009a30\") " pod="openshift-monitoring/metrics-server-79468b9bf5-qh26g" Apr 16 16:49:24.785436 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:24.785034 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/ba3481af-decf-4c56-b396-527c1e009a30-audit-log\") pod \"metrics-server-79468b9bf5-qh26g\" (UID: \"ba3481af-decf-4c56-b396-527c1e009a30\") " pod="openshift-monitoring/metrics-server-79468b9bf5-qh26g" Apr 16 16:49:24.785436 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:24.785165 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qkm4q\" (UniqueName: \"kubernetes.io/projected/ba3481af-decf-4c56-b396-527c1e009a30-kube-api-access-qkm4q\") pod \"metrics-server-79468b9bf5-qh26g\" (UID: \"ba3481af-decf-4c56-b396-527c1e009a30\") " 
pod="openshift-monitoring/metrics-server-79468b9bf5-qh26g" Apr 16 16:49:24.785436 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:24.785210 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/ba3481af-decf-4c56-b396-527c1e009a30-metrics-server-audit-profiles\") pod \"metrics-server-79468b9bf5-qh26g\" (UID: \"ba3481af-decf-4c56-b396-527c1e009a30\") " pod="openshift-monitoring/metrics-server-79468b9bf5-qh26g" Apr 16 16:49:24.785702 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:24.785636 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/ba3481af-decf-4c56-b396-527c1e009a30-audit-log\") pod \"metrics-server-79468b9bf5-qh26g\" (UID: \"ba3481af-decf-4c56-b396-527c1e009a30\") " pod="openshift-monitoring/metrics-server-79468b9bf5-qh26g" Apr 16 16:49:24.785937 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:24.785914 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba3481af-decf-4c56-b396-527c1e009a30-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-79468b9bf5-qh26g\" (UID: \"ba3481af-decf-4c56-b396-527c1e009a30\") " pod="openshift-monitoring/metrics-server-79468b9bf5-qh26g" Apr 16 16:49:24.786203 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:24.786183 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/ba3481af-decf-4c56-b396-527c1e009a30-metrics-server-audit-profiles\") pod \"metrics-server-79468b9bf5-qh26g\" (UID: \"ba3481af-decf-4c56-b396-527c1e009a30\") " pod="openshift-monitoring/metrics-server-79468b9bf5-qh26g" Apr 16 16:49:24.786344 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:24.786318 2574 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-j58zk"] Apr 16 16:49:24.788141 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:24.788114 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/ba3481af-decf-4c56-b396-527c1e009a30-secret-metrics-server-tls\") pod \"metrics-server-79468b9bf5-qh26g\" (UID: \"ba3481af-decf-4c56-b396-527c1e009a30\") " pod="openshift-monitoring/metrics-server-79468b9bf5-qh26g" Apr 16 16:49:24.788221 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:24.788156 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba3481af-decf-4c56-b396-527c1e009a30-client-ca-bundle\") pod \"metrics-server-79468b9bf5-qh26g\" (UID: \"ba3481af-decf-4c56-b396-527c1e009a30\") " pod="openshift-monitoring/metrics-server-79468b9bf5-qh26g" Apr 16 16:49:24.788298 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:24.788251 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/ba3481af-decf-4c56-b396-527c1e009a30-secret-metrics-server-client-certs\") pod \"metrics-server-79468b9bf5-qh26g\" (UID: \"ba3481af-decf-4c56-b396-527c1e009a30\") " pod="openshift-monitoring/metrics-server-79468b9bf5-qh26g" Apr 16 16:49:24.805427 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:24.805407 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkm4q\" (UniqueName: \"kubernetes.io/projected/ba3481af-decf-4c56-b396-527c1e009a30-kube-api-access-qkm4q\") pod \"metrics-server-79468b9bf5-qh26g\" (UID: \"ba3481af-decf-4c56-b396-527c1e009a30\") " pod="openshift-monitoring/metrics-server-79468b9bf5-qh26g" Apr 16 16:49:24.816973 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:24.816949 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-j58zk"] 
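The entries above repeat three patterns worth triaging: cAdvisor `Failed to process watch event ... can't find the container with id` warnings (typically a benign race where the cgroup watch fires before CRI-O registers the container), `No sandbox for pod can be found` messages (the kubelet creating fresh sandboxes after the restart seen earlier in this journal), and `Observed pod startup duration` records from the startup latency tracker. A minimal triage sketch, assuming this journal has been saved to a local file (the `kubelet.log` path is illustrative, not from the source):

```shell
#!/bin/sh
# Triage a captured kubelet journal. Assumes the entries were exported,
# e.g. via `journalctl -u kubelet > kubelet.log`; path is hypothetical.
LOG=${1:-kubelet.log}

# Count the (usually benign) cAdvisor watch-event races.
grep -c "Failed to process watch event" "$LOG"

# List pods for which the kubelet had to start a new sandbox.
grep "No sandbox for pod can be found" "$LOG" \
  | sed -n 's/.*pod="\([^"]*\)".*/\1/p' | sort -u

# Extract pod startup latencies recorded by pod_startup_latency_tracker.
grep "Observed pod startup duration" "$LOG" \
  | sed -n 's/.*pod="\([^"]*\)".*podStartE2EDuration="\([^"]*\)".*/\1 \2/p'
```

If the watch-event warnings correlate with pods that never reach `ContainerStarted`, that points at a real CRI-O problem rather than the transient race; in this capture each flagged container ID does appear in a later `SyncLoop (PLEG)` event, which is the benign case.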
Apr 16 16:49:24.817087 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:24.816989 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-j58zk" Apr 16 16:49:24.819698 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:24.819675 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 16 16:49:24.819939 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:24.819925 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-mf6fd\"" Apr 16 16:49:24.853534 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:24.853513 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-79468b9bf5-qh26g" Apr 16 16:49:24.986988 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:24.986955 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2224a16f-0022-488d-a462-4d5ab10cb7bb-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-j58zk\" (UID: \"2224a16f-0022-488d-a462-4d5ab10cb7bb\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-j58zk" Apr 16 16:49:25.088202 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.088168 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2224a16f-0022-488d-a462-4d5ab10cb7bb-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-j58zk\" (UID: \"2224a16f-0022-488d-a462-4d5ab10cb7bb\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-j58zk" Apr 16 16:49:25.090394 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.090376 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/2224a16f-0022-488d-a462-4d5ab10cb7bb-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-j58zk\" (UID: \"2224a16f-0022-488d-a462-4d5ab10cb7bb\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-j58zk" Apr 16 16:49:25.127567 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.127536 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-j58zk" Apr 16 16:49:25.230434 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.230400 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-f9bf64dc9-4xqt7"] Apr 16 16:49:25.255696 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.255667 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-f9bf64dc9-4xqt7"] Apr 16 16:49:25.255823 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.255724 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-f9bf64dc9-4xqt7" Apr 16 16:49:25.258993 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.258966 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 16 16:49:25.259113 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.258994 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 16 16:49:25.259216 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.259204 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 16 16:49:25.259399 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.259381 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 16 16:49:25.259520 ip-10-0-143-10 kubenswrapper[2574]: I0416 
16:49:25.259408 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 16 16:49:25.259631 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.259616 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-wgxdd\"" Apr 16 16:49:25.265142 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.265125 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 16 16:49:25.391339 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.390829 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dda195a-5e85-4304-a192-8af66801e3e5-telemeter-trusted-ca-bundle\") pod \"telemeter-client-f9bf64dc9-4xqt7\" (UID: \"3dda195a-5e85-4304-a192-8af66801e3e5\") " pod="openshift-monitoring/telemeter-client-f9bf64dc9-4xqt7" Apr 16 16:49:25.391435 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.391372 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3dda195a-5e85-4304-a192-8af66801e3e5-metrics-client-ca\") pod \"telemeter-client-f9bf64dc9-4xqt7\" (UID: \"3dda195a-5e85-4304-a192-8af66801e3e5\") " pod="openshift-monitoring/telemeter-client-f9bf64dc9-4xqt7" Apr 16 16:49:25.391504 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.391456 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3dda195a-5e85-4304-a192-8af66801e3e5-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-f9bf64dc9-4xqt7\" (UID: \"3dda195a-5e85-4304-a192-8af66801e3e5\") " 
pod="openshift-monitoring/telemeter-client-f9bf64dc9-4xqt7" Apr 16 16:49:25.391564 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.391538 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/3dda195a-5e85-4304-a192-8af66801e3e5-federate-client-tls\") pod \"telemeter-client-f9bf64dc9-4xqt7\" (UID: \"3dda195a-5e85-4304-a192-8af66801e3e5\") " pod="openshift-monitoring/telemeter-client-f9bf64dc9-4xqt7" Apr 16 16:49:25.391617 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.391571 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mpzf\" (UniqueName: \"kubernetes.io/projected/3dda195a-5e85-4304-a192-8af66801e3e5-kube-api-access-8mpzf\") pod \"telemeter-client-f9bf64dc9-4xqt7\" (UID: \"3dda195a-5e85-4304-a192-8af66801e3e5\") " pod="openshift-monitoring/telemeter-client-f9bf64dc9-4xqt7" Apr 16 16:49:25.391748 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.391723 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/3dda195a-5e85-4304-a192-8af66801e3e5-secret-telemeter-client\") pod \"telemeter-client-f9bf64dc9-4xqt7\" (UID: \"3dda195a-5e85-4304-a192-8af66801e3e5\") " pod="openshift-monitoring/telemeter-client-f9bf64dc9-4xqt7" Apr 16 16:49:25.391884 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.391849 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dda195a-5e85-4304-a192-8af66801e3e5-serving-certs-ca-bundle\") pod \"telemeter-client-f9bf64dc9-4xqt7\" (UID: \"3dda195a-5e85-4304-a192-8af66801e3e5\") " pod="openshift-monitoring/telemeter-client-f9bf64dc9-4xqt7" Apr 16 16:49:25.391955 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.391912 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/3dda195a-5e85-4304-a192-8af66801e3e5-telemeter-client-tls\") pod \"telemeter-client-f9bf64dc9-4xqt7\" (UID: \"3dda195a-5e85-4304-a192-8af66801e3e5\") " pod="openshift-monitoring/telemeter-client-f9bf64dc9-4xqt7" Apr 16 16:49:25.451034 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.450984 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-79468b9bf5-qh26g"] Apr 16 16:49:25.460799 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:49:25.460769 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba3481af_decf_4c56_b396_527c1e009a30.slice/crio-f30713dfde7ae4da32d28ff5b04be3157f2a461b5696f1be0de67ef630c9ef33 WatchSource:0}: Error finding container f30713dfde7ae4da32d28ff5b04be3157f2a461b5696f1be0de67ef630c9ef33: Status 404 returned error can't find the container with id f30713dfde7ae4da32d28ff5b04be3157f2a461b5696f1be0de67ef630c9ef33 Apr 16 16:49:25.471283 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.470937 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-j58zk"] Apr 16 16:49:25.473515 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:49:25.473492 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2224a16f_0022_488d_a462_4d5ab10cb7bb.slice/crio-78b4ca7465fcacfca51874d758b075fd0a9eb09cb67ec07e0c6f73ca2e66381f WatchSource:0}: Error finding container 78b4ca7465fcacfca51874d758b075fd0a9eb09cb67ec07e0c6f73ca2e66381f: Status 404 returned error can't find the container with id 78b4ca7465fcacfca51874d758b075fd0a9eb09cb67ec07e0c6f73ca2e66381f Apr 16 16:49:25.492827 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.492800 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/3dda195a-5e85-4304-a192-8af66801e3e5-secret-telemeter-client\") pod \"telemeter-client-f9bf64dc9-4xqt7\" (UID: \"3dda195a-5e85-4304-a192-8af66801e3e5\") " pod="openshift-monitoring/telemeter-client-f9bf64dc9-4xqt7" Apr 16 16:49:25.492928 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.492855 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dda195a-5e85-4304-a192-8af66801e3e5-serving-certs-ca-bundle\") pod \"telemeter-client-f9bf64dc9-4xqt7\" (UID: \"3dda195a-5e85-4304-a192-8af66801e3e5\") " pod="openshift-monitoring/telemeter-client-f9bf64dc9-4xqt7" Apr 16 16:49:25.492928 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.492883 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/3dda195a-5e85-4304-a192-8af66801e3e5-telemeter-client-tls\") pod \"telemeter-client-f9bf64dc9-4xqt7\" (UID: \"3dda195a-5e85-4304-a192-8af66801e3e5\") " pod="openshift-monitoring/telemeter-client-f9bf64dc9-4xqt7" Apr 16 16:49:25.492928 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.492911 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dda195a-5e85-4304-a192-8af66801e3e5-telemeter-trusted-ca-bundle\") pod \"telemeter-client-f9bf64dc9-4xqt7\" (UID: \"3dda195a-5e85-4304-a192-8af66801e3e5\") " pod="openshift-monitoring/telemeter-client-f9bf64dc9-4xqt7" Apr 16 16:49:25.493105 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.492939 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3dda195a-5e85-4304-a192-8af66801e3e5-metrics-client-ca\") pod \"telemeter-client-f9bf64dc9-4xqt7\" (UID: 
\"3dda195a-5e85-4304-a192-8af66801e3e5\") " pod="openshift-monitoring/telemeter-client-f9bf64dc9-4xqt7" Apr 16 16:49:25.493105 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.492970 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3dda195a-5e85-4304-a192-8af66801e3e5-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-f9bf64dc9-4xqt7\" (UID: \"3dda195a-5e85-4304-a192-8af66801e3e5\") " pod="openshift-monitoring/telemeter-client-f9bf64dc9-4xqt7" Apr 16 16:49:25.493105 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.493050 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/3dda195a-5e85-4304-a192-8af66801e3e5-federate-client-tls\") pod \"telemeter-client-f9bf64dc9-4xqt7\" (UID: \"3dda195a-5e85-4304-a192-8af66801e3e5\") " pod="openshift-monitoring/telemeter-client-f9bf64dc9-4xqt7" Apr 16 16:49:25.493105 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.493080 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8mpzf\" (UniqueName: \"kubernetes.io/projected/3dda195a-5e85-4304-a192-8af66801e3e5-kube-api-access-8mpzf\") pod \"telemeter-client-f9bf64dc9-4xqt7\" (UID: \"3dda195a-5e85-4304-a192-8af66801e3e5\") " pod="openshift-monitoring/telemeter-client-f9bf64dc9-4xqt7" Apr 16 16:49:25.494916 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.494269 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dda195a-5e85-4304-a192-8af66801e3e5-telemeter-trusted-ca-bundle\") pod \"telemeter-client-f9bf64dc9-4xqt7\" (UID: \"3dda195a-5e85-4304-a192-8af66801e3e5\") " pod="openshift-monitoring/telemeter-client-f9bf64dc9-4xqt7" Apr 16 16:49:25.494916 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.494621 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3dda195a-5e85-4304-a192-8af66801e3e5-metrics-client-ca\") pod \"telemeter-client-f9bf64dc9-4xqt7\" (UID: \"3dda195a-5e85-4304-a192-8af66801e3e5\") " pod="openshift-monitoring/telemeter-client-f9bf64dc9-4xqt7" Apr 16 16:49:25.500295 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.499861 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dda195a-5e85-4304-a192-8af66801e3e5-serving-certs-ca-bundle\") pod \"telemeter-client-f9bf64dc9-4xqt7\" (UID: \"3dda195a-5e85-4304-a192-8af66801e3e5\") " pod="openshift-monitoring/telemeter-client-f9bf64dc9-4xqt7" Apr 16 16:49:25.501685 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.500497 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3dda195a-5e85-4304-a192-8af66801e3e5-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-f9bf64dc9-4xqt7\" (UID: \"3dda195a-5e85-4304-a192-8af66801e3e5\") " pod="openshift-monitoring/telemeter-client-f9bf64dc9-4xqt7" Apr 16 16:49:25.502195 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.502154 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/3dda195a-5e85-4304-a192-8af66801e3e5-federate-client-tls\") pod \"telemeter-client-f9bf64dc9-4xqt7\" (UID: \"3dda195a-5e85-4304-a192-8af66801e3e5\") " pod="openshift-monitoring/telemeter-client-f9bf64dc9-4xqt7" Apr 16 16:49:25.502283 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.502266 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/3dda195a-5e85-4304-a192-8af66801e3e5-secret-telemeter-client\") pod 
\"telemeter-client-f9bf64dc9-4xqt7\" (UID: \"3dda195a-5e85-4304-a192-8af66801e3e5\") " pod="openshift-monitoring/telemeter-client-f9bf64dc9-4xqt7" Apr 16 16:49:25.504996 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.504973 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mpzf\" (UniqueName: \"kubernetes.io/projected/3dda195a-5e85-4304-a192-8af66801e3e5-kube-api-access-8mpzf\") pod \"telemeter-client-f9bf64dc9-4xqt7\" (UID: \"3dda195a-5e85-4304-a192-8af66801e3e5\") " pod="openshift-monitoring/telemeter-client-f9bf64dc9-4xqt7" Apr 16 16:49:25.507263 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.507224 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/3dda195a-5e85-4304-a192-8af66801e3e5-telemeter-client-tls\") pod \"telemeter-client-f9bf64dc9-4xqt7\" (UID: \"3dda195a-5e85-4304-a192-8af66801e3e5\") " pod="openshift-monitoring/telemeter-client-f9bf64dc9-4xqt7" Apr 16 16:49:25.507350 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.507320 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7zrbv" event={"ID":"6db3b019-e54c-4966-83dd-cc7619e3f221","Type":"ContainerStarted","Data":"2823f16a30deedc5e3585c1065bdec181800ffc330294620ed81722cccd487c1"} Apr 16 16:49:25.508486 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.508460 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-79468b9bf5-qh26g" event={"ID":"ba3481af-decf-4c56-b396-527c1e009a30","Type":"ContainerStarted","Data":"f30713dfde7ae4da32d28ff5b04be3157f2a461b5696f1be0de67ef630c9ef33"} Apr 16 16:49:25.511177 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.511071 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-j58zk" 
event={"ID":"2224a16f-0022-488d-a462-4d5ab10cb7bb","Type":"ContainerStarted","Data":"78b4ca7465fcacfca51874d758b075fd0a9eb09cb67ec07e0c6f73ca2e66381f"} Apr 16 16:49:25.565784 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.565761 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-f9bf64dc9-4xqt7" Apr 16 16:49:25.734073 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:25.733976 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-f9bf64dc9-4xqt7"] Apr 16 16:49:25.765069 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:49:25.765034 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dda195a_5e85_4304_a192_8af66801e3e5.slice/crio-27edce2fd6936a4b8909f5a9dafa89f2691c61b1a0051f381d5d60b71a9edd73 WatchSource:0}: Error finding container 27edce2fd6936a4b8909f5a9dafa89f2691c61b1a0051f381d5d60b71a9edd73: Status 404 returned error can't find the container with id 27edce2fd6936a4b8909f5a9dafa89f2691c61b1a0051f381d5d60b71a9edd73 Apr 16 16:49:26.291333 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.291294 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 16:49:26.310777 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.310707 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 16:49:26.310949 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.310882 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.313947 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.313809 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 16:49:26.314199 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.314175 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 16:49:26.314627 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.314183 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 16:49:26.314627 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.314395 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 16:49:26.314627 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.314547 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 16:49:26.314627 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.314612 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 16:49:26.314627 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.314619 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 16:49:26.314899 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.314755 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 16:49:26.314899 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.314860 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 
16:49:26.316803 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.315807 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-5cjngfjhoetgr\"" Apr 16 16:49:26.316803 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.316067 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 16:49:26.316803 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.316349 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-jnhsj\"" Apr 16 16:49:26.317481 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.317319 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 16:49:26.321910 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.321676 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 16:49:26.404009 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.403255 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/197ed074-ec82-48e8-9e15-b998bc2e8e53-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.404009 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.403526 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.404009 
ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.403576 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.404009 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.403698 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/197ed074-ec82-48e8-9e15-b998bc2e8e53-config-out\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.404009 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.403733 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-web-config\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.404009 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.403773 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.404009 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.403817 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.404009 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.403850 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9spjt\" (UniqueName: \"kubernetes.io/projected/197ed074-ec82-48e8-9e15-b998bc2e8e53-kube-api-access-9spjt\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.404009 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.403881 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.404009 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.403921 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/197ed074-ec82-48e8-9e15-b998bc2e8e53-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.404009 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.403951 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-config\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.404009 ip-10-0-143-10 kubenswrapper[2574]: I0416 
16:49:26.403983 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.404009 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.404023 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.404729 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.404068 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/197ed074-ec82-48e8-9e15-b998bc2e8e53-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.404729 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.404098 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/197ed074-ec82-48e8-9e15-b998bc2e8e53-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.404729 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.404147 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/197ed074-ec82-48e8-9e15-b998bc2e8e53-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.404729 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.404209 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/197ed074-ec82-48e8-9e15-b998bc2e8e53-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.404729 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.404238 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/197ed074-ec82-48e8-9e15-b998bc2e8e53-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.506119 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.505607 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/197ed074-ec82-48e8-9e15-b998bc2e8e53-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.506119 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.505678 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.506119 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.505854 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.506119 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.505891 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/197ed074-ec82-48e8-9e15-b998bc2e8e53-config-out\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.506119 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.505921 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-web-config\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.506119 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.505957 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.506119 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.505992 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.506119 ip-10-0-143-10 kubenswrapper[2574]: 
I0416 16:49:26.506015 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9spjt\" (UniqueName: \"kubernetes.io/projected/197ed074-ec82-48e8-9e15-b998bc2e8e53-kube-api-access-9spjt\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.506119 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.506041 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.506119 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.506074 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/197ed074-ec82-48e8-9e15-b998bc2e8e53-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.506119 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.506099 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-config\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.506758 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.506171 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.506758 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.506207 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.506758 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.506244 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/197ed074-ec82-48e8-9e15-b998bc2e8e53-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.506758 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.506267 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/197ed074-ec82-48e8-9e15-b998bc2e8e53-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.506758 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.506311 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/197ed074-ec82-48e8-9e15-b998bc2e8e53-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.506758 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.506360 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/197ed074-ec82-48e8-9e15-b998bc2e8e53-tls-assets\") pod 
\"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.506758 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.506385 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/197ed074-ec82-48e8-9e15-b998bc2e8e53-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.510206 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.509567 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/197ed074-ec82-48e8-9e15-b998bc2e8e53-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.510206 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.510168 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/197ed074-ec82-48e8-9e15-b998bc2e8e53-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.510562 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.510399 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/197ed074-ec82-48e8-9e15-b998bc2e8e53-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.511341 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.511315 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/197ed074-ec82-48e8-9e15-b998bc2e8e53-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.518980 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.518105 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-4db9b" event={"ID":"6484e384-9c84-470e-869d-55dd04279464","Type":"ContainerStarted","Data":"2d66597278a74bc6e548c9c12de3932dde3b2b5e11fc4e47271dbf846fda4413"} Apr 16 16:49:26.518980 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.518139 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-4db9b" event={"ID":"6484e384-9c84-470e-869d-55dd04279464","Type":"ContainerStarted","Data":"290d656bae54816f27025539aa01267114732579c4f582f1a219ab5dee41f7e3"} Apr 16 16:49:26.518980 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.518153 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-4db9b" event={"ID":"6484e384-9c84-470e-869d-55dd04279464","Type":"ContainerStarted","Data":"41f4d71bf0687f9a9fa3b8a57952250605f1725b49dd00739b7a85f1a2cf4fde"} Apr 16 16:49:26.526266 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.526049 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/197ed074-ec82-48e8-9e15-b998bc2e8e53-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.529615 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.528984 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-z9p6c" 
event={"ID":"fcc69899-6610-4ce7-803b-eaaaaa23fab8","Type":"ContainerStarted","Data":"e9b35966589172b5d812eaf5cb9324634e4498ba756b003e3f22a486e51d21c2"} Apr 16 16:49:26.532542 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.532177 2574 generic.go:358] "Generic (PLEG): container finished" podID="6db3b019-e54c-4966-83dd-cc7619e3f221" containerID="2823f16a30deedc5e3585c1065bdec181800ffc330294620ed81722cccd487c1" exitCode=0 Apr 16 16:49:26.532542 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.532255 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7zrbv" event={"ID":"6db3b019-e54c-4966-83dd-cc7619e3f221","Type":"ContainerDied","Data":"2823f16a30deedc5e3585c1065bdec181800ffc330294620ed81722cccd487c1"} Apr 16 16:49:26.536678 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.536453 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-848496fcb-44nmq" event={"ID":"5720f2ab-ce1b-4958-97fb-6899db6301a3","Type":"ContainerStarted","Data":"8ef64b229c3dde274db9bfc9996b2e53a022be5007f3f6c396535bdcfd4b7068"} Apr 16 16:49:26.536678 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.536486 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-848496fcb-44nmq" event={"ID":"5720f2ab-ce1b-4958-97fb-6899db6301a3","Type":"ContainerStarted","Data":"faff6266dd3dcbce0380e8f5552cac33bc68a7319910ee2f91f441a2d6f1ead4"} Apr 16 16:49:26.536678 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.536499 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-848496fcb-44nmq" event={"ID":"5720f2ab-ce1b-4958-97fb-6899db6301a3","Type":"ContainerStarted","Data":"eb12af490ce354901033cc8f78f84cd7a14c8e8a1d0316fdda9c755bafd49589"} Apr 16 16:49:26.539729 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.538616 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/kube-state-metrics-7479c89684-4db9b" podStartSLOduration=1.7503812760000002 podStartE2EDuration="6.538600917s" podCreationTimestamp="2026-04-16 16:49:20 +0000 UTC" firstStartedPulling="2026-04-16 16:49:20.494798572 +0000 UTC m=+42.777824141" lastFinishedPulling="2026-04-16 16:49:25.283018224 +0000 UTC m=+47.566043782" observedRunningTime="2026-04-16 16:49:26.537699345 +0000 UTC m=+48.820724923" watchObservedRunningTime="2026-04-16 16:49:26.538600917 +0000 UTC m=+48.821626495" Apr 16 16:49:26.541277 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.541213 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2baf6227-4885-49ec-a971-4601fc362ec0","Type":"ContainerDied","Data":"0cf82074f1ff0b09f0dc6e58bf68fe362d3e78c848c42c434e509f67b496dc36"} Apr 16 16:49:26.541449 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.541385 2574 generic.go:358] "Generic (PLEG): container finished" podID="2baf6227-4885-49ec-a971-4601fc362ec0" containerID="0cf82074f1ff0b09f0dc6e58bf68fe362d3e78c848c42c434e509f67b496dc36" exitCode=0 Apr 16 16:49:26.543135 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.543074 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/197ed074-ec82-48e8-9e15-b998bc2e8e53-config-out\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.543368 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.543343 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-f9bf64dc9-4xqt7" event={"ID":"3dda195a-5e85-4304-a192-8af66801e3e5","Type":"ContainerStarted","Data":"27edce2fd6936a4b8909f5a9dafa89f2691c61b1a0051f381d5d60b71a9edd73"} Apr 16 16:49:26.547171 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.547119 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-config\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.557517 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.557484 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.559545 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.559500 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/197ed074-ec82-48e8-9e15-b998bc2e8e53-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.566634 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.566610 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.578087 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.578044 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.581959 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.581609 2574 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-monitoring/openshift-state-metrics-5669946b84-z9p6c" podStartSLOduration=4.6390662240000005 podStartE2EDuration="7.581594272s" podCreationTimestamp="2026-04-16 16:49:19 +0000 UTC" firstStartedPulling="2026-04-16 16:49:22.364452176 +0000 UTC m=+44.647477734" lastFinishedPulling="2026-04-16 16:49:25.306980225 +0000 UTC m=+47.590005782" observedRunningTime="2026-04-16 16:49:26.557045497 +0000 UTC m=+48.840071079" watchObservedRunningTime="2026-04-16 16:49:26.581594272 +0000 UTC m=+48.864619850" Apr 16 16:49:26.587264 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.587220 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/197ed074-ec82-48e8-9e15-b998bc2e8e53-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.587509 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.587487 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.587773 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.587754 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-web-config\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.594947 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.594905 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-secret-kube-rbac-proxy\") pod 
\"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.595042 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.595000 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.595148 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.595053 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.595263 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.595238 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9spjt\" (UniqueName: \"kubernetes.io/projected/197ed074-ec82-48e8-9e15-b998bc2e8e53-kube-api-access-9spjt\") pod \"prometheus-k8s-0\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:26.624619 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:26.624592 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:28.280928 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:28.280897 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6b7857647d-xnzkf" Apr 16 16:49:28.280928 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:28.280939 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6b7857647d-xnzkf" Apr 16 16:49:28.287367 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:28.287343 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6b7857647d-xnzkf" Apr 16 16:49:28.552000 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:28.551978 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6b7857647d-xnzkf" Apr 16 16:49:28.786285 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:28.786265 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 16:49:29.092468 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:49:29.092424 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod197ed074_ec82_48e8_9e15_b998bc2e8e53.slice/crio-10e122c116ea0c8478ae1b2000ea1608f196e51b5abe9e5400b501b81047fa5b WatchSource:0}: Error finding container 10e122c116ea0c8478ae1b2000ea1608f196e51b5abe9e5400b501b81047fa5b: Status 404 returned error can't find the container with id 10e122c116ea0c8478ae1b2000ea1608f196e51b5abe9e5400b501b81047fa5b Apr 16 16:49:29.466085 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:29.466054 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-j89fm" Apr 16 16:49:29.554414 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:29.554380 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/node-exporter-7zrbv" event={"ID":"6db3b019-e54c-4966-83dd-cc7619e3f221","Type":"ContainerStarted","Data":"2083ba5343becc282b4ea58c68e1dae20a82c15a48332da8f40f552bd611a6ab"} Apr 16 16:49:29.554529 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:29.554422 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7zrbv" event={"ID":"6db3b019-e54c-4966-83dd-cc7619e3f221","Type":"ContainerStarted","Data":"07057c9f369c50c6906d4cd45332be4bb9666c5842621986eb7890cfc23eb7dd"} Apr 16 16:49:29.557600 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:29.557556 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-848496fcb-44nmq" event={"ID":"5720f2ab-ce1b-4958-97fb-6899db6301a3","Type":"ContainerStarted","Data":"9c2778f93c890c8c8032ece7b7fc3b1402778748965760c333dd11d7e593217c"} Apr 16 16:49:29.557600 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:29.557584 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-848496fcb-44nmq" event={"ID":"5720f2ab-ce1b-4958-97fb-6899db6301a3","Type":"ContainerStarted","Data":"cb397d22af39ed7997b0b1874ad325ace2dd8dd6f277789c3a9b04fe9c539e71"} Apr 16 16:49:29.557600 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:29.557598 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-848496fcb-44nmq" event={"ID":"5720f2ab-ce1b-4958-97fb-6899db6301a3","Type":"ContainerStarted","Data":"b958793b726db7615cbaa8d8a99ac0f832e1ff6430f578a5fa234bd2a7c9c64d"} Apr 16 16:49:29.557843 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:29.557789 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-848496fcb-44nmq" Apr 16 16:49:29.559037 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:29.558998 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-79468b9bf5-qh26g" 
event={"ID":"ba3481af-decf-4c56-b396-527c1e009a30","Type":"ContainerStarted","Data":"a6452692f1a448d713d5350522bd0e186e39d92467ae0e708691ddfdb6b57a86"} Apr 16 16:49:29.560531 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:29.560513 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2baf6227-4885-49ec-a971-4601fc362ec0","Type":"ContainerStarted","Data":"f1ac0b213d882e09b7b98ca3a67cadd9ea4fc54267fed79e168a0682ab2478cf"} Apr 16 16:49:29.561689 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:29.561565 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-j58zk" event={"ID":"2224a16f-0022-488d-a462-4d5ab10cb7bb","Type":"ContainerStarted","Data":"a652b6e75ccc22135a0d794fa0fa835d18d12d1c888537cf52c69f195a020e65"} Apr 16 16:49:29.562644 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:29.562505 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-j58zk" Apr 16 16:49:29.564488 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:29.564467 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-f9bf64dc9-4xqt7" event={"ID":"3dda195a-5e85-4304-a192-8af66801e3e5","Type":"ContainerStarted","Data":"5ad28d8bab6373ed5803962b9fcf976b054cc79df6797941dcd751cf5ab8dcc6"} Apr 16 16:49:29.564585 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:29.564496 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-f9bf64dc9-4xqt7" event={"ID":"3dda195a-5e85-4304-a192-8af66801e3e5","Type":"ContainerStarted","Data":"5c05ff60fc3273fdb64e61c50b7e67343118da18ee9e65f011e815f25500ca39"} Apr 16 16:49:29.564585 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:29.564511 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-f9bf64dc9-4xqt7" 
event={"ID":"3dda195a-5e85-4304-a192-8af66801e3e5","Type":"ContainerStarted","Data":"3ff0fec56b6fdfc2a6c55af409f584996ccc668bbc86a56c858c11761c9ef4ca"} Apr 16 16:49:29.566141 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:29.566120 2574 generic.go:358] "Generic (PLEG): container finished" podID="197ed074-ec82-48e8-9e15-b998bc2e8e53" containerID="9ae52ad2e3f5c5663dcd066a71080fac2564d53bd9347a0b84b3aeea77870c8a" exitCode=0 Apr 16 16:49:29.566490 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:29.566471 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"197ed074-ec82-48e8-9e15-b998bc2e8e53","Type":"ContainerDied","Data":"9ae52ad2e3f5c5663dcd066a71080fac2564d53bd9347a0b84b3aeea77870c8a"} Apr 16 16:49:29.566566 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:29.566500 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"197ed074-ec82-48e8-9e15-b998bc2e8e53","Type":"ContainerStarted","Data":"10e122c116ea0c8478ae1b2000ea1608f196e51b5abe9e5400b501b81047fa5b"} Apr 16 16:49:29.567432 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:29.567411 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-j58zk" Apr 16 16:49:29.574079 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:29.574042 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-7zrbv" podStartSLOduration=6.177266049 podStartE2EDuration="9.574030385s" podCreationTimestamp="2026-04-16 16:49:20 +0000 UTC" firstStartedPulling="2026-04-16 16:49:21.886267317 +0000 UTC m=+44.169292878" lastFinishedPulling="2026-04-16 16:49:25.283031651 +0000 UTC m=+47.566057214" observedRunningTime="2026-04-16 16:49:29.572805527 +0000 UTC m=+51.855831103" watchObservedRunningTime="2026-04-16 16:49:29.574030385 +0000 UTC m=+51.857055962" Apr 16 16:49:29.621693 ip-10-0-143-10 
kubenswrapper[2574]: I0416 16:49:29.621505 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-79468b9bf5-qh26g" podStartSLOduration=2.427175073 podStartE2EDuration="5.621490523s" podCreationTimestamp="2026-04-16 16:49:24 +0000 UTC" firstStartedPulling="2026-04-16 16:49:25.466279275 +0000 UTC m=+47.749304845" lastFinishedPulling="2026-04-16 16:49:28.660594725 +0000 UTC m=+50.943620295" observedRunningTime="2026-04-16 16:49:29.620079081 +0000 UTC m=+51.903104660" watchObservedRunningTime="2026-04-16 16:49:29.621490523 +0000 UTC m=+51.904516099" Apr 16 16:49:29.643109 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:29.643061 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-848496fcb-44nmq" podStartSLOduration=1.5481732369999999 podStartE2EDuration="7.643044319s" podCreationTimestamp="2026-04-16 16:49:22 +0000 UTC" firstStartedPulling="2026-04-16 16:49:22.566474409 +0000 UTC m=+44.849499964" lastFinishedPulling="2026-04-16 16:49:28.661345476 +0000 UTC m=+50.944371046" observedRunningTime="2026-04-16 16:49:29.640353378 +0000 UTC m=+51.923378951" watchObservedRunningTime="2026-04-16 16:49:29.643044319 +0000 UTC m=+51.926069897" Apr 16 16:49:29.656799 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:29.656752 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-j58zk" podStartSLOduration=2.472105693 podStartE2EDuration="5.656735371s" podCreationTimestamp="2026-04-16 16:49:24 +0000 UTC" firstStartedPulling="2026-04-16 16:49:25.475408812 +0000 UTC m=+47.758434371" lastFinishedPulling="2026-04-16 16:49:28.66003848 +0000 UTC m=+50.943064049" observedRunningTime="2026-04-16 16:49:29.655138422 +0000 UTC m=+51.938164000" watchObservedRunningTime="2026-04-16 16:49:29.656735371 +0000 UTC m=+51.939760951" Apr 16 16:49:29.677458 ip-10-0-143-10 kubenswrapper[2574]: I0416 
16:49:29.677410 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-f9bf64dc9-4xqt7" podStartSLOduration=1.78434013 podStartE2EDuration="4.677393845s" podCreationTimestamp="2026-04-16 16:49:25 +0000 UTC" firstStartedPulling="2026-04-16 16:49:25.767542748 +0000 UTC m=+48.050568318" lastFinishedPulling="2026-04-16 16:49:28.660596456 +0000 UTC m=+50.943622033" observedRunningTime="2026-04-16 16:49:29.675982272 +0000 UTC m=+51.959007852" watchObservedRunningTime="2026-04-16 16:49:29.677393845 +0000 UTC m=+51.960419424" Apr 16 16:49:30.574157 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:30.574114 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2baf6227-4885-49ec-a971-4601fc362ec0","Type":"ContainerStarted","Data":"327963c4f11c3522af19b95adc5ea71d10c4092d38e63ff0db5f493b6c345d88"} Apr 16 16:49:30.574686 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:30.574164 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2baf6227-4885-49ec-a971-4601fc362ec0","Type":"ContainerStarted","Data":"d9056ed942012456225c30e9619979ecc5b3be00639903d39816baf6ef78803c"} Apr 16 16:49:30.574686 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:30.574178 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2baf6227-4885-49ec-a971-4601fc362ec0","Type":"ContainerStarted","Data":"2359a923ef3b4ece9373f7c0a534ca740a886ac659c40aa6b5d4c7b4f3283ed2"} Apr 16 16:49:30.574686 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:30.574192 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2baf6227-4885-49ec-a971-4601fc362ec0","Type":"ContainerStarted","Data":"ecfe4a1955e77f02e6775ac9b7087028d085a80477cc5481836ab81c2d5291f1"} Apr 16 16:49:30.574686 ip-10-0-143-10 kubenswrapper[2574]: I0416 
16:49:30.574207 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2baf6227-4885-49ec-a971-4601fc362ec0","Type":"ContainerStarted","Data":"5410f25e0aeb0840b718dda47ca5d6699c76919ab2c6c99040bb4eba3be10dc0"} Apr 16 16:49:30.581339 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:30.581318 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-848496fcb-44nmq" Apr 16 16:49:30.602422 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:30.602293 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.503640014 podStartE2EDuration="9.602280313s" podCreationTimestamp="2026-04-16 16:49:21 +0000 UTC" firstStartedPulling="2026-04-16 16:49:22.275949664 +0000 UTC m=+44.558975220" lastFinishedPulling="2026-04-16 16:49:29.374589949 +0000 UTC m=+51.657615519" observedRunningTime="2026-04-16 16:49:30.60127857 +0000 UTC m=+52.884304147" watchObservedRunningTime="2026-04-16 16:49:30.602280313 +0000 UTC m=+52.885305889" Apr 16 16:49:33.587108 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:33.587074 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"197ed074-ec82-48e8-9e15-b998bc2e8e53","Type":"ContainerStarted","Data":"7773ed2eb8520f7cd9a4c26847b9905dc5b62d21dbf6abe5e4e4b8f692efc887"} Apr 16 16:49:33.587108 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:33.587110 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"197ed074-ec82-48e8-9e15-b998bc2e8e53","Type":"ContainerStarted","Data":"3a715a7f1f2f89fb0680edfcc0993b777dff04c2b038a06ee2a6ad4162a3cebf"} Apr 16 16:49:33.587592 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:33.587119 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"197ed074-ec82-48e8-9e15-b998bc2e8e53","Type":"ContainerStarted","Data":"2859051e617ad0483565c550d3ec7cddca22d6a24a3eda2e1792f2ac5189428f"} Apr 16 16:49:33.587592 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:33.587129 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"197ed074-ec82-48e8-9e15-b998bc2e8e53","Type":"ContainerStarted","Data":"0bf0b314a40a882dbf1476b0766478e8528e198dc29c08bd787a60bd945b31cf"} Apr 16 16:49:33.587592 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:33.587136 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"197ed074-ec82-48e8-9e15-b998bc2e8e53","Type":"ContainerStarted","Data":"2b569a7eb68848ef627c35c9ce57f3021e090d691f68fc192a0b753b9ae2533f"} Apr 16 16:49:33.587592 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:33.587144 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"197ed074-ec82-48e8-9e15-b998bc2e8e53","Type":"ContainerStarted","Data":"59f458f77bee271a309522c90b8cf37c4fdb04878cbd21ac4f491d26f08d6bb5"} Apr 16 16:49:33.619882 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:33.619834 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=4.607721104 podStartE2EDuration="7.619818645s" podCreationTimestamp="2026-04-16 16:49:26 +0000 UTC" firstStartedPulling="2026-04-16 16:49:29.568265873 +0000 UTC m=+51.851291429" lastFinishedPulling="2026-04-16 16:49:32.580363413 +0000 UTC m=+54.863388970" observedRunningTime="2026-04-16 16:49:33.616923046 +0000 UTC m=+55.899948624" watchObservedRunningTime="2026-04-16 16:49:33.619818645 +0000 UTC m=+55.902844261" Apr 16 16:49:36.420276 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:36.420249 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2fwqq" Apr 16 
16:49:36.625476 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:36.625441 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:49:42.107674 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:42.107627 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6b7857647d-xnzkf"] Apr 16 16:49:43.967182 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:43.967138 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c7749f8-7b64-4062-9bca-90c0826a9692-metrics-certs\") pod \"network-metrics-daemon-62nv6\" (UID: \"7c7749f8-7b64-4062-9bca-90c0826a9692\") " pod="openshift-multus/network-metrics-daemon-62nv6" Apr 16 16:49:43.970463 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:43.970439 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 16:49:43.981476 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:43.981454 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c7749f8-7b64-4062-9bca-90c0826a9692-metrics-certs\") pod \"network-metrics-daemon-62nv6\" (UID: \"7c7749f8-7b64-4062-9bca-90c0826a9692\") " pod="openshift-multus/network-metrics-daemon-62nv6" Apr 16 16:49:44.168292 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:44.168259 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hsw8x\" (UniqueName: \"kubernetes.io/projected/0b5fcafd-70a7-4e76-ba7a-022cfee37811-kube-api-access-hsw8x\") pod \"network-check-target-cqm4l\" (UID: \"0b5fcafd-70a7-4e76-ba7a-022cfee37811\") " pod="openshift-network-diagnostics/network-check-target-cqm4l" Apr 16 16:49:44.171748 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:44.171726 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 16:49:44.181527 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:44.181502 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-v7sx7\"" Apr 16 16:49:44.181619 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:44.181539 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 16:49:44.189329 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:44.189308 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-62nv6" Apr 16 16:49:44.191531 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:44.191501 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsw8x\" (UniqueName: \"kubernetes.io/projected/0b5fcafd-70a7-4e76-ba7a-022cfee37811-kube-api-access-hsw8x\") pod \"network-check-target-cqm4l\" (UID: \"0b5fcafd-70a7-4e76-ba7a-022cfee37811\") " pod="openshift-network-diagnostics/network-check-target-cqm4l" Apr 16 16:49:44.286174 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:44.286112 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-mkc4m\"" Apr 16 16:49:44.293688 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:44.293649 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cqm4l" Apr 16 16:49:44.303579 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:44.303550 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-62nv6"] Apr 16 16:49:44.306916 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:49:44.306890 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c7749f8_7b64_4062_9bca_90c0826a9692.slice/crio-c3e33384a8d4a27dae21c166677df6dc29ba95f6e6856bc0d1aac08441b858c7 WatchSource:0}: Error finding container c3e33384a8d4a27dae21c166677df6dc29ba95f6e6856bc0d1aac08441b858c7: Status 404 returned error can't find the container with id c3e33384a8d4a27dae21c166677df6dc29ba95f6e6856bc0d1aac08441b858c7 Apr 16 16:49:44.408034 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:44.408014 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-cqm4l"] Apr 16 16:49:44.409990 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:49:44.409957 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b5fcafd_70a7_4e76_ba7a_022cfee37811.slice/crio-b3a76330da6e279cfef6a189ffe423980e63fb89603b1d3b3e7e4e5e1d3209b1 WatchSource:0}: Error finding container b3a76330da6e279cfef6a189ffe423980e63fb89603b1d3b3e7e4e5e1d3209b1: Status 404 returned error can't find the container with id b3a76330da6e279cfef6a189ffe423980e63fb89603b1d3b3e7e4e5e1d3209b1 Apr 16 16:49:44.623992 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:44.623956 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-62nv6" event={"ID":"7c7749f8-7b64-4062-9bca-90c0826a9692","Type":"ContainerStarted","Data":"c3e33384a8d4a27dae21c166677df6dc29ba95f6e6856bc0d1aac08441b858c7"} Apr 16 16:49:44.624868 ip-10-0-143-10 kubenswrapper[2574]: I0416 
16:49:44.624841 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-cqm4l" event={"ID":"0b5fcafd-70a7-4e76-ba7a-022cfee37811","Type":"ContainerStarted","Data":"b3a76330da6e279cfef6a189ffe423980e63fb89603b1d3b3e7e4e5e1d3209b1"} Apr 16 16:49:44.854068 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:44.854034 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-79468b9bf5-qh26g" Apr 16 16:49:44.854300 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:44.854084 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-79468b9bf5-qh26g" Apr 16 16:49:46.637214 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:46.637173 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-62nv6" event={"ID":"7c7749f8-7b64-4062-9bca-90c0826a9692","Type":"ContainerStarted","Data":"991fc5c27372df56dd44e9af2b0d56de1b74e64d2883cf2845a1a7515e248d67"} Apr 16 16:49:46.637214 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:46.637218 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-62nv6" event={"ID":"7c7749f8-7b64-4062-9bca-90c0826a9692","Type":"ContainerStarted","Data":"ff357d201b1c2e440bbbadfa5802a7b1b27c80f48e824dac4d4695aae8e88be6"} Apr 16 16:49:46.661413 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:46.661365 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-62nv6" podStartSLOduration=67.244740348 podStartE2EDuration="1m8.661349688s" podCreationTimestamp="2026-04-16 16:48:38 +0000 UTC" firstStartedPulling="2026-04-16 16:49:44.309148247 +0000 UTC m=+66.592173802" lastFinishedPulling="2026-04-16 16:49:45.725757572 +0000 UTC m=+68.008783142" observedRunningTime="2026-04-16 16:49:46.659130197 +0000 UTC m=+68.942155775" watchObservedRunningTime="2026-04-16 
16:49:46.661349688 +0000 UTC m=+68.944375265" Apr 16 16:49:48.649297 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:48.649258 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-cqm4l" event={"ID":"0b5fcafd-70a7-4e76-ba7a-022cfee37811","Type":"ContainerStarted","Data":"f2d2634d6e8b8b97aeda7c601316a8590cd7694fbd363e215cfe17447a68b191"} Apr 16 16:49:48.649717 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:48.649377 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-cqm4l" Apr 16 16:49:48.669221 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:49:48.669173 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-cqm4l" podStartSLOduration=67.503423146 podStartE2EDuration="1m10.66916096s" podCreationTimestamp="2026-04-16 16:48:38 +0000 UTC" firstStartedPulling="2026-04-16 16:49:44.411858903 +0000 UTC m=+66.694884462" lastFinishedPulling="2026-04-16 16:49:47.577596717 +0000 UTC m=+69.860622276" observedRunningTime="2026-04-16 16:49:48.667486906 +0000 UTC m=+70.950512483" watchObservedRunningTime="2026-04-16 16:49:48.66916096 +0000 UTC m=+70.952186561" Apr 16 16:50:04.859979 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:04.859950 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-79468b9bf5-qh26g" Apr 16 16:50:04.863641 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:04.863619 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-79468b9bf5-qh26g" Apr 16 16:50:06.440977 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:06.440946 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:06.458838 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:06.458814 2574 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:06.639948 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:06.639920 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:07.131666 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:07.131600 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6b7857647d-xnzkf" podUID="ecad2307-e422-4189-8656-d67ce0f7448b" containerName="console" containerID="cri-o://b6269a23874ba7b693fe913d83c5293ce8cc43b6e262f4987f3acb25e7a9d8ab" gracePeriod=15 Apr 16 16:50:07.364170 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:07.364151 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6b7857647d-xnzkf_ecad2307-e422-4189-8656-d67ce0f7448b/console/0.log" Apr 16 16:50:07.364275 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:07.364209 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6b7857647d-xnzkf" Apr 16 16:50:07.481946 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:07.481863 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecad2307-e422-4189-8656-d67ce0f7448b-trusted-ca-bundle\") pod \"ecad2307-e422-4189-8656-d67ce0f7448b\" (UID: \"ecad2307-e422-4189-8656-d67ce0f7448b\") " Apr 16 16:50:07.481946 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:07.481926 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ecad2307-e422-4189-8656-d67ce0f7448b-console-oauth-config\") pod \"ecad2307-e422-4189-8656-d67ce0f7448b\" (UID: \"ecad2307-e422-4189-8656-d67ce0f7448b\") " Apr 16 16:50:07.482412 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:07.481953 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ecad2307-e422-4189-8656-d67ce0f7448b-console-serving-cert\") pod \"ecad2307-e422-4189-8656-d67ce0f7448b\" (UID: \"ecad2307-e422-4189-8656-d67ce0f7448b\") " Apr 16 16:50:07.482412 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:07.481990 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ecad2307-e422-4189-8656-d67ce0f7448b-console-config\") pod \"ecad2307-e422-4189-8656-d67ce0f7448b\" (UID: \"ecad2307-e422-4189-8656-d67ce0f7448b\") " Apr 16 16:50:07.482412 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:07.482029 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prnsz\" (UniqueName: \"kubernetes.io/projected/ecad2307-e422-4189-8656-d67ce0f7448b-kube-api-access-prnsz\") pod \"ecad2307-e422-4189-8656-d67ce0f7448b\" (UID: \"ecad2307-e422-4189-8656-d67ce0f7448b\") " Apr 16 16:50:07.482412 
ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:07.482085 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ecad2307-e422-4189-8656-d67ce0f7448b-oauth-serving-cert\") pod \"ecad2307-e422-4189-8656-d67ce0f7448b\" (UID: \"ecad2307-e422-4189-8656-d67ce0f7448b\") " Apr 16 16:50:07.482412 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:07.482131 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ecad2307-e422-4189-8656-d67ce0f7448b-service-ca\") pod \"ecad2307-e422-4189-8656-d67ce0f7448b\" (UID: \"ecad2307-e422-4189-8656-d67ce0f7448b\") " Apr 16 16:50:07.482703 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:07.482427 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecad2307-e422-4189-8656-d67ce0f7448b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ecad2307-e422-4189-8656-d67ce0f7448b" (UID: "ecad2307-e422-4189-8656-d67ce0f7448b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:50:07.482703 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:07.482455 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecad2307-e422-4189-8656-d67ce0f7448b-console-config" (OuterVolumeSpecName: "console-config") pod "ecad2307-e422-4189-8656-d67ce0f7448b" (UID: "ecad2307-e422-4189-8656-d67ce0f7448b"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:50:07.482703 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:07.482502 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecad2307-e422-4189-8656-d67ce0f7448b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ecad2307-e422-4189-8656-d67ce0f7448b" (UID: "ecad2307-e422-4189-8656-d67ce0f7448b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:50:07.482860 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:07.482788 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecad2307-e422-4189-8656-d67ce0f7448b-service-ca" (OuterVolumeSpecName: "service-ca") pod "ecad2307-e422-4189-8656-d67ce0f7448b" (UID: "ecad2307-e422-4189-8656-d67ce0f7448b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:50:07.484295 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:07.484264 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecad2307-e422-4189-8656-d67ce0f7448b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ecad2307-e422-4189-8656-d67ce0f7448b" (UID: "ecad2307-e422-4189-8656-d67ce0f7448b"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:50:07.484396 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:07.484328 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecad2307-e422-4189-8656-d67ce0f7448b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ecad2307-e422-4189-8656-d67ce0f7448b" (UID: "ecad2307-e422-4189-8656-d67ce0f7448b"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:50:07.484396 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:07.484337 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecad2307-e422-4189-8656-d67ce0f7448b-kube-api-access-prnsz" (OuterVolumeSpecName: "kube-api-access-prnsz") pod "ecad2307-e422-4189-8656-d67ce0f7448b" (UID: "ecad2307-e422-4189-8656-d67ce0f7448b"). InnerVolumeSpecName "kube-api-access-prnsz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:50:07.583787 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:07.583751 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ecad2307-e422-4189-8656-d67ce0f7448b-console-config\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 16 16:50:07.583787 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:07.583782 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-prnsz\" (UniqueName: \"kubernetes.io/projected/ecad2307-e422-4189-8656-d67ce0f7448b-kube-api-access-prnsz\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 16 16:50:07.583787 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:07.583793 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ecad2307-e422-4189-8656-d67ce0f7448b-oauth-serving-cert\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 16 16:50:07.583967 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:07.583803 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ecad2307-e422-4189-8656-d67ce0f7448b-service-ca\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 16 16:50:07.583967 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:07.583811 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/ecad2307-e422-4189-8656-d67ce0f7448b-trusted-ca-bundle\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 16 16:50:07.583967 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:07.583821 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ecad2307-e422-4189-8656-d67ce0f7448b-console-oauth-config\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 16 16:50:07.583967 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:07.583829 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ecad2307-e422-4189-8656-d67ce0f7448b-console-serving-cert\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 16 16:50:07.703930 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:07.703903 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6b7857647d-xnzkf_ecad2307-e422-4189-8656-d67ce0f7448b/console/0.log" Apr 16 16:50:07.704105 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:07.703940 2574 generic.go:358] "Generic (PLEG): container finished" podID="ecad2307-e422-4189-8656-d67ce0f7448b" containerID="b6269a23874ba7b693fe913d83c5293ce8cc43b6e262f4987f3acb25e7a9d8ab" exitCode=2 Apr 16 16:50:07.704105 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:07.704021 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6b7857647d-xnzkf" Apr 16 16:50:07.704105 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:07.704029 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b7857647d-xnzkf" event={"ID":"ecad2307-e422-4189-8656-d67ce0f7448b","Type":"ContainerDied","Data":"b6269a23874ba7b693fe913d83c5293ce8cc43b6e262f4987f3acb25e7a9d8ab"} Apr 16 16:50:07.704105 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:07.704062 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b7857647d-xnzkf" event={"ID":"ecad2307-e422-4189-8656-d67ce0f7448b","Type":"ContainerDied","Data":"109b91111ab712631d423c54e483e39c4bd5748b69e8b48706eacae7d962bf3c"} Apr 16 16:50:07.704105 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:07.704078 2574 scope.go:117] "RemoveContainer" containerID="b6269a23874ba7b693fe913d83c5293ce8cc43b6e262f4987f3acb25e7a9d8ab" Apr 16 16:50:07.712459 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:07.712442 2574 scope.go:117] "RemoveContainer" containerID="b6269a23874ba7b693fe913d83c5293ce8cc43b6e262f4987f3acb25e7a9d8ab" Apr 16 16:50:07.712720 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:50:07.712702 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6269a23874ba7b693fe913d83c5293ce8cc43b6e262f4987f3acb25e7a9d8ab\": container with ID starting with b6269a23874ba7b693fe913d83c5293ce8cc43b6e262f4987f3acb25e7a9d8ab not found: ID does not exist" containerID="b6269a23874ba7b693fe913d83c5293ce8cc43b6e262f4987f3acb25e7a9d8ab" Apr 16 16:50:07.712775 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:07.712728 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6269a23874ba7b693fe913d83c5293ce8cc43b6e262f4987f3acb25e7a9d8ab"} err="failed to get container status \"b6269a23874ba7b693fe913d83c5293ce8cc43b6e262f4987f3acb25e7a9d8ab\": rpc error: code = 
NotFound desc = could not find container \"b6269a23874ba7b693fe913d83c5293ce8cc43b6e262f4987f3acb25e7a9d8ab\": container with ID starting with b6269a23874ba7b693fe913d83c5293ce8cc43b6e262f4987f3acb25e7a9d8ab not found: ID does not exist" Apr 16 16:50:07.726626 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:07.726604 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6b7857647d-xnzkf"] Apr 16 16:50:07.729976 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:07.729955 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6b7857647d-xnzkf"] Apr 16 16:50:08.274585 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:08.274547 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecad2307-e422-4189-8656-d67ce0f7448b" path="/var/lib/kubelet/pods/ecad2307-e422-4189-8656-d67ce0f7448b/volumes" Apr 16 16:50:19.655247 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:19.655218 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-cqm4l" Apr 16 16:50:20.500314 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:20.500283 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 16:50:20.500910 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:20.500879 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2baf6227-4885-49ec-a971-4601fc362ec0" containerName="alertmanager" containerID="cri-o://f1ac0b213d882e09b7b98ca3a67cadd9ea4fc54267fed79e168a0682ab2478cf" gracePeriod=120 Apr 16 16:50:20.501018 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:20.500931 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2baf6227-4885-49ec-a971-4601fc362ec0" containerName="kube-rbac-proxy-metric" 
containerID="cri-o://d9056ed942012456225c30e9619979ecc5b3be00639903d39816baf6ef78803c" gracePeriod=120 Apr 16 16:50:20.501018 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:20.500945 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2baf6227-4885-49ec-a971-4601fc362ec0" containerName="prom-label-proxy" containerID="cri-o://327963c4f11c3522af19b95adc5ea71d10c4092d38e63ff0db5f493b6c345d88" gracePeriod=120 Apr 16 16:50:20.501018 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:20.500975 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2baf6227-4885-49ec-a971-4601fc362ec0" containerName="kube-rbac-proxy-web" containerID="cri-o://ecfe4a1955e77f02e6775ac9b7087028d085a80477cc5481836ab81c2d5291f1" gracePeriod=120 Apr 16 16:50:20.501018 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:20.500994 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2baf6227-4885-49ec-a971-4601fc362ec0" containerName="kube-rbac-proxy" containerID="cri-o://2359a923ef3b4ece9373f7c0a534ca740a886ac659c40aa6b5d4c7b4f3283ed2" gracePeriod=120 Apr 16 16:50:20.501267 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:20.501021 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2baf6227-4885-49ec-a971-4601fc362ec0" containerName="config-reloader" containerID="cri-o://5410f25e0aeb0840b718dda47ca5d6699c76919ab2c6c99040bb4eba3be10dc0" gracePeriod=120 Apr 16 16:50:20.746168 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:20.746138 2574 generic.go:358] "Generic (PLEG): container finished" podID="2baf6227-4885-49ec-a971-4601fc362ec0" containerID="327963c4f11c3522af19b95adc5ea71d10c4092d38e63ff0db5f493b6c345d88" exitCode=0 Apr 16 16:50:20.746168 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:20.746162 
2574 generic.go:358] "Generic (PLEG): container finished" podID="2baf6227-4885-49ec-a971-4601fc362ec0" containerID="d9056ed942012456225c30e9619979ecc5b3be00639903d39816baf6ef78803c" exitCode=0 Apr 16 16:50:20.746168 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:20.746168 2574 generic.go:358] "Generic (PLEG): container finished" podID="2baf6227-4885-49ec-a971-4601fc362ec0" containerID="2359a923ef3b4ece9373f7c0a534ca740a886ac659c40aa6b5d4c7b4f3283ed2" exitCode=0 Apr 16 16:50:20.746168 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:20.746174 2574 generic.go:358] "Generic (PLEG): container finished" podID="2baf6227-4885-49ec-a971-4601fc362ec0" containerID="5410f25e0aeb0840b718dda47ca5d6699c76919ab2c6c99040bb4eba3be10dc0" exitCode=0 Apr 16 16:50:20.746168 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:20.746179 2574 generic.go:358] "Generic (PLEG): container finished" podID="2baf6227-4885-49ec-a971-4601fc362ec0" containerID="f1ac0b213d882e09b7b98ca3a67cadd9ea4fc54267fed79e168a0682ab2478cf" exitCode=0 Apr 16 16:50:20.746668 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:20.746216 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2baf6227-4885-49ec-a971-4601fc362ec0","Type":"ContainerDied","Data":"327963c4f11c3522af19b95adc5ea71d10c4092d38e63ff0db5f493b6c345d88"} Apr 16 16:50:20.746668 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:20.746254 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2baf6227-4885-49ec-a971-4601fc362ec0","Type":"ContainerDied","Data":"d9056ed942012456225c30e9619979ecc5b3be00639903d39816baf6ef78803c"} Apr 16 16:50:20.746668 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:20.746265 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"2baf6227-4885-49ec-a971-4601fc362ec0","Type":"ContainerDied","Data":"2359a923ef3b4ece9373f7c0a534ca740a886ac659c40aa6b5d4c7b4f3283ed2"} Apr 16 16:50:20.746668 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:20.746274 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2baf6227-4885-49ec-a971-4601fc362ec0","Type":"ContainerDied","Data":"5410f25e0aeb0840b718dda47ca5d6699c76919ab2c6c99040bb4eba3be10dc0"} Apr 16 16:50:20.746668 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:20.746283 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2baf6227-4885-49ec-a971-4601fc362ec0","Type":"ContainerDied","Data":"f1ac0b213d882e09b7b98ca3a67cadd9ea4fc54267fed79e168a0682ab2478cf"} Apr 16 16:50:21.737349 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.737323 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:21.750945 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.750921 2574 generic.go:358] "Generic (PLEG): container finished" podID="2baf6227-4885-49ec-a971-4601fc362ec0" containerID="ecfe4a1955e77f02e6775ac9b7087028d085a80477cc5481836ab81c2d5291f1" exitCode=0 Apr 16 16:50:21.751243 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.751002 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2baf6227-4885-49ec-a971-4601fc362ec0","Type":"ContainerDied","Data":"ecfe4a1955e77f02e6775ac9b7087028d085a80477cc5481836ab81c2d5291f1"} Apr 16 16:50:21.751243 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.751042 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:21.751243 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.751045 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2baf6227-4885-49ec-a971-4601fc362ec0","Type":"ContainerDied","Data":"3bf999ee219a35d06892991d09352423b7f52583123d619e28b7d14697fd195b"} Apr 16 16:50:21.751243 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.751067 2574 scope.go:117] "RemoveContainer" containerID="327963c4f11c3522af19b95adc5ea71d10c4092d38e63ff0db5f493b6c345d88" Apr 16 16:50:21.758293 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.758278 2574 scope.go:117] "RemoveContainer" containerID="d9056ed942012456225c30e9619979ecc5b3be00639903d39816baf6ef78803c" Apr 16 16:50:21.764859 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.764838 2574 scope.go:117] "RemoveContainer" containerID="2359a923ef3b4ece9373f7c0a534ca740a886ac659c40aa6b5d4c7b4f3283ed2" Apr 16 16:50:21.773426 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.773409 2574 scope.go:117] "RemoveContainer" containerID="ecfe4a1955e77f02e6775ac9b7087028d085a80477cc5481836ab81c2d5291f1" Apr 16 16:50:21.780511 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.780491 2574 scope.go:117] "RemoveContainer" containerID="5410f25e0aeb0840b718dda47ca5d6699c76919ab2c6c99040bb4eba3be10dc0" Apr 16 16:50:21.787151 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.787136 2574 scope.go:117] "RemoveContainer" containerID="f1ac0b213d882e09b7b98ca3a67cadd9ea4fc54267fed79e168a0682ab2478cf" Apr 16 16:50:21.787455 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.787438 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2baf6227-4885-49ec-a971-4601fc362ec0-config-out\") pod \"2baf6227-4885-49ec-a971-4601fc362ec0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " Apr 16 16:50:21.787538 ip-10-0-143-10 
kubenswrapper[2574]: I0416 16:50:21.787471 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqssm\" (UniqueName: \"kubernetes.io/projected/2baf6227-4885-49ec-a971-4601fc362ec0-kube-api-access-nqssm\") pod \"2baf6227-4885-49ec-a971-4601fc362ec0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " Apr 16 16:50:21.787538 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.787498 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2baf6227-4885-49ec-a971-4601fc362ec0-metrics-client-ca\") pod \"2baf6227-4885-49ec-a971-4601fc362ec0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " Apr 16 16:50:21.787538 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.787525 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-cluster-tls-config\") pod \"2baf6227-4885-49ec-a971-4601fc362ec0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " Apr 16 16:50:21.787712 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.787641 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-secret-alertmanager-kube-rbac-proxy-metric\") pod \"2baf6227-4885-49ec-a971-4601fc362ec0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " Apr 16 16:50:21.787770 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.787716 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-secret-alertmanager-kube-rbac-proxy\") pod \"2baf6227-4885-49ec-a971-4601fc362ec0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " Apr 16 16:50:21.787770 ip-10-0-143-10 kubenswrapper[2574]: 
I0416 16:50:21.787761 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2baf6227-4885-49ec-a971-4601fc362ec0-alertmanager-trusted-ca-bundle\") pod \"2baf6227-4885-49ec-a971-4601fc362ec0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " Apr 16 16:50:21.787880 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.787786 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-secret-alertmanager-kube-rbac-proxy-web\") pod \"2baf6227-4885-49ec-a971-4601fc362ec0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " Apr 16 16:50:21.787880 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.787845 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-web-config\") pod \"2baf6227-4885-49ec-a971-4601fc362ec0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " Apr 16 16:50:21.787984 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.787879 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-secret-alertmanager-main-tls\") pod \"2baf6227-4885-49ec-a971-4601fc362ec0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " Apr 16 16:50:21.787984 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.787918 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2baf6227-4885-49ec-a971-4601fc362ec0-alertmanager-main-db\") pod \"2baf6227-4885-49ec-a971-4601fc362ec0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " Apr 16 16:50:21.787984 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.787943 2574 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-config-volume\") pod \"2baf6227-4885-49ec-a971-4601fc362ec0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " Apr 16 16:50:21.787984 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.787977 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2baf6227-4885-49ec-a971-4601fc362ec0-tls-assets\") pod \"2baf6227-4885-49ec-a971-4601fc362ec0\" (UID: \"2baf6227-4885-49ec-a971-4601fc362ec0\") " Apr 16 16:50:21.788165 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.788123 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2baf6227-4885-49ec-a971-4601fc362ec0-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "2baf6227-4885-49ec-a971-4601fc362ec0" (UID: "2baf6227-4885-49ec-a971-4601fc362ec0"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:50:21.788276 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.788253 2574 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2baf6227-4885-49ec-a971-4601fc362ec0-metrics-client-ca\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 16 16:50:21.788339 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.788289 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2baf6227-4885-49ec-a971-4601fc362ec0-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "2baf6227-4885-49ec-a971-4601fc362ec0" (UID: "2baf6227-4885-49ec-a971-4601fc362ec0"). InnerVolumeSpecName "alertmanager-main-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:50:21.788836 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.788679 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2baf6227-4885-49ec-a971-4601fc362ec0-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "2baf6227-4885-49ec-a971-4601fc362ec0" (UID: "2baf6227-4885-49ec-a971-4601fc362ec0"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:50:21.792540 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.792494 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2baf6227-4885-49ec-a971-4601fc362ec0-config-out" (OuterVolumeSpecName: "config-out") pod "2baf6227-4885-49ec-a971-4601fc362ec0" (UID: "2baf6227-4885-49ec-a971-4601fc362ec0"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:50:21.792540 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.792517 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2baf6227-4885-49ec-a971-4601fc362ec0-kube-api-access-nqssm" (OuterVolumeSpecName: "kube-api-access-nqssm") pod "2baf6227-4885-49ec-a971-4601fc362ec0" (UID: "2baf6227-4885-49ec-a971-4601fc362ec0"). InnerVolumeSpecName "kube-api-access-nqssm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:50:21.792540 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.792525 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-config-volume" (OuterVolumeSpecName: "config-volume") pod "2baf6227-4885-49ec-a971-4601fc362ec0" (UID: "2baf6227-4885-49ec-a971-4601fc362ec0"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:50:21.792925 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.792889 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "2baf6227-4885-49ec-a971-4601fc362ec0" (UID: "2baf6227-4885-49ec-a971-4601fc362ec0"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:50:21.793088 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.793064 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "2baf6227-4885-49ec-a971-4601fc362ec0" (UID: "2baf6227-4885-49ec-a971-4601fc362ec0"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:50:21.793194 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.793112 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "2baf6227-4885-49ec-a971-4601fc362ec0" (UID: "2baf6227-4885-49ec-a971-4601fc362ec0"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:50:21.793442 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.793419 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2baf6227-4885-49ec-a971-4601fc362ec0-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "2baf6227-4885-49ec-a971-4601fc362ec0" (UID: "2baf6227-4885-49ec-a971-4601fc362ec0"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:50:21.794357 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.794332 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "2baf6227-4885-49ec-a971-4601fc362ec0" (UID: "2baf6227-4885-49ec-a971-4601fc362ec0"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:50:21.796380 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.796229 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "2baf6227-4885-49ec-a971-4601fc362ec0" (UID: "2baf6227-4885-49ec-a971-4601fc362ec0"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:50:21.803424 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.803403 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-web-config" (OuterVolumeSpecName: "web-config") pod "2baf6227-4885-49ec-a971-4601fc362ec0" (UID: "2baf6227-4885-49ec-a971-4601fc362ec0"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:50:21.810615 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.810596 2574 scope.go:117] "RemoveContainer" containerID="0cf82074f1ff0b09f0dc6e58bf68fe362d3e78c848c42c434e509f67b496dc36" Apr 16 16:50:21.817831 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.817803 2574 scope.go:117] "RemoveContainer" containerID="327963c4f11c3522af19b95adc5ea71d10c4092d38e63ff0db5f493b6c345d88" Apr 16 16:50:21.818082 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:50:21.818056 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"327963c4f11c3522af19b95adc5ea71d10c4092d38e63ff0db5f493b6c345d88\": container with ID starting with 327963c4f11c3522af19b95adc5ea71d10c4092d38e63ff0db5f493b6c345d88 not found: ID does not exist" containerID="327963c4f11c3522af19b95adc5ea71d10c4092d38e63ff0db5f493b6c345d88" Apr 16 16:50:21.818141 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.818096 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"327963c4f11c3522af19b95adc5ea71d10c4092d38e63ff0db5f493b6c345d88"} err="failed to get container status \"327963c4f11c3522af19b95adc5ea71d10c4092d38e63ff0db5f493b6c345d88\": rpc error: code = NotFound desc = could not find container \"327963c4f11c3522af19b95adc5ea71d10c4092d38e63ff0db5f493b6c345d88\": container with ID starting with 327963c4f11c3522af19b95adc5ea71d10c4092d38e63ff0db5f493b6c345d88 not found: ID does not exist" Apr 16 16:50:21.818141 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.818125 2574 scope.go:117] "RemoveContainer" containerID="d9056ed942012456225c30e9619979ecc5b3be00639903d39816baf6ef78803c" Apr 16 16:50:21.818378 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:50:21.818359 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d9056ed942012456225c30e9619979ecc5b3be00639903d39816baf6ef78803c\": container with ID starting with d9056ed942012456225c30e9619979ecc5b3be00639903d39816baf6ef78803c not found: ID does not exist" containerID="d9056ed942012456225c30e9619979ecc5b3be00639903d39816baf6ef78803c" Apr 16 16:50:21.818446 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.818391 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9056ed942012456225c30e9619979ecc5b3be00639903d39816baf6ef78803c"} err="failed to get container status \"d9056ed942012456225c30e9619979ecc5b3be00639903d39816baf6ef78803c\": rpc error: code = NotFound desc = could not find container \"d9056ed942012456225c30e9619979ecc5b3be00639903d39816baf6ef78803c\": container with ID starting with d9056ed942012456225c30e9619979ecc5b3be00639903d39816baf6ef78803c not found: ID does not exist" Apr 16 16:50:21.818446 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.818409 2574 scope.go:117] "RemoveContainer" containerID="2359a923ef3b4ece9373f7c0a534ca740a886ac659c40aa6b5d4c7b4f3283ed2" Apr 16 16:50:21.818650 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:50:21.818628 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2359a923ef3b4ece9373f7c0a534ca740a886ac659c40aa6b5d4c7b4f3283ed2\": container with ID starting with 2359a923ef3b4ece9373f7c0a534ca740a886ac659c40aa6b5d4c7b4f3283ed2 not found: ID does not exist" containerID="2359a923ef3b4ece9373f7c0a534ca740a886ac659c40aa6b5d4c7b4f3283ed2" Apr 16 16:50:21.818724 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.818686 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2359a923ef3b4ece9373f7c0a534ca740a886ac659c40aa6b5d4c7b4f3283ed2"} err="failed to get container status \"2359a923ef3b4ece9373f7c0a534ca740a886ac659c40aa6b5d4c7b4f3283ed2\": rpc error: code = NotFound desc = could not find container 
\"2359a923ef3b4ece9373f7c0a534ca740a886ac659c40aa6b5d4c7b4f3283ed2\": container with ID starting with 2359a923ef3b4ece9373f7c0a534ca740a886ac659c40aa6b5d4c7b4f3283ed2 not found: ID does not exist" Apr 16 16:50:21.818724 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.818712 2574 scope.go:117] "RemoveContainer" containerID="ecfe4a1955e77f02e6775ac9b7087028d085a80477cc5481836ab81c2d5291f1" Apr 16 16:50:21.818951 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:50:21.818937 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecfe4a1955e77f02e6775ac9b7087028d085a80477cc5481836ab81c2d5291f1\": container with ID starting with ecfe4a1955e77f02e6775ac9b7087028d085a80477cc5481836ab81c2d5291f1 not found: ID does not exist" containerID="ecfe4a1955e77f02e6775ac9b7087028d085a80477cc5481836ab81c2d5291f1" Apr 16 16:50:21.818991 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.818954 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecfe4a1955e77f02e6775ac9b7087028d085a80477cc5481836ab81c2d5291f1"} err="failed to get container status \"ecfe4a1955e77f02e6775ac9b7087028d085a80477cc5481836ab81c2d5291f1\": rpc error: code = NotFound desc = could not find container \"ecfe4a1955e77f02e6775ac9b7087028d085a80477cc5481836ab81c2d5291f1\": container with ID starting with ecfe4a1955e77f02e6775ac9b7087028d085a80477cc5481836ab81c2d5291f1 not found: ID does not exist" Apr 16 16:50:21.818991 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.818967 2574 scope.go:117] "RemoveContainer" containerID="5410f25e0aeb0840b718dda47ca5d6699c76919ab2c6c99040bb4eba3be10dc0" Apr 16 16:50:21.819181 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:50:21.819166 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5410f25e0aeb0840b718dda47ca5d6699c76919ab2c6c99040bb4eba3be10dc0\": container with ID starting with 
5410f25e0aeb0840b718dda47ca5d6699c76919ab2c6c99040bb4eba3be10dc0 not found: ID does not exist" containerID="5410f25e0aeb0840b718dda47ca5d6699c76919ab2c6c99040bb4eba3be10dc0" Apr 16 16:50:21.819221 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.819184 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5410f25e0aeb0840b718dda47ca5d6699c76919ab2c6c99040bb4eba3be10dc0"} err="failed to get container status \"5410f25e0aeb0840b718dda47ca5d6699c76919ab2c6c99040bb4eba3be10dc0\": rpc error: code = NotFound desc = could not find container \"5410f25e0aeb0840b718dda47ca5d6699c76919ab2c6c99040bb4eba3be10dc0\": container with ID starting with 5410f25e0aeb0840b718dda47ca5d6699c76919ab2c6c99040bb4eba3be10dc0 not found: ID does not exist" Apr 16 16:50:21.819221 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.819197 2574 scope.go:117] "RemoveContainer" containerID="f1ac0b213d882e09b7b98ca3a67cadd9ea4fc54267fed79e168a0682ab2478cf" Apr 16 16:50:21.819423 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:50:21.819408 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1ac0b213d882e09b7b98ca3a67cadd9ea4fc54267fed79e168a0682ab2478cf\": container with ID starting with f1ac0b213d882e09b7b98ca3a67cadd9ea4fc54267fed79e168a0682ab2478cf not found: ID does not exist" containerID="f1ac0b213d882e09b7b98ca3a67cadd9ea4fc54267fed79e168a0682ab2478cf" Apr 16 16:50:21.819459 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.819429 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1ac0b213d882e09b7b98ca3a67cadd9ea4fc54267fed79e168a0682ab2478cf"} err="failed to get container status \"f1ac0b213d882e09b7b98ca3a67cadd9ea4fc54267fed79e168a0682ab2478cf\": rpc error: code = NotFound desc = could not find container \"f1ac0b213d882e09b7b98ca3a67cadd9ea4fc54267fed79e168a0682ab2478cf\": container with ID starting with 
f1ac0b213d882e09b7b98ca3a67cadd9ea4fc54267fed79e168a0682ab2478cf not found: ID does not exist" Apr 16 16:50:21.819459 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.819442 2574 scope.go:117] "RemoveContainer" containerID="0cf82074f1ff0b09f0dc6e58bf68fe362d3e78c848c42c434e509f67b496dc36" Apr 16 16:50:21.819723 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:50:21.819642 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cf82074f1ff0b09f0dc6e58bf68fe362d3e78c848c42c434e509f67b496dc36\": container with ID starting with 0cf82074f1ff0b09f0dc6e58bf68fe362d3e78c848c42c434e509f67b496dc36 not found: ID does not exist" containerID="0cf82074f1ff0b09f0dc6e58bf68fe362d3e78c848c42c434e509f67b496dc36" Apr 16 16:50:21.819820 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.819726 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cf82074f1ff0b09f0dc6e58bf68fe362d3e78c848c42c434e509f67b496dc36"} err="failed to get container status \"0cf82074f1ff0b09f0dc6e58bf68fe362d3e78c848c42c434e509f67b496dc36\": rpc error: code = NotFound desc = could not find container \"0cf82074f1ff0b09f0dc6e58bf68fe362d3e78c848c42c434e509f67b496dc36\": container with ID starting with 0cf82074f1ff0b09f0dc6e58bf68fe362d3e78c848c42c434e509f67b496dc36 not found: ID does not exist" Apr 16 16:50:21.888969 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.888889 2574 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-web-config\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 16 16:50:21.888969 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.888917 2574 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-secret-alertmanager-main-tls\") on node 
\"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 16 16:50:21.888969 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.888929 2574 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2baf6227-4885-49ec-a971-4601fc362ec0-alertmanager-main-db\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 16 16:50:21.888969 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.888938 2574 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-config-volume\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 16 16:50:21.888969 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.888949 2574 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2baf6227-4885-49ec-a971-4601fc362ec0-tls-assets\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 16 16:50:21.888969 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.888961 2574 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2baf6227-4885-49ec-a971-4601fc362ec0-config-out\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 16 16:50:21.889245 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.888976 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nqssm\" (UniqueName: \"kubernetes.io/projected/2baf6227-4885-49ec-a971-4601fc362ec0-kube-api-access-nqssm\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 16 16:50:21.889245 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.888986 2574 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-cluster-tls-config\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 16 16:50:21.889245 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.888996 
2574 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 16 16:50:21.889245 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.889005 2574 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 16 16:50:21.889245 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.889015 2574 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2baf6227-4885-49ec-a971-4601fc362ec0-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 16 16:50:21.889245 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:21.889024 2574 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2baf6227-4885-49ec-a971-4601fc362ec0-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 16 16:50:22.074492 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.074461 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 16:50:22.085270 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.085229 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 16:50:22.112835 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.112800 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 16:50:22.113412 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.113393 2574 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="2baf6227-4885-49ec-a971-4601fc362ec0" containerName="init-config-reloader" Apr 16 16:50:22.113531 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.113521 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="2baf6227-4885-49ec-a971-4601fc362ec0" containerName="init-config-reloader" Apr 16 16:50:22.113611 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.113602 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2baf6227-4885-49ec-a971-4601fc362ec0" containerName="alertmanager" Apr 16 16:50:22.113698 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.113689 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="2baf6227-4885-49ec-a971-4601fc362ec0" containerName="alertmanager" Apr 16 16:50:22.113782 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.113772 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2baf6227-4885-49ec-a971-4601fc362ec0" containerName="kube-rbac-proxy-metric" Apr 16 16:50:22.113855 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.113846 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="2baf6227-4885-49ec-a971-4601fc362ec0" containerName="kube-rbac-proxy-metric" Apr 16 16:50:22.113925 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.113917 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2baf6227-4885-49ec-a971-4601fc362ec0" containerName="kube-rbac-proxy-web" Apr 16 16:50:22.113990 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.113982 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="2baf6227-4885-49ec-a971-4601fc362ec0" containerName="kube-rbac-proxy-web" Apr 16 16:50:22.114070 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.114061 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2baf6227-4885-49ec-a971-4601fc362ec0" containerName="prom-label-proxy" Apr 16 16:50:22.114135 ip-10-0-143-10 
kubenswrapper[2574]: I0416 16:50:22.114127 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="2baf6227-4885-49ec-a971-4601fc362ec0" containerName="prom-label-proxy" Apr 16 16:50:22.114206 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.114198 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2baf6227-4885-49ec-a971-4601fc362ec0" containerName="config-reloader" Apr 16 16:50:22.114274 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.114266 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="2baf6227-4885-49ec-a971-4601fc362ec0" containerName="config-reloader" Apr 16 16:50:22.114340 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.114332 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2baf6227-4885-49ec-a971-4601fc362ec0" containerName="kube-rbac-proxy" Apr 16 16:50:22.114399 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.114391 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="2baf6227-4885-49ec-a971-4601fc362ec0" containerName="kube-rbac-proxy" Apr 16 16:50:22.114467 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.114459 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ecad2307-e422-4189-8656-d67ce0f7448b" containerName="console" Apr 16 16:50:22.114531 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.114523 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecad2307-e422-4189-8656-d67ce0f7448b" containerName="console" Apr 16 16:50:22.114689 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.114679 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="2baf6227-4885-49ec-a971-4601fc362ec0" containerName="kube-rbac-proxy" Apr 16 16:50:22.114785 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.114777 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="2baf6227-4885-49ec-a971-4601fc362ec0" containerName="prom-label-proxy" Apr 16 16:50:22.114858 ip-10-0-143-10 kubenswrapper[2574]: 
I0416 16:50:22.114850 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="2baf6227-4885-49ec-a971-4601fc362ec0" containerName="alertmanager" Apr 16 16:50:22.114927 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.114920 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="2baf6227-4885-49ec-a971-4601fc362ec0" containerName="kube-rbac-proxy-metric" Apr 16 16:50:22.114993 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.114985 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="2baf6227-4885-49ec-a971-4601fc362ec0" containerName="kube-rbac-proxy-web" Apr 16 16:50:22.115062 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.115054 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="2baf6227-4885-49ec-a971-4601fc362ec0" containerName="config-reloader" Apr 16 16:50:22.115127 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.115119 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="ecad2307-e422-4189-8656-d67ce0f7448b" containerName="console" Apr 16 16:50:22.121942 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.121925 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:22.125771 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.125497 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 16:50:22.125771 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.125544 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 16:50:22.125771 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.125603 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 16:50:22.125771 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.125498 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 16:50:22.125771 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.125499 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-rd9s4\"" Apr 16 16:50:22.126290 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.126273 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 16:50:22.126403 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.126282 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 16:50:22.126467 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.126327 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 16:50:22.127219 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.127199 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 16:50:22.127426 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.127405 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 16:50:22.131625 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.131604 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 16:50:22.190937 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.190868 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7-tls-assets\") pod \"alertmanager-main-0\" (UID: \"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:22.190937 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.190896 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7-config-out\") pod \"alertmanager-main-0\" (UID: \"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:22.190937 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.190915 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:22.190937 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.190933 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk2tx\" (UniqueName: 
\"kubernetes.io/projected/bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7-kube-api-access-rk2tx\") pod \"alertmanager-main-0\" (UID: \"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:22.191213 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.190949 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:22.191213 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.190964 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:22.191213 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.190979 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:22.191213 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.190994 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:22.191213 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.191014 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:22.191213 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.191031 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:22.191213 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.191125 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7-web-config\") pod \"alertmanager-main-0\" (UID: \"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:22.191213 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.191181 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:22.191213 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.191212 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/secret/bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7-config-volume\") pod \"alertmanager-main-0\" (UID: \"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:22.274029 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.274004 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2baf6227-4885-49ec-a971-4601fc362ec0" path="/var/lib/kubelet/pods/2baf6227-4885-49ec-a971-4601fc362ec0/volumes" Apr 16 16:50:22.291700 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.291681 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:22.291789 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.291708 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7-config-volume\") pod \"alertmanager-main-0\" (UID: \"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:22.291789 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.291762 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7-tls-assets\") pod \"alertmanager-main-0\" (UID: \"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:22.291852 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.291792 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7-config-out\") pod \"alertmanager-main-0\" (UID: 
\"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:22.291852 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.291815 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:22.291942 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.291852 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rk2tx\" (UniqueName: \"kubernetes.io/projected/bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7-kube-api-access-rk2tx\") pod \"alertmanager-main-0\" (UID: \"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:22.291942 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.291878 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:22.291942 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.291903 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:22.292399 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.292373 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" 
(UniqueName: \"kubernetes.io/configmap/bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:22.292567 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.292549 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:22.292727 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.292707 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:22.292880 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.292862 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:22.293024 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.293007 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7-web-config\") pod \"alertmanager-main-0\" (UID: \"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:22.293314 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.293285 
2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:22.294946 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.294639 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7-tls-assets\") pod \"alertmanager-main-0\" (UID: \"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:22.294946 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.294740 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:22.294946 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.294816 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7-config-volume\") pod \"alertmanager-main-0\" (UID: \"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:22.294946 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.294921 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:22.294946 ip-10-0-143-10 
kubenswrapper[2574]: I0416 16:50:22.294937 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:22.294946 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.294946 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7-config-out\") pod \"alertmanager-main-0\" (UID: \"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:22.295305 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.294987 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:22.295305 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.295205 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:22.295634 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.295611 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7-web-config\") pod \"alertmanager-main-0\" (UID: \"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:22.295885 ip-10-0-143-10 
kubenswrapper[2574]: I0416 16:50:22.295867 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:22.297160 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.297143 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:22.300396 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.300380 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk2tx\" (UniqueName: \"kubernetes.io/projected/bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7-kube-api-access-rk2tx\") pod \"alertmanager-main-0\" (UID: \"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:22.435134 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.435108 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:22.553409 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.553258 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 16:50:22.555688 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:50:22.555642 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc5dcfd6_23d2_4b36_8e6f_37b73cc6dcf7.slice/crio-ae22f26410e579352d5d416d130a22e49c4cba94d06596b025aae6ac47fbdf9b WatchSource:0}: Error finding container ae22f26410e579352d5d416d130a22e49c4cba94d06596b025aae6ac47fbdf9b: Status 404 returned error can't find the container with id ae22f26410e579352d5d416d130a22e49c4cba94d06596b025aae6ac47fbdf9b Apr 16 16:50:22.756622 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.756541 2574 generic.go:358] "Generic (PLEG): container finished" podID="bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7" containerID="0d3600daccb8562e48fb0ba76a29116e1b0617c4fbc721b8dbaa60b33ea6eae8" exitCode=0 Apr 16 16:50:22.757044 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.756726 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7","Type":"ContainerDied","Data":"0d3600daccb8562e48fb0ba76a29116e1b0617c4fbc721b8dbaa60b33ea6eae8"} Apr 16 16:50:22.757044 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:22.756768 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7","Type":"ContainerStarted","Data":"ae22f26410e579352d5d416d130a22e49c4cba94d06596b025aae6ac47fbdf9b"} Apr 16 16:50:23.764708 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:23.764677 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7","Type":"ContainerStarted","Data":"d025894d400439c1e34f85b657a2c594abfdcb7739c8bff51ab3182de27c2ccd"} Apr 16 16:50:23.764708 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:23.764712 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7","Type":"ContainerStarted","Data":"d823777b8b16ad91e1a7c34c6a059b1d345d3b03832f9f5e2587196e463fd09a"} Apr 16 16:50:23.765144 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:23.764722 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7","Type":"ContainerStarted","Data":"ebd4d3015ddec227a610d43eaf2c1a4a75d57309f568739a7804832780f2a714"} Apr 16 16:50:23.765144 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:23.764731 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7","Type":"ContainerStarted","Data":"d2438a4eb14869ca3afe1365b439efefd46034b1c13b8900742d840985be2d96"} Apr 16 16:50:23.765144 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:23.764741 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7","Type":"ContainerStarted","Data":"bc5dca2a6ec13c874f02dc57e597c9116157e01771496053e6644e5da796d814"} Apr 16 16:50:23.765144 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:23.764749 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7","Type":"ContainerStarted","Data":"3590c94e8fa0395a1dd9531a8250a5516a5c24e0c9d65c72fb357d591ea00e19"} Apr 16 16:50:23.799698 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:23.799089 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.799070062 podStartE2EDuration="1.799070062s" podCreationTimestamp="2026-04-16 16:50:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:50:23.795356215 +0000 UTC m=+106.078381793" watchObservedRunningTime="2026-04-16 16:50:23.799070062 +0000 UTC m=+106.082095641" Apr 16 16:50:24.792098 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:24.792056 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 16:50:24.793044 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:24.792978 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="197ed074-ec82-48e8-9e15-b998bc2e8e53" containerName="kube-rbac-proxy-thanos" containerID="cri-o://7773ed2eb8520f7cd9a4c26847b9905dc5b62d21dbf6abe5e4e4b8f692efc887" gracePeriod=600 Apr 16 16:50:24.793396 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:24.793371 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="197ed074-ec82-48e8-9e15-b998bc2e8e53" containerName="kube-rbac-proxy" containerID="cri-o://3a715a7f1f2f89fb0680edfcc0993b777dff04c2b038a06ee2a6ad4162a3cebf" gracePeriod=600 Apr 16 16:50:24.793913 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:24.793576 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="197ed074-ec82-48e8-9e15-b998bc2e8e53" containerName="thanos-sidecar" containerID="cri-o://0bf0b314a40a882dbf1476b0766478e8528e198dc29c08bd787a60bd945b31cf" gracePeriod=600 Apr 16 16:50:24.793913 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:24.793670 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" 
podUID="197ed074-ec82-48e8-9e15-b998bc2e8e53" containerName="kube-rbac-proxy-web" containerID="cri-o://2859051e617ad0483565c550d3ec7cddca22d6a24a3eda2e1792f2ac5189428f" gracePeriod=600 Apr 16 16:50:24.793913 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:24.793751 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="197ed074-ec82-48e8-9e15-b998bc2e8e53" containerName="config-reloader" containerID="cri-o://2b569a7eb68848ef627c35c9ce57f3021e090d691f68fc192a0b753b9ae2533f" gracePeriod=600 Apr 16 16:50:24.793913 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:24.793825 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="197ed074-ec82-48e8-9e15-b998bc2e8e53" containerName="prometheus" containerID="cri-o://59f458f77bee271a309522c90b8cf37c4fdb04878cbd21ac4f491d26f08d6bb5" gracePeriod=600 Apr 16 16:50:25.777017 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:25.776986 2574 generic.go:358] "Generic (PLEG): container finished" podID="197ed074-ec82-48e8-9e15-b998bc2e8e53" containerID="7773ed2eb8520f7cd9a4c26847b9905dc5b62d21dbf6abe5e4e4b8f692efc887" exitCode=0 Apr 16 16:50:25.777017 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:25.777010 2574 generic.go:358] "Generic (PLEG): container finished" podID="197ed074-ec82-48e8-9e15-b998bc2e8e53" containerID="3a715a7f1f2f89fb0680edfcc0993b777dff04c2b038a06ee2a6ad4162a3cebf" exitCode=0 Apr 16 16:50:25.777017 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:25.777016 2574 generic.go:358] "Generic (PLEG): container finished" podID="197ed074-ec82-48e8-9e15-b998bc2e8e53" containerID="0bf0b314a40a882dbf1476b0766478e8528e198dc29c08bd787a60bd945b31cf" exitCode=0 Apr 16 16:50:25.777017 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:25.777022 2574 generic.go:358] "Generic (PLEG): container finished" podID="197ed074-ec82-48e8-9e15-b998bc2e8e53" 
containerID="2b569a7eb68848ef627c35c9ce57f3021e090d691f68fc192a0b753b9ae2533f" exitCode=0 Apr 16 16:50:25.777017 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:25.777028 2574 generic.go:358] "Generic (PLEG): container finished" podID="197ed074-ec82-48e8-9e15-b998bc2e8e53" containerID="59f458f77bee271a309522c90b8cf37c4fdb04878cbd21ac4f491d26f08d6bb5" exitCode=0 Apr 16 16:50:25.777391 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:25.777042 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"197ed074-ec82-48e8-9e15-b998bc2e8e53","Type":"ContainerDied","Data":"7773ed2eb8520f7cd9a4c26847b9905dc5b62d21dbf6abe5e4e4b8f692efc887"} Apr 16 16:50:25.777391 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:25.777093 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"197ed074-ec82-48e8-9e15-b998bc2e8e53","Type":"ContainerDied","Data":"3a715a7f1f2f89fb0680edfcc0993b777dff04c2b038a06ee2a6ad4162a3cebf"} Apr 16 16:50:25.777391 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:25.777107 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"197ed074-ec82-48e8-9e15-b998bc2e8e53","Type":"ContainerDied","Data":"0bf0b314a40a882dbf1476b0766478e8528e198dc29c08bd787a60bd945b31cf"} Apr 16 16:50:25.777391 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:25.777119 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"197ed074-ec82-48e8-9e15-b998bc2e8e53","Type":"ContainerDied","Data":"2b569a7eb68848ef627c35c9ce57f3021e090d691f68fc192a0b753b9ae2533f"} Apr 16 16:50:25.777391 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:25.777131 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"197ed074-ec82-48e8-9e15-b998bc2e8e53","Type":"ContainerDied","Data":"59f458f77bee271a309522c90b8cf37c4fdb04878cbd21ac4f491d26f08d6bb5"} Apr 16 16:50:26.059069 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.059046 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:26.134699 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.133946 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/197ed074-ec82-48e8-9e15-b998bc2e8e53-config-out\") pod \"197ed074-ec82-48e8-9e15-b998bc2e8e53\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " Apr 16 16:50:26.134699 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.134003 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/197ed074-ec82-48e8-9e15-b998bc2e8e53-prometheus-k8s-rulefiles-0\") pod \"197ed074-ec82-48e8-9e15-b998bc2e8e53\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " Apr 16 16:50:26.134699 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.134050 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/197ed074-ec82-48e8-9e15-b998bc2e8e53-tls-assets\") pod \"197ed074-ec82-48e8-9e15-b998bc2e8e53\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " Apr 16 16:50:26.134699 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.134090 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"197ed074-ec82-48e8-9e15-b998bc2e8e53\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " Apr 16 16:50:26.134699 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.134119 2574 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/197ed074-ec82-48e8-9e15-b998bc2e8e53-configmap-kubelet-serving-ca-bundle\") pod \"197ed074-ec82-48e8-9e15-b998bc2e8e53\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " Apr 16 16:50:26.134699 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.134161 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-secret-metrics-client-certs\") pod \"197ed074-ec82-48e8-9e15-b998bc2e8e53\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " Apr 16 16:50:26.134699 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.134192 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-web-config\") pod \"197ed074-ec82-48e8-9e15-b998bc2e8e53\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " Apr 16 16:50:26.134699 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.134225 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-thanos-prometheus-http-client-file\") pod \"197ed074-ec82-48e8-9e15-b998bc2e8e53\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " Apr 16 16:50:26.134699 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.134256 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/197ed074-ec82-48e8-9e15-b998bc2e8e53-prometheus-k8s-db\") pod \"197ed074-ec82-48e8-9e15-b998bc2e8e53\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " Apr 16 16:50:26.134699 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.134292 2574 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-config\") pod \"197ed074-ec82-48e8-9e15-b998bc2e8e53\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " Apr 16 16:50:26.134699 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.134337 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-secret-prometheus-k8s-tls\") pod \"197ed074-ec82-48e8-9e15-b998bc2e8e53\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " Apr 16 16:50:26.134699 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.134370 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9spjt\" (UniqueName: \"kubernetes.io/projected/197ed074-ec82-48e8-9e15-b998bc2e8e53-kube-api-access-9spjt\") pod \"197ed074-ec82-48e8-9e15-b998bc2e8e53\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " Apr 16 16:50:26.134699 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.134416 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"197ed074-ec82-48e8-9e15-b998bc2e8e53\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " Apr 16 16:50:26.134699 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.134451 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/197ed074-ec82-48e8-9e15-b998bc2e8e53-configmap-serving-certs-ca-bundle\") pod \"197ed074-ec82-48e8-9e15-b998bc2e8e53\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " Apr 16 16:50:26.134699 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.134492 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume 
started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-secret-grpc-tls\") pod \"197ed074-ec82-48e8-9e15-b998bc2e8e53\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " Apr 16 16:50:26.134699 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.134518 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/197ed074-ec82-48e8-9e15-b998bc2e8e53-configmap-metrics-client-ca\") pod \"197ed074-ec82-48e8-9e15-b998bc2e8e53\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " Apr 16 16:50:26.134699 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.134549 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-secret-kube-rbac-proxy\") pod \"197ed074-ec82-48e8-9e15-b998bc2e8e53\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " Apr 16 16:50:26.134699 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.134577 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/197ed074-ec82-48e8-9e15-b998bc2e8e53-prometheus-trusted-ca-bundle\") pod \"197ed074-ec82-48e8-9e15-b998bc2e8e53\" (UID: \"197ed074-ec82-48e8-9e15-b998bc2e8e53\") " Apr 16 16:50:26.137749 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.136017 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/197ed074-ec82-48e8-9e15-b998bc2e8e53-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "197ed074-ec82-48e8-9e15-b998bc2e8e53" (UID: "197ed074-ec82-48e8-9e15-b998bc2e8e53"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:50:26.137749 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.136781 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/197ed074-ec82-48e8-9e15-b998bc2e8e53-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "197ed074-ec82-48e8-9e15-b998bc2e8e53" (UID: "197ed074-ec82-48e8-9e15-b998bc2e8e53"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:50:26.137749 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.137118 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/197ed074-ec82-48e8-9e15-b998bc2e8e53-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "197ed074-ec82-48e8-9e15-b998bc2e8e53" (UID: "197ed074-ec82-48e8-9e15-b998bc2e8e53"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:50:26.138967 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.138063 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/197ed074-ec82-48e8-9e15-b998bc2e8e53-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "197ed074-ec82-48e8-9e15-b998bc2e8e53" (UID: "197ed074-ec82-48e8-9e15-b998bc2e8e53"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:50:26.138967 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.138071 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/197ed074-ec82-48e8-9e15-b998bc2e8e53-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "197ed074-ec82-48e8-9e15-b998bc2e8e53" (UID: "197ed074-ec82-48e8-9e15-b998bc2e8e53"). 
InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:50:26.140366 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.140316 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-config" (OuterVolumeSpecName: "config") pod "197ed074-ec82-48e8-9e15-b998bc2e8e53" (UID: "197ed074-ec82-48e8-9e15-b998bc2e8e53"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:50:26.142155 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.142123 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/197ed074-ec82-48e8-9e15-b998bc2e8e53-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "197ed074-ec82-48e8-9e15-b998bc2e8e53" (UID: "197ed074-ec82-48e8-9e15-b998bc2e8e53"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:50:26.145479 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.145443 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "197ed074-ec82-48e8-9e15-b998bc2e8e53" (UID: "197ed074-ec82-48e8-9e15-b998bc2e8e53"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:50:26.147395 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.147359 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/197ed074-ec82-48e8-9e15-b998bc2e8e53-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "197ed074-ec82-48e8-9e15-b998bc2e8e53" (UID: "197ed074-ec82-48e8-9e15-b998bc2e8e53"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:50:26.147715 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.147676 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "197ed074-ec82-48e8-9e15-b998bc2e8e53" (UID: "197ed074-ec82-48e8-9e15-b998bc2e8e53"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:50:26.147936 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.147909 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "197ed074-ec82-48e8-9e15-b998bc2e8e53" (UID: "197ed074-ec82-48e8-9e15-b998bc2e8e53"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:50:26.148036 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.147930 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "197ed074-ec82-48e8-9e15-b998bc2e8e53" (UID: "197ed074-ec82-48e8-9e15-b998bc2e8e53"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:50:26.148036 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.147951 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "197ed074-ec82-48e8-9e15-b998bc2e8e53" (UID: "197ed074-ec82-48e8-9e15-b998bc2e8e53"). InnerVolumeSpecName "secret-prometheus-k8s-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:50:26.148036 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.147991 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "197ed074-ec82-48e8-9e15-b998bc2e8e53" (UID: "197ed074-ec82-48e8-9e15-b998bc2e8e53"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:50:26.148209 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.148167 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/197ed074-ec82-48e8-9e15-b998bc2e8e53-config-out" (OuterVolumeSpecName: "config-out") pod "197ed074-ec82-48e8-9e15-b998bc2e8e53" (UID: "197ed074-ec82-48e8-9e15-b998bc2e8e53"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:50:26.148940 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.148907 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/197ed074-ec82-48e8-9e15-b998bc2e8e53-kube-api-access-9spjt" (OuterVolumeSpecName: "kube-api-access-9spjt") pod "197ed074-ec82-48e8-9e15-b998bc2e8e53" (UID: "197ed074-ec82-48e8-9e15-b998bc2e8e53"). InnerVolumeSpecName "kube-api-access-9spjt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:50:26.149555 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.149526 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "197ed074-ec82-48e8-9e15-b998bc2e8e53" (UID: "197ed074-ec82-48e8-9e15-b998bc2e8e53"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:50:26.160518 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.160496 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-web-config" (OuterVolumeSpecName: "web-config") pod "197ed074-ec82-48e8-9e15-b998bc2e8e53" (UID: "197ed074-ec82-48e8-9e15-b998bc2e8e53"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:50:26.236296 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.236257 2574 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-config\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 16 16:50:26.236483 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.236303 2574 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-secret-prometheus-k8s-tls\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 16 16:50:26.236483 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.236323 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9spjt\" (UniqueName: \"kubernetes.io/projected/197ed074-ec82-48e8-9e15-b998bc2e8e53-kube-api-access-9spjt\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 16 16:50:26.236483 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.236342 2574 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 16 16:50:26.236483 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.236360 2574 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/197ed074-ec82-48e8-9e15-b998bc2e8e53-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 16 16:50:26.236483 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.236378 2574 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-secret-grpc-tls\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 16 16:50:26.236483 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.236396 2574 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/197ed074-ec82-48e8-9e15-b998bc2e8e53-configmap-metrics-client-ca\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 16 16:50:26.236483 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.236412 2574 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-secret-kube-rbac-proxy\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 16 16:50:26.236483 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.236428 2574 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/197ed074-ec82-48e8-9e15-b998bc2e8e53-prometheus-trusted-ca-bundle\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 16 16:50:26.236483 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.236442 2574 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/197ed074-ec82-48e8-9e15-b998bc2e8e53-config-out\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 16 16:50:26.236483 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.236458 2574 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/197ed074-ec82-48e8-9e15-b998bc2e8e53-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 16 16:50:26.236483 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.236473 2574 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/197ed074-ec82-48e8-9e15-b998bc2e8e53-tls-assets\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 16 16:50:26.236483 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.236488 2574 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 16 16:50:26.237136 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.236504 2574 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/197ed074-ec82-48e8-9e15-b998bc2e8e53-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 16 16:50:26.237136 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.236520 2574 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-secret-metrics-client-certs\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 16 16:50:26.237136 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.236537 2574 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-web-config\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 16 16:50:26.237136 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.236552 2574 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/197ed074-ec82-48e8-9e15-b998bc2e8e53-thanos-prometheus-http-client-file\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 16 16:50:26.237136 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.236566 2574 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/197ed074-ec82-48e8-9e15-b998bc2e8e53-prometheus-k8s-db\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 16 16:50:26.783587 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.783553 2574 generic.go:358] "Generic (PLEG): container finished" podID="197ed074-ec82-48e8-9e15-b998bc2e8e53" containerID="2859051e617ad0483565c550d3ec7cddca22d6a24a3eda2e1792f2ac5189428f" exitCode=0 Apr 16 16:50:26.783777 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.783631 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"197ed074-ec82-48e8-9e15-b998bc2e8e53","Type":"ContainerDied","Data":"2859051e617ad0483565c550d3ec7cddca22d6a24a3eda2e1792f2ac5189428f"} Apr 16 16:50:26.783777 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.783663 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:26.783777 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.783684 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"197ed074-ec82-48e8-9e15-b998bc2e8e53","Type":"ContainerDied","Data":"10e122c116ea0c8478ae1b2000ea1608f196e51b5abe9e5400b501b81047fa5b"} Apr 16 16:50:26.783777 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.783702 2574 scope.go:117] "RemoveContainer" containerID="7773ed2eb8520f7cd9a4c26847b9905dc5b62d21dbf6abe5e4e4b8f692efc887" Apr 16 16:50:26.790795 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.790779 2574 scope.go:117] "RemoveContainer" containerID="3a715a7f1f2f89fb0680edfcc0993b777dff04c2b038a06ee2a6ad4162a3cebf" Apr 16 16:50:26.797665 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.797640 2574 scope.go:117] "RemoveContainer" containerID="2859051e617ad0483565c550d3ec7cddca22d6a24a3eda2e1792f2ac5189428f" Apr 16 16:50:26.803864 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.803842 2574 scope.go:117] "RemoveContainer" containerID="0bf0b314a40a882dbf1476b0766478e8528e198dc29c08bd787a60bd945b31cf" Apr 16 16:50:26.805711 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.805692 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 16:50:26.809478 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.809460 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 16:50:26.810825 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.810810 2574 scope.go:117] "RemoveContainer" containerID="2b569a7eb68848ef627c35c9ce57f3021e090d691f68fc192a0b753b9ae2533f" Apr 16 16:50:26.818790 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.818775 2574 scope.go:117] "RemoveContainer" containerID="59f458f77bee271a309522c90b8cf37c4fdb04878cbd21ac4f491d26f08d6bb5" Apr 16 16:50:26.825239 ip-10-0-143-10 
kubenswrapper[2574]: I0416 16:50:26.825221 2574 scope.go:117] "RemoveContainer" containerID="9ae52ad2e3f5c5663dcd066a71080fac2564d53bd9347a0b84b3aeea77870c8a" Apr 16 16:50:26.831475 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.831407 2574 scope.go:117] "RemoveContainer" containerID="7773ed2eb8520f7cd9a4c26847b9905dc5b62d21dbf6abe5e4e4b8f692efc887" Apr 16 16:50:26.831759 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:50:26.831735 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7773ed2eb8520f7cd9a4c26847b9905dc5b62d21dbf6abe5e4e4b8f692efc887\": container with ID starting with 7773ed2eb8520f7cd9a4c26847b9905dc5b62d21dbf6abe5e4e4b8f692efc887 not found: ID does not exist" containerID="7773ed2eb8520f7cd9a4c26847b9905dc5b62d21dbf6abe5e4e4b8f692efc887" Apr 16 16:50:26.831846 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.831772 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7773ed2eb8520f7cd9a4c26847b9905dc5b62d21dbf6abe5e4e4b8f692efc887"} err="failed to get container status \"7773ed2eb8520f7cd9a4c26847b9905dc5b62d21dbf6abe5e4e4b8f692efc887\": rpc error: code = NotFound desc = could not find container \"7773ed2eb8520f7cd9a4c26847b9905dc5b62d21dbf6abe5e4e4b8f692efc887\": container with ID starting with 7773ed2eb8520f7cd9a4c26847b9905dc5b62d21dbf6abe5e4e4b8f692efc887 not found: ID does not exist" Apr 16 16:50:26.831846 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.831796 2574 scope.go:117] "RemoveContainer" containerID="3a715a7f1f2f89fb0680edfcc0993b777dff04c2b038a06ee2a6ad4162a3cebf" Apr 16 16:50:26.832062 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:50:26.832012 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a715a7f1f2f89fb0680edfcc0993b777dff04c2b038a06ee2a6ad4162a3cebf\": container with ID starting with 
3a715a7f1f2f89fb0680edfcc0993b777dff04c2b038a06ee2a6ad4162a3cebf not found: ID does not exist" containerID="3a715a7f1f2f89fb0680edfcc0993b777dff04c2b038a06ee2a6ad4162a3cebf" Apr 16 16:50:26.832127 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.832068 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a715a7f1f2f89fb0680edfcc0993b777dff04c2b038a06ee2a6ad4162a3cebf"} err="failed to get container status \"3a715a7f1f2f89fb0680edfcc0993b777dff04c2b038a06ee2a6ad4162a3cebf\": rpc error: code = NotFound desc = could not find container \"3a715a7f1f2f89fb0680edfcc0993b777dff04c2b038a06ee2a6ad4162a3cebf\": container with ID starting with 3a715a7f1f2f89fb0680edfcc0993b777dff04c2b038a06ee2a6ad4162a3cebf not found: ID does not exist" Apr 16 16:50:26.832127 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.832082 2574 scope.go:117] "RemoveContainer" containerID="2859051e617ad0483565c550d3ec7cddca22d6a24a3eda2e1792f2ac5189428f" Apr 16 16:50:26.832327 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:50:26.832303 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2859051e617ad0483565c550d3ec7cddca22d6a24a3eda2e1792f2ac5189428f\": container with ID starting with 2859051e617ad0483565c550d3ec7cddca22d6a24a3eda2e1792f2ac5189428f not found: ID does not exist" containerID="2859051e617ad0483565c550d3ec7cddca22d6a24a3eda2e1792f2ac5189428f" Apr 16 16:50:26.832429 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.832328 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2859051e617ad0483565c550d3ec7cddca22d6a24a3eda2e1792f2ac5189428f"} err="failed to get container status \"2859051e617ad0483565c550d3ec7cddca22d6a24a3eda2e1792f2ac5189428f\": rpc error: code = NotFound desc = could not find container \"2859051e617ad0483565c550d3ec7cddca22d6a24a3eda2e1792f2ac5189428f\": container with ID starting with 
2859051e617ad0483565c550d3ec7cddca22d6a24a3eda2e1792f2ac5189428f not found: ID does not exist" Apr 16 16:50:26.832429 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.832345 2574 scope.go:117] "RemoveContainer" containerID="0bf0b314a40a882dbf1476b0766478e8528e198dc29c08bd787a60bd945b31cf" Apr 16 16:50:26.832580 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:50:26.832565 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bf0b314a40a882dbf1476b0766478e8528e198dc29c08bd787a60bd945b31cf\": container with ID starting with 0bf0b314a40a882dbf1476b0766478e8528e198dc29c08bd787a60bd945b31cf not found: ID does not exist" containerID="0bf0b314a40a882dbf1476b0766478e8528e198dc29c08bd787a60bd945b31cf" Apr 16 16:50:26.832649 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.832583 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bf0b314a40a882dbf1476b0766478e8528e198dc29c08bd787a60bd945b31cf"} err="failed to get container status \"0bf0b314a40a882dbf1476b0766478e8528e198dc29c08bd787a60bd945b31cf\": rpc error: code = NotFound desc = could not find container \"0bf0b314a40a882dbf1476b0766478e8528e198dc29c08bd787a60bd945b31cf\": container with ID starting with 0bf0b314a40a882dbf1476b0766478e8528e198dc29c08bd787a60bd945b31cf not found: ID does not exist" Apr 16 16:50:26.832649 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.832597 2574 scope.go:117] "RemoveContainer" containerID="2b569a7eb68848ef627c35c9ce57f3021e090d691f68fc192a0b753b9ae2533f" Apr 16 16:50:26.832858 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:50:26.832840 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b569a7eb68848ef627c35c9ce57f3021e090d691f68fc192a0b753b9ae2533f\": container with ID starting with 2b569a7eb68848ef627c35c9ce57f3021e090d691f68fc192a0b753b9ae2533f not found: ID does not exist" 
containerID="2b569a7eb68848ef627c35c9ce57f3021e090d691f68fc192a0b753b9ae2533f" Apr 16 16:50:26.832924 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.832862 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b569a7eb68848ef627c35c9ce57f3021e090d691f68fc192a0b753b9ae2533f"} err="failed to get container status \"2b569a7eb68848ef627c35c9ce57f3021e090d691f68fc192a0b753b9ae2533f\": rpc error: code = NotFound desc = could not find container \"2b569a7eb68848ef627c35c9ce57f3021e090d691f68fc192a0b753b9ae2533f\": container with ID starting with 2b569a7eb68848ef627c35c9ce57f3021e090d691f68fc192a0b753b9ae2533f not found: ID does not exist" Apr 16 16:50:26.832924 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.832880 2574 scope.go:117] "RemoveContainer" containerID="59f458f77bee271a309522c90b8cf37c4fdb04878cbd21ac4f491d26f08d6bb5" Apr 16 16:50:26.833173 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:50:26.833158 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59f458f77bee271a309522c90b8cf37c4fdb04878cbd21ac4f491d26f08d6bb5\": container with ID starting with 59f458f77bee271a309522c90b8cf37c4fdb04878cbd21ac4f491d26f08d6bb5 not found: ID does not exist" containerID="59f458f77bee271a309522c90b8cf37c4fdb04878cbd21ac4f491d26f08d6bb5" Apr 16 16:50:26.833236 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.833175 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59f458f77bee271a309522c90b8cf37c4fdb04878cbd21ac4f491d26f08d6bb5"} err="failed to get container status \"59f458f77bee271a309522c90b8cf37c4fdb04878cbd21ac4f491d26f08d6bb5\": rpc error: code = NotFound desc = could not find container \"59f458f77bee271a309522c90b8cf37c4fdb04878cbd21ac4f491d26f08d6bb5\": container with ID starting with 59f458f77bee271a309522c90b8cf37c4fdb04878cbd21ac4f491d26f08d6bb5 not found: ID does not exist" Apr 16 
16:50:26.833236 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.833188 2574 scope.go:117] "RemoveContainer" containerID="9ae52ad2e3f5c5663dcd066a71080fac2564d53bd9347a0b84b3aeea77870c8a" Apr 16 16:50:26.833347 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.833328 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 16:50:26.833417 ip-10-0-143-10 kubenswrapper[2574]: E0416 16:50:26.833402 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ae52ad2e3f5c5663dcd066a71080fac2564d53bd9347a0b84b3aeea77870c8a\": container with ID starting with 9ae52ad2e3f5c5663dcd066a71080fac2564d53bd9347a0b84b3aeea77870c8a not found: ID does not exist" containerID="9ae52ad2e3f5c5663dcd066a71080fac2564d53bd9347a0b84b3aeea77870c8a" Apr 16 16:50:26.833461 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.833423 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ae52ad2e3f5c5663dcd066a71080fac2564d53bd9347a0b84b3aeea77870c8a"} err="failed to get container status \"9ae52ad2e3f5c5663dcd066a71080fac2564d53bd9347a0b84b3aeea77870c8a\": rpc error: code = NotFound desc = could not find container \"9ae52ad2e3f5c5663dcd066a71080fac2564d53bd9347a0b84b3aeea77870c8a\": container with ID starting with 9ae52ad2e3f5c5663dcd066a71080fac2564d53bd9347a0b84b3aeea77870c8a not found: ID does not exist" Apr 16 16:50:26.833619 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.833607 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="197ed074-ec82-48e8-9e15-b998bc2e8e53" containerName="thanos-sidecar" Apr 16 16:50:26.833680 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.833621 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="197ed074-ec82-48e8-9e15-b998bc2e8e53" containerName="thanos-sidecar" Apr 16 16:50:26.833680 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.833634 2574 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="197ed074-ec82-48e8-9e15-b998bc2e8e53" containerName="kube-rbac-proxy-thanos" Apr 16 16:50:26.833680 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.833639 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="197ed074-ec82-48e8-9e15-b998bc2e8e53" containerName="kube-rbac-proxy-thanos" Apr 16 16:50:26.833680 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.833648 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="197ed074-ec82-48e8-9e15-b998bc2e8e53" containerName="prometheus" Apr 16 16:50:26.833821 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.833686 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="197ed074-ec82-48e8-9e15-b998bc2e8e53" containerName="prometheus" Apr 16 16:50:26.833821 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.833697 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="197ed074-ec82-48e8-9e15-b998bc2e8e53" containerName="kube-rbac-proxy" Apr 16 16:50:26.833821 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.833702 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="197ed074-ec82-48e8-9e15-b998bc2e8e53" containerName="kube-rbac-proxy" Apr 16 16:50:26.833821 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.833709 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="197ed074-ec82-48e8-9e15-b998bc2e8e53" containerName="kube-rbac-proxy-web" Apr 16 16:50:26.833821 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.833715 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="197ed074-ec82-48e8-9e15-b998bc2e8e53" containerName="kube-rbac-proxy-web" Apr 16 16:50:26.833821 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.833724 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="197ed074-ec82-48e8-9e15-b998bc2e8e53" containerName="config-reloader" Apr 16 16:50:26.833821 ip-10-0-143-10 
kubenswrapper[2574]: I0416 16:50:26.833729 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="197ed074-ec82-48e8-9e15-b998bc2e8e53" containerName="config-reloader" Apr 16 16:50:26.833821 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.833736 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="197ed074-ec82-48e8-9e15-b998bc2e8e53" containerName="init-config-reloader" Apr 16 16:50:26.833821 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.833741 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="197ed074-ec82-48e8-9e15-b998bc2e8e53" containerName="init-config-reloader" Apr 16 16:50:26.833821 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.833783 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="197ed074-ec82-48e8-9e15-b998bc2e8e53" containerName="config-reloader" Apr 16 16:50:26.833821 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.833806 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="197ed074-ec82-48e8-9e15-b998bc2e8e53" containerName="kube-rbac-proxy-web" Apr 16 16:50:26.833821 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.833812 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="197ed074-ec82-48e8-9e15-b998bc2e8e53" containerName="prometheus" Apr 16 16:50:26.833821 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.833819 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="197ed074-ec82-48e8-9e15-b998bc2e8e53" containerName="kube-rbac-proxy" Apr 16 16:50:26.833821 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.833825 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="197ed074-ec82-48e8-9e15-b998bc2e8e53" containerName="thanos-sidecar" Apr 16 16:50:26.834255 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.833838 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="197ed074-ec82-48e8-9e15-b998bc2e8e53" containerName="kube-rbac-proxy-thanos" Apr 16 16:50:26.846804 ip-10-0-143-10 
kubenswrapper[2574]: I0416 16:50:26.846784 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:26.849475 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.849449 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 16:50:26.850137 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.849968 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 16:50:26.850137 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.850001 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 16:50:26.850137 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.850012 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 16:50:26.850137 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.850023 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 16:50:26.850137 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.850001 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-jnhsj\"" Apr 16 16:50:26.850414 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.850203 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 16:50:26.850414 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.850220 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 16:50:26.850414 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.850237 2574 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 16:50:26.850414 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.850306 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-5cjngfjhoetgr\"" Apr 16 16:50:26.850414 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.850241 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 16:50:26.850414 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.850345 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 16:50:26.850698 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.850441 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 16:50:26.853577 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.853558 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 16:50:26.856017 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.855998 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 16:50:26.944102 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.944071 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e8eacc20-f29a-4f10-81f6-b883b281b28e-web-config\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:26.944252 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.944113 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e8eacc20-f29a-4f10-81f6-b883b281b28e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:26.944252 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.944133 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e8eacc20-f29a-4f10-81f6-b883b281b28e-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:26.944252 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.944160 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8eacc20-f29a-4f10-81f6-b883b281b28e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:26.944252 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.944193 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8eacc20-f29a-4f10-81f6-b883b281b28e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:26.944252 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.944228 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e8eacc20-f29a-4f10-81f6-b883b281b28e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:26.944405 
ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.944265 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e8eacc20-f29a-4f10-81f6-b883b281b28e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:26.944405 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.944288 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e8eacc20-f29a-4f10-81f6-b883b281b28e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:26.944405 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.944311 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e8eacc20-f29a-4f10-81f6-b883b281b28e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:26.944405 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.944338 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e8eacc20-f29a-4f10-81f6-b883b281b28e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:26.944405 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.944358 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/e8eacc20-f29a-4f10-81f6-b883b281b28e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:26.944405 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.944377 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e8eacc20-f29a-4f10-81f6-b883b281b28e-config-out\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:26.944571 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.944435 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e8eacc20-f29a-4f10-81f6-b883b281b28e-config\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:26.944571 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.944454 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pft7n\" (UniqueName: \"kubernetes.io/projected/e8eacc20-f29a-4f10-81f6-b883b281b28e-kube-api-access-pft7n\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:26.944571 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.944476 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e8eacc20-f29a-4f10-81f6-b883b281b28e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:26.944571 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.944524 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e8eacc20-f29a-4f10-81f6-b883b281b28e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:26.944571 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.944553 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8eacc20-f29a-4f10-81f6-b883b281b28e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:26.944742 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:26.944597 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e8eacc20-f29a-4f10-81f6-b883b281b28e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:27.045537 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:27.045435 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8eacc20-f29a-4f10-81f6-b883b281b28e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:27.045537 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:27.045495 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e8eacc20-f29a-4f10-81f6-b883b281b28e-tls-assets\") pod \"prometheus-k8s-0\" (UID: 
\"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:27.045537 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:27.045530 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e8eacc20-f29a-4f10-81f6-b883b281b28e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:27.045857 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:27.045564 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e8eacc20-f29a-4f10-81f6-b883b281b28e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:27.045857 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:27.045586 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e8eacc20-f29a-4f10-81f6-b883b281b28e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:27.045857 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:27.045614 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e8eacc20-f29a-4f10-81f6-b883b281b28e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:27.045857 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:27.045642 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/e8eacc20-f29a-4f10-81f6-b883b281b28e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:27.045857 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:27.045707 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e8eacc20-f29a-4f10-81f6-b883b281b28e-config-out\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:27.045857 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:27.045740 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e8eacc20-f29a-4f10-81f6-b883b281b28e-config\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:27.045857 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:27.045763 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pft7n\" (UniqueName: \"kubernetes.io/projected/e8eacc20-f29a-4f10-81f6-b883b281b28e-kube-api-access-pft7n\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:27.045857 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:27.045796 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e8eacc20-f29a-4f10-81f6-b883b281b28e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:27.045857 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:27.045824 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/e8eacc20-f29a-4f10-81f6-b883b281b28e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:27.045857 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:27.045851 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8eacc20-f29a-4f10-81f6-b883b281b28e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:27.046323 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:27.045891 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e8eacc20-f29a-4f10-81f6-b883b281b28e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:27.046323 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:27.045924 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e8eacc20-f29a-4f10-81f6-b883b281b28e-web-config\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:27.046323 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:27.045973 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e8eacc20-f29a-4f10-81f6-b883b281b28e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:27.046323 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:27.046000 2574 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e8eacc20-f29a-4f10-81f6-b883b281b28e-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:27.046323 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:27.046025 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8eacc20-f29a-4f10-81f6-b883b281b28e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:27.046567 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:27.046323 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8eacc20-f29a-4f10-81f6-b883b281b28e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:27.046949 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:27.046712 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8eacc20-f29a-4f10-81f6-b883b281b28e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:27.047058 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:27.046960 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e8eacc20-f29a-4f10-81f6-b883b281b28e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:27.048607 
ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:27.048582 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e8eacc20-f29a-4f10-81f6-b883b281b28e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:27.048870 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:27.048831 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e8eacc20-f29a-4f10-81f6-b883b281b28e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:27.049933 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:27.049234 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e8eacc20-f29a-4f10-81f6-b883b281b28e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:27.049933 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:27.049410 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e8eacc20-f29a-4f10-81f6-b883b281b28e-web-config\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:27.049933 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:27.049450 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e8eacc20-f29a-4f10-81f6-b883b281b28e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:27.049933 ip-10-0-143-10 
kubenswrapper[2574]: I0416 16:50:27.049524 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8eacc20-f29a-4f10-81f6-b883b281b28e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:27.049933 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:27.049528 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e8eacc20-f29a-4f10-81f6-b883b281b28e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:27.049933 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:27.049607 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e8eacc20-f29a-4f10-81f6-b883b281b28e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:27.049933 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:27.049829 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e8eacc20-f29a-4f10-81f6-b883b281b28e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:27.050387 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:27.050212 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e8eacc20-f29a-4f10-81f6-b883b281b28e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:27.051119 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:27.051100 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e8eacc20-f29a-4f10-81f6-b883b281b28e-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:27.051416 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:27.051398 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e8eacc20-f29a-4f10-81f6-b883b281b28e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:27.051623 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:27.051610 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e8eacc20-f29a-4f10-81f6-b883b281b28e-config\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:27.051758 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:27.051742 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e8eacc20-f29a-4f10-81f6-b883b281b28e-config-out\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:27.059297 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:27.059279 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pft7n\" (UniqueName: \"kubernetes.io/projected/e8eacc20-f29a-4f10-81f6-b883b281b28e-kube-api-access-pft7n\") pod \"prometheus-k8s-0\" (UID: \"e8eacc20-f29a-4f10-81f6-b883b281b28e\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:27.157861 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:27.157814 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:27.277375 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:27.277344 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 16:50:27.280090 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:50:27.280060 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8eacc20_f29a_4f10_81f6_b883b281b28e.slice/crio-afd35c1be9f9b74bb2428aeb40a07e548f80350699b9678ff14cf8d298822fee WatchSource:0}: Error finding container afd35c1be9f9b74bb2428aeb40a07e548f80350699b9678ff14cf8d298822fee: Status 404 returned error can't find the container with id afd35c1be9f9b74bb2428aeb40a07e548f80350699b9678ff14cf8d298822fee Apr 16 16:50:27.788609 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:27.788577 2574 generic.go:358] "Generic (PLEG): container finished" podID="e8eacc20-f29a-4f10-81f6-b883b281b28e" containerID="74f305868f53169b3b7dec9f19f76b1c40c365a347659b701c8ab9614f999721" exitCode=0 Apr 16 16:50:27.788779 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:27.788614 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e8eacc20-f29a-4f10-81f6-b883b281b28e","Type":"ContainerDied","Data":"74f305868f53169b3b7dec9f19f76b1c40c365a347659b701c8ab9614f999721"} Apr 16 16:50:27.788779 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:27.788632 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e8eacc20-f29a-4f10-81f6-b883b281b28e","Type":"ContainerStarted","Data":"afd35c1be9f9b74bb2428aeb40a07e548f80350699b9678ff14cf8d298822fee"} Apr 16 16:50:28.279091 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:28.279052 
2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="197ed074-ec82-48e8-9e15-b998bc2e8e53" path="/var/lib/kubelet/pods/197ed074-ec82-48e8-9e15-b998bc2e8e53/volumes" Apr 16 16:50:28.793921 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:28.793888 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e8eacc20-f29a-4f10-81f6-b883b281b28e","Type":"ContainerStarted","Data":"914c7eb719f5e6186cc1691b5f6b26fcec393ecd0d1c5b845d26035121687b9e"} Apr 16 16:50:28.793921 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:28.793923 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e8eacc20-f29a-4f10-81f6-b883b281b28e","Type":"ContainerStarted","Data":"0944dbfec460ce07edd051fa1707bb1efeee4de240ad3f3d9529d5019dad1301"} Apr 16 16:50:28.794120 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:28.793932 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e8eacc20-f29a-4f10-81f6-b883b281b28e","Type":"ContainerStarted","Data":"ecc7f474b37b4664c512f224f726ce6cb122270a7bab5fb125ba9b32874714f9"} Apr 16 16:50:28.794120 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:28.793940 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e8eacc20-f29a-4f10-81f6-b883b281b28e","Type":"ContainerStarted","Data":"dfc07d9f754fc73c085391dc26ce6a0d55a7066a9a759720a63a890b83466a32"} Apr 16 16:50:28.794120 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:28.793948 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e8eacc20-f29a-4f10-81f6-b883b281b28e","Type":"ContainerStarted","Data":"8ba96625e098de563c7f09133f8483229dd2e18d751194d617e0a72c1ef4a17d"} Apr 16 16:50:28.794120 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:28.793958 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e8eacc20-f29a-4f10-81f6-b883b281b28e","Type":"ContainerStarted","Data":"2d0e6b6c440f492e07899b20fef7ce262c3e390bc09dd0cb2fa232ffca105d34"} Apr 16 16:50:28.827843 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:28.827801 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.8277876109999998 podStartE2EDuration="2.827787611s" podCreationTimestamp="2026-04-16 16:50:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:50:28.822141941 +0000 UTC m=+111.105167534" watchObservedRunningTime="2026-04-16 16:50:28.827787611 +0000 UTC m=+111.110813188" Apr 16 16:50:32.159039 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:32.159005 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:54.948051 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:54.948013 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-f8hkt"] Apr 16 16:50:54.953492 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:54.953471 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-f8hkt" Apr 16 16:50:54.956403 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:54.956387 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 16:50:54.959102 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:54.959077 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-f8hkt"] Apr 16 16:50:55.092316 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:55.092279 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/68f9da11-93da-44fa-ac64-d1075ee67259-kubelet-config\") pod \"global-pull-secret-syncer-f8hkt\" (UID: \"68f9da11-93da-44fa-ac64-d1075ee67259\") " pod="kube-system/global-pull-secret-syncer-f8hkt" Apr 16 16:50:55.092485 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:55.092356 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/68f9da11-93da-44fa-ac64-d1075ee67259-original-pull-secret\") pod \"global-pull-secret-syncer-f8hkt\" (UID: \"68f9da11-93da-44fa-ac64-d1075ee67259\") " pod="kube-system/global-pull-secret-syncer-f8hkt" Apr 16 16:50:55.092485 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:55.092389 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/68f9da11-93da-44fa-ac64-d1075ee67259-dbus\") pod \"global-pull-secret-syncer-f8hkt\" (UID: \"68f9da11-93da-44fa-ac64-d1075ee67259\") " pod="kube-system/global-pull-secret-syncer-f8hkt" Apr 16 16:50:55.193341 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:55.193309 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/68f9da11-93da-44fa-ac64-d1075ee67259-dbus\") pod \"global-pull-secret-syncer-f8hkt\" (UID: \"68f9da11-93da-44fa-ac64-d1075ee67259\") " pod="kube-system/global-pull-secret-syncer-f8hkt" Apr 16 16:50:55.193341 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:55.193344 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/68f9da11-93da-44fa-ac64-d1075ee67259-kubelet-config\") pod \"global-pull-secret-syncer-f8hkt\" (UID: \"68f9da11-93da-44fa-ac64-d1075ee67259\") " pod="kube-system/global-pull-secret-syncer-f8hkt" Apr 16 16:50:55.193512 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:55.193424 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/68f9da11-93da-44fa-ac64-d1075ee67259-original-pull-secret\") pod \"global-pull-secret-syncer-f8hkt\" (UID: \"68f9da11-93da-44fa-ac64-d1075ee67259\") " pod="kube-system/global-pull-secret-syncer-f8hkt" Apr 16 16:50:55.193512 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:55.193498 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/68f9da11-93da-44fa-ac64-d1075ee67259-dbus\") pod \"global-pull-secret-syncer-f8hkt\" (UID: \"68f9da11-93da-44fa-ac64-d1075ee67259\") " pod="kube-system/global-pull-secret-syncer-f8hkt" Apr 16 16:50:55.193573 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:55.193503 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/68f9da11-93da-44fa-ac64-d1075ee67259-kubelet-config\") pod \"global-pull-secret-syncer-f8hkt\" (UID: \"68f9da11-93da-44fa-ac64-d1075ee67259\") " pod="kube-system/global-pull-secret-syncer-f8hkt" Apr 16 16:50:55.195681 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:55.195648 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/68f9da11-93da-44fa-ac64-d1075ee67259-original-pull-secret\") pod \"global-pull-secret-syncer-f8hkt\" (UID: \"68f9da11-93da-44fa-ac64-d1075ee67259\") " pod="kube-system/global-pull-secret-syncer-f8hkt" Apr 16 16:50:55.263029 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:55.262935 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f8hkt" Apr 16 16:50:55.376545 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:55.376522 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-f8hkt"] Apr 16 16:50:55.378395 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:50:55.378369 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68f9da11_93da_44fa_ac64_d1075ee67259.slice/crio-6596ca1bab5aa10353e9486f1a0c1c15dc4aeca810d38b6c85ddc71042a43c21 WatchSource:0}: Error finding container 6596ca1bab5aa10353e9486f1a0c1c15dc4aeca810d38b6c85ddc71042a43c21: Status 404 returned error can't find the container with id 6596ca1bab5aa10353e9486f1a0c1c15dc4aeca810d38b6c85ddc71042a43c21 Apr 16 16:50:55.874621 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:55.874574 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-f8hkt" event={"ID":"68f9da11-93da-44fa-ac64-d1075ee67259","Type":"ContainerStarted","Data":"6596ca1bab5aa10353e9486f1a0c1c15dc4aeca810d38b6c85ddc71042a43c21"} Apr 16 16:50:59.887346 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:59.887308 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-f8hkt" event={"ID":"68f9da11-93da-44fa-ac64-d1075ee67259","Type":"ContainerStarted","Data":"ecbd6fb5b08f6f1cbc3b6320a135a398c9be1d2038339e082af4ab0b4ae6969c"} Apr 16 16:50:59.902920 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:50:59.902865 2574 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-f8hkt" podStartSLOduration=2.199179305 podStartE2EDuration="5.902851806s" podCreationTimestamp="2026-04-16 16:50:54 +0000 UTC" firstStartedPulling="2026-04-16 16:50:55.379936006 +0000 UTC m=+137.662961562" lastFinishedPulling="2026-04-16 16:50:59.083608494 +0000 UTC m=+141.366634063" observedRunningTime="2026-04-16 16:50:59.90225933 +0000 UTC m=+142.185284908" watchObservedRunningTime="2026-04-16 16:50:59.902851806 +0000 UTC m=+142.185877383" Apr 16 16:51:27.158406 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:51:27.158330 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:51:27.173457 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:51:27.173431 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:51:27.987397 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:51:27.987370 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:52.019379 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:52:52.019344 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-xb542"] Apr 16 16:52:52.021848 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:52:52.021831 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-xb542" Apr 16 16:52:52.024689 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:52:52.024649 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 16 16:52:52.026155 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:52:52.026132 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 16 16:52:52.026268 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:52:52.026170 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-xh9pl\"" Apr 16 16:52:52.029496 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:52:52.029475 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-xb542"] Apr 16 16:52:52.061734 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:52:52.061710 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7f42e787-d329-4068-befb-0f62052872ef-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-xb542\" (UID: \"7f42e787-d329-4068-befb-0f62052872ef\") " pod="cert-manager/cert-manager-webhook-597b96b99b-xb542" Apr 16 16:52:52.061872 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:52:52.061755 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsbpt\" (UniqueName: \"kubernetes.io/projected/7f42e787-d329-4068-befb-0f62052872ef-kube-api-access-wsbpt\") pod \"cert-manager-webhook-597b96b99b-xb542\" (UID: \"7f42e787-d329-4068-befb-0f62052872ef\") " pod="cert-manager/cert-manager-webhook-597b96b99b-xb542" Apr 16 16:52:52.162289 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:52:52.162263 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wsbpt\" 
(UniqueName: \"kubernetes.io/projected/7f42e787-d329-4068-befb-0f62052872ef-kube-api-access-wsbpt\") pod \"cert-manager-webhook-597b96b99b-xb542\" (UID: \"7f42e787-d329-4068-befb-0f62052872ef\") " pod="cert-manager/cert-manager-webhook-597b96b99b-xb542" Apr 16 16:52:52.162451 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:52:52.162333 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7f42e787-d329-4068-befb-0f62052872ef-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-xb542\" (UID: \"7f42e787-d329-4068-befb-0f62052872ef\") " pod="cert-manager/cert-manager-webhook-597b96b99b-xb542" Apr 16 16:52:52.170677 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:52:52.170625 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7f42e787-d329-4068-befb-0f62052872ef-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-xb542\" (UID: \"7f42e787-d329-4068-befb-0f62052872ef\") " pod="cert-manager/cert-manager-webhook-597b96b99b-xb542" Apr 16 16:52:52.170783 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:52:52.170712 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsbpt\" (UniqueName: \"kubernetes.io/projected/7f42e787-d329-4068-befb-0f62052872ef-kube-api-access-wsbpt\") pod \"cert-manager-webhook-597b96b99b-xb542\" (UID: \"7f42e787-d329-4068-befb-0f62052872ef\") " pod="cert-manager/cert-manager-webhook-597b96b99b-xb542" Apr 16 16:52:52.345808 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:52:52.345777 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-xb542" Apr 16 16:52:52.480351 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:52:52.480324 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-xb542"] Apr 16 16:52:52.483297 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:52:52.483266 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f42e787_d329_4068_befb_0f62052872ef.slice/crio-78af09515dc6509fa5bc8bb934f25913da9735462f23fe6bb92b3b81a3e6a552 WatchSource:0}: Error finding container 78af09515dc6509fa5bc8bb934f25913da9735462f23fe6bb92b3b81a3e6a552: Status 404 returned error can't find the container with id 78af09515dc6509fa5bc8bb934f25913da9735462f23fe6bb92b3b81a3e6a552 Apr 16 16:52:53.039799 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:52:53.039768 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-dd6qr"] Apr 16 16:52:53.044430 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:52:53.044410 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-dd6qr" Apr 16 16:52:53.047473 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:52:53.047450 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-74sk9\"" Apr 16 16:52:53.050713 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:52:53.050690 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-dd6qr"] Apr 16 16:52:53.073826 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:52:53.073803 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c963a7ac-f65f-45e9-b9a6-6f2c20abe082-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-dd6qr\" (UID: \"c963a7ac-f65f-45e9-b9a6-6f2c20abe082\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-dd6qr" Apr 16 16:52:53.073958 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:52:53.073836 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dm7d\" (UniqueName: \"kubernetes.io/projected/c963a7ac-f65f-45e9-b9a6-6f2c20abe082-kube-api-access-8dm7d\") pod \"cert-manager-cainjector-8966b78d4-dd6qr\" (UID: \"c963a7ac-f65f-45e9-b9a6-6f2c20abe082\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-dd6qr" Apr 16 16:52:53.175200 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:52:53.175170 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c963a7ac-f65f-45e9-b9a6-6f2c20abe082-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-dd6qr\" (UID: \"c963a7ac-f65f-45e9-b9a6-6f2c20abe082\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-dd6qr" Apr 16 16:52:53.175200 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:52:53.175201 2574 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kube-api-access-8dm7d\" (UniqueName: \"kubernetes.io/projected/c963a7ac-f65f-45e9-b9a6-6f2c20abe082-kube-api-access-8dm7d\") pod \"cert-manager-cainjector-8966b78d4-dd6qr\" (UID: \"c963a7ac-f65f-45e9-b9a6-6f2c20abe082\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-dd6qr" Apr 16 16:52:53.183327 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:52:53.183297 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c963a7ac-f65f-45e9-b9a6-6f2c20abe082-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-dd6qr\" (UID: \"c963a7ac-f65f-45e9-b9a6-6f2c20abe082\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-dd6qr" Apr 16 16:52:53.183448 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:52:53.183430 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dm7d\" (UniqueName: \"kubernetes.io/projected/c963a7ac-f65f-45e9-b9a6-6f2c20abe082-kube-api-access-8dm7d\") pod \"cert-manager-cainjector-8966b78d4-dd6qr\" (UID: \"c963a7ac-f65f-45e9-b9a6-6f2c20abe082\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-dd6qr" Apr 16 16:52:53.209858 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:52:53.209832 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-xb542" event={"ID":"7f42e787-d329-4068-befb-0f62052872ef","Type":"ContainerStarted","Data":"78af09515dc6509fa5bc8bb934f25913da9735462f23fe6bb92b3b81a3e6a552"} Apr 16 16:52:53.354446 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:52:53.354411 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-dd6qr" Apr 16 16:52:53.498741 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:52:53.498684 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-dd6qr"] Apr 16 16:52:53.502416 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:52:53.502007 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc963a7ac_f65f_45e9_b9a6_6f2c20abe082.slice/crio-d070a007375d6dc80b59960bf413722c0f2987161cdcb5811e47cad21544e3b7 WatchSource:0}: Error finding container d070a007375d6dc80b59960bf413722c0f2987161cdcb5811e47cad21544e3b7: Status 404 returned error can't find the container with id d070a007375d6dc80b59960bf413722c0f2987161cdcb5811e47cad21544e3b7 Apr 16 16:52:54.214901 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:52:54.214866 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-dd6qr" event={"ID":"c963a7ac-f65f-45e9-b9a6-6f2c20abe082","Type":"ContainerStarted","Data":"d070a007375d6dc80b59960bf413722c0f2987161cdcb5811e47cad21544e3b7"} Apr 16 16:52:56.222123 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:52:56.222023 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-dd6qr" event={"ID":"c963a7ac-f65f-45e9-b9a6-6f2c20abe082","Type":"ContainerStarted","Data":"53f4fe51d88c34172c53dcd05a69f9f33ce0d9e84b5aeda0b1fc17eddac9d678"} Apr 16 16:52:56.223360 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:52:56.223334 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-xb542" event={"ID":"7f42e787-d329-4068-befb-0f62052872ef","Type":"ContainerStarted","Data":"e8004252c0c9dbaf4a3a4649a32172bf4ccc5fed026c08362e67042865d16649"} Apr 16 16:52:56.223470 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:52:56.223457 2574 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-xb542" Apr 16 16:52:56.237986 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:52:56.237940 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-dd6qr" podStartSLOduration=0.787565938 podStartE2EDuration="3.237927193s" podCreationTimestamp="2026-04-16 16:52:53 +0000 UTC" firstStartedPulling="2026-04-16 16:52:53.504421855 +0000 UTC m=+255.787447410" lastFinishedPulling="2026-04-16 16:52:55.954783107 +0000 UTC m=+258.237808665" observedRunningTime="2026-04-16 16:52:56.236993278 +0000 UTC m=+258.520018856" watchObservedRunningTime="2026-04-16 16:52:56.237927193 +0000 UTC m=+258.520952770" Apr 16 16:52:56.254119 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:52:56.254070 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-xb542" podStartSLOduration=0.785073769 podStartE2EDuration="4.254055027s" podCreationTimestamp="2026-04-16 16:52:52 +0000 UTC" firstStartedPulling="2026-04-16 16:52:52.485371675 +0000 UTC m=+254.768397237" lastFinishedPulling="2026-04-16 16:52:55.954352921 +0000 UTC m=+258.237378495" observedRunningTime="2026-04-16 16:52:56.251846569 +0000 UTC m=+258.534872164" watchObservedRunningTime="2026-04-16 16:52:56.254055027 +0000 UTC m=+258.537080606" Apr 16 16:53:02.228735 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:02.228701 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-xb542" Apr 16 16:53:04.478388 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:04.478354 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-rhszf"] Apr 16 16:53:04.481095 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:04.481075 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-rhszf" Apr 16 16:53:04.483834 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:04.483814 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-74lq6\"" Apr 16 16:53:04.488834 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:04.488812 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-rhszf"] Apr 16 16:53:04.576267 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:04.576236 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5ebb4a1f-bef9-446d-b9b0-fb66cadcad11-bound-sa-token\") pod \"cert-manager-759f64656b-rhszf\" (UID: \"5ebb4a1f-bef9-446d-b9b0-fb66cadcad11\") " pod="cert-manager/cert-manager-759f64656b-rhszf" Apr 16 16:53:04.576418 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:04.576272 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbfhd\" (UniqueName: \"kubernetes.io/projected/5ebb4a1f-bef9-446d-b9b0-fb66cadcad11-kube-api-access-pbfhd\") pod \"cert-manager-759f64656b-rhszf\" (UID: \"5ebb4a1f-bef9-446d-b9b0-fb66cadcad11\") " pod="cert-manager/cert-manager-759f64656b-rhszf" Apr 16 16:53:04.676774 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:04.676737 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5ebb4a1f-bef9-446d-b9b0-fb66cadcad11-bound-sa-token\") pod \"cert-manager-759f64656b-rhszf\" (UID: \"5ebb4a1f-bef9-446d-b9b0-fb66cadcad11\") " pod="cert-manager/cert-manager-759f64656b-rhszf" Apr 16 16:53:04.677004 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:04.676985 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pbfhd\" (UniqueName: 
\"kubernetes.io/projected/5ebb4a1f-bef9-446d-b9b0-fb66cadcad11-kube-api-access-pbfhd\") pod \"cert-manager-759f64656b-rhszf\" (UID: \"5ebb4a1f-bef9-446d-b9b0-fb66cadcad11\") " pod="cert-manager/cert-manager-759f64656b-rhszf" Apr 16 16:53:04.685384 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:04.685354 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5ebb4a1f-bef9-446d-b9b0-fb66cadcad11-bound-sa-token\") pod \"cert-manager-759f64656b-rhszf\" (UID: \"5ebb4a1f-bef9-446d-b9b0-fb66cadcad11\") " pod="cert-manager/cert-manager-759f64656b-rhszf" Apr 16 16:53:04.685754 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:04.685736 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbfhd\" (UniqueName: \"kubernetes.io/projected/5ebb4a1f-bef9-446d-b9b0-fb66cadcad11-kube-api-access-pbfhd\") pod \"cert-manager-759f64656b-rhszf\" (UID: \"5ebb4a1f-bef9-446d-b9b0-fb66cadcad11\") " pod="cert-manager/cert-manager-759f64656b-rhszf" Apr 16 16:53:04.790426 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:04.790347 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-rhszf" Apr 16 16:53:04.908058 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:04.908037 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-rhszf"] Apr 16 16:53:04.910151 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:53:04.910126 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ebb4a1f_bef9_446d_b9b0_fb66cadcad11.slice/crio-fcd0c92024495de398fa12592f79fe9ea284c433293db970e1c9b57e88eb32b0 WatchSource:0}: Error finding container fcd0c92024495de398fa12592f79fe9ea284c433293db970e1c9b57e88eb32b0: Status 404 returned error can't find the container with id fcd0c92024495de398fa12592f79fe9ea284c433293db970e1c9b57e88eb32b0 Apr 16 16:53:05.249559 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:05.249525 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-rhszf" event={"ID":"5ebb4a1f-bef9-446d-b9b0-fb66cadcad11","Type":"ContainerStarted","Data":"9c4f52527486a368827afc23ac556efa87d2fa49efe8cf5fd425f5ad0723bdc1"} Apr 16 16:53:05.249559 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:05.249560 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-rhszf" event={"ID":"5ebb4a1f-bef9-446d-b9b0-fb66cadcad11","Type":"ContainerStarted","Data":"fcd0c92024495de398fa12592f79fe9ea284c433293db970e1c9b57e88eb32b0"} Apr 16 16:53:05.267231 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:05.267182 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-rhszf" podStartSLOduration=1.2671670320000001 podStartE2EDuration="1.267167032s" podCreationTimestamp="2026-04-16 16:53:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:53:05.266390927 +0000 UTC 
m=+267.549416504" watchObservedRunningTime="2026-04-16 16:53:05.267167032 +0000 UTC m=+267.550192609" Apr 16 16:53:38.153257 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:38.153225 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2fwqq_a3db0e84-d73d-4e5b-a7c2-94290b442748/ovn-acl-logging/0.log" Apr 16 16:53:38.153943 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:38.153925 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2fwqq_a3db0e84-d73d-4e5b-a7c2-94290b442748/ovn-acl-logging/0.log" Apr 16 16:53:38.160446 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:38.160429 2574 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 16:53:41.057916 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:41.057878 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-65f5d85b79-d446b"] Apr 16 16:53:41.060309 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:41.060295 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-65f5d85b79-d446b" Apr 16 16:53:41.063482 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:41.063462 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 16 16:53:41.063689 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:41.063675 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 16 16:53:41.065217 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:41.065197 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 16 16:53:41.065325 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:41.065280 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 16 16:53:41.065325 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:41.065299 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:53:41.065438 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:41.065329 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-cj4jz\"" Apr 16 16:53:41.079097 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:41.079075 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-65f5d85b79-d446b"] Apr 16 16:53:41.080685 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:41.080650 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c0b40c6-60f1-4cea-9c1d-ac6b2c78a3ee-cert\") pod \"lws-controller-manager-65f5d85b79-d446b\" (UID: \"1c0b40c6-60f1-4cea-9c1d-ac6b2c78a3ee\") " 
pod="openshift-lws-operator/lws-controller-manager-65f5d85b79-d446b" Apr 16 16:53:41.080771 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:41.080758 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/1c0b40c6-60f1-4cea-9c1d-ac6b2c78a3ee-metrics-cert\") pod \"lws-controller-manager-65f5d85b79-d446b\" (UID: \"1c0b40c6-60f1-4cea-9c1d-ac6b2c78a3ee\") " pod="openshift-lws-operator/lws-controller-manager-65f5d85b79-d446b" Apr 16 16:53:41.080813 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:41.080785 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/1c0b40c6-60f1-4cea-9c1d-ac6b2c78a3ee-manager-config\") pod \"lws-controller-manager-65f5d85b79-d446b\" (UID: \"1c0b40c6-60f1-4cea-9c1d-ac6b2c78a3ee\") " pod="openshift-lws-operator/lws-controller-manager-65f5d85b79-d446b" Apr 16 16:53:41.080849 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:41.080824 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8xm8\" (UniqueName: \"kubernetes.io/projected/1c0b40c6-60f1-4cea-9c1d-ac6b2c78a3ee-kube-api-access-h8xm8\") pod \"lws-controller-manager-65f5d85b79-d446b\" (UID: \"1c0b40c6-60f1-4cea-9c1d-ac6b2c78a3ee\") " pod="openshift-lws-operator/lws-controller-manager-65f5d85b79-d446b" Apr 16 16:53:41.181496 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:41.181461 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/1c0b40c6-60f1-4cea-9c1d-ac6b2c78a3ee-manager-config\") pod \"lws-controller-manager-65f5d85b79-d446b\" (UID: \"1c0b40c6-60f1-4cea-9c1d-ac6b2c78a3ee\") " pod="openshift-lws-operator/lws-controller-manager-65f5d85b79-d446b" Apr 16 16:53:41.181646 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:41.181533 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h8xm8\" (UniqueName: \"kubernetes.io/projected/1c0b40c6-60f1-4cea-9c1d-ac6b2c78a3ee-kube-api-access-h8xm8\") pod \"lws-controller-manager-65f5d85b79-d446b\" (UID: \"1c0b40c6-60f1-4cea-9c1d-ac6b2c78a3ee\") " pod="openshift-lws-operator/lws-controller-manager-65f5d85b79-d446b" Apr 16 16:53:41.181646 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:41.181583 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c0b40c6-60f1-4cea-9c1d-ac6b2c78a3ee-cert\") pod \"lws-controller-manager-65f5d85b79-d446b\" (UID: \"1c0b40c6-60f1-4cea-9c1d-ac6b2c78a3ee\") " pod="openshift-lws-operator/lws-controller-manager-65f5d85b79-d446b" Apr 16 16:53:41.181785 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:41.181646 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/1c0b40c6-60f1-4cea-9c1d-ac6b2c78a3ee-metrics-cert\") pod \"lws-controller-manager-65f5d85b79-d446b\" (UID: \"1c0b40c6-60f1-4cea-9c1d-ac6b2c78a3ee\") " pod="openshift-lws-operator/lws-controller-manager-65f5d85b79-d446b" Apr 16 16:53:41.182142 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:41.182113 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/1c0b40c6-60f1-4cea-9c1d-ac6b2c78a3ee-manager-config\") pod \"lws-controller-manager-65f5d85b79-d446b\" (UID: \"1c0b40c6-60f1-4cea-9c1d-ac6b2c78a3ee\") " pod="openshift-lws-operator/lws-controller-manager-65f5d85b79-d446b" Apr 16 16:53:41.184178 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:41.184152 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/1c0b40c6-60f1-4cea-9c1d-ac6b2c78a3ee-metrics-cert\") pod \"lws-controller-manager-65f5d85b79-d446b\" (UID: 
\"1c0b40c6-60f1-4cea-9c1d-ac6b2c78a3ee\") " pod="openshift-lws-operator/lws-controller-manager-65f5d85b79-d446b" Apr 16 16:53:41.184276 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:41.184158 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c0b40c6-60f1-4cea-9c1d-ac6b2c78a3ee-cert\") pod \"lws-controller-manager-65f5d85b79-d446b\" (UID: \"1c0b40c6-60f1-4cea-9c1d-ac6b2c78a3ee\") " pod="openshift-lws-operator/lws-controller-manager-65f5d85b79-d446b" Apr 16 16:53:41.193289 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:41.193266 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8xm8\" (UniqueName: \"kubernetes.io/projected/1c0b40c6-60f1-4cea-9c1d-ac6b2c78a3ee-kube-api-access-h8xm8\") pod \"lws-controller-manager-65f5d85b79-d446b\" (UID: \"1c0b40c6-60f1-4cea-9c1d-ac6b2c78a3ee\") " pod="openshift-lws-operator/lws-controller-manager-65f5d85b79-d446b" Apr 16 16:53:41.369497 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:41.369416 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-65f5d85b79-d446b" Apr 16 16:53:41.486761 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:41.486740 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-65f5d85b79-d446b"] Apr 16 16:53:41.489351 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:53:41.489325 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c0b40c6_60f1_4cea_9c1d_ac6b2c78a3ee.slice/crio-38db2555eb5e3fa54bb4ee2db07795d4e5cadb792661e0b683b2e9fffc7a7987 WatchSource:0}: Error finding container 38db2555eb5e3fa54bb4ee2db07795d4e5cadb792661e0b683b2e9fffc7a7987: Status 404 returned error can't find the container with id 38db2555eb5e3fa54bb4ee2db07795d4e5cadb792661e0b683b2e9fffc7a7987 Apr 16 16:53:41.491214 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:41.491193 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:53:42.350462 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:42.350404 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-65f5d85b79-d446b" event={"ID":"1c0b40c6-60f1-4cea-9c1d-ac6b2c78a3ee","Type":"ContainerStarted","Data":"38db2555eb5e3fa54bb4ee2db07795d4e5cadb792661e0b683b2e9fffc7a7987"} Apr 16 16:53:44.360894 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:44.360862 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-65f5d85b79-d446b" event={"ID":"1c0b40c6-60f1-4cea-9c1d-ac6b2c78a3ee","Type":"ContainerStarted","Data":"68fd0dac29f2b1fef3d7c876796f32aacad8c204c2d842be44da28818f558b0e"} Apr 16 16:53:44.361295 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:44.360974 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-65f5d85b79-d446b" Apr 16 
16:53:44.379318 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:44.379271 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-65f5d85b79-d446b" podStartSLOduration=1.128513068 podStartE2EDuration="3.379257401s" podCreationTimestamp="2026-04-16 16:53:41 +0000 UTC" firstStartedPulling="2026-04-16 16:53:41.491331107 +0000 UTC m=+303.774356661" lastFinishedPulling="2026-04-16 16:53:43.742075439 +0000 UTC m=+306.025100994" observedRunningTime="2026-04-16 16:53:44.377151212 +0000 UTC m=+306.660176889" watchObservedRunningTime="2026-04-16 16:53:44.379257401 +0000 UTC m=+306.662282978" Apr 16 16:53:55.366878 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:53:55.366849 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-65f5d85b79-d446b" Apr 16 16:54:18.661814 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:18.661781 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-cwpbt"] Apr 16 16:54:18.668328 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:18.668303 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-cwpbt" Apr 16 16:54:18.671400 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:18.671371 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-mmkr9\"" Apr 16 16:54:18.671546 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:18.671371 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 16:54:18.671546 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:18.671373 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 16:54:18.675883 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:18.675846 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-cwpbt"] Apr 16 16:54:18.808193 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:18.808161 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stxgx\" (UniqueName: \"kubernetes.io/projected/f01a3ef9-1e51-4dec-bc39-b24e1f17f018-kube-api-access-stxgx\") pod \"limitador-operator-controller-manager-c7fb4c8d5-cwpbt\" (UID: \"f01a3ef9-1e51-4dec-bc39-b24e1f17f018\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-cwpbt" Apr 16 16:54:18.909588 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:18.909556 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stxgx\" (UniqueName: \"kubernetes.io/projected/f01a3ef9-1e51-4dec-bc39-b24e1f17f018-kube-api-access-stxgx\") pod \"limitador-operator-controller-manager-c7fb4c8d5-cwpbt\" (UID: \"f01a3ef9-1e51-4dec-bc39-b24e1f17f018\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-cwpbt" Apr 16 16:54:18.927088 ip-10-0-143-10 kubenswrapper[2574]: 
I0416 16:54:18.927027 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-stxgx\" (UniqueName: \"kubernetes.io/projected/f01a3ef9-1e51-4dec-bc39-b24e1f17f018-kube-api-access-stxgx\") pod \"limitador-operator-controller-manager-c7fb4c8d5-cwpbt\" (UID: \"f01a3ef9-1e51-4dec-bc39-b24e1f17f018\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-cwpbt" Apr 16 16:54:18.979867 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:18.979840 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-cwpbt" Apr 16 16:54:19.125828 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:19.125767 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-cwpbt"] Apr 16 16:54:19.128072 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:54:19.128045 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf01a3ef9_1e51_4dec_bc39_b24e1f17f018.slice/crio-aad2157b9842a836ae12a89e37be0879a6b9354a75a8f72b7775d3cc80d8c55a WatchSource:0}: Error finding container aad2157b9842a836ae12a89e37be0879a6b9354a75a8f72b7775d3cc80d8c55a: Status 404 returned error can't find the container with id aad2157b9842a836ae12a89e37be0879a6b9354a75a8f72b7775d3cc80d8c55a Apr 16 16:54:19.464985 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:19.464953 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-cwpbt" event={"ID":"f01a3ef9-1e51-4dec-bc39-b24e1f17f018","Type":"ContainerStarted","Data":"aad2157b9842a836ae12a89e37be0879a6b9354a75a8f72b7775d3cc80d8c55a"} Apr 16 16:54:22.475068 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:22.475034 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-cwpbt" 
event={"ID":"f01a3ef9-1e51-4dec-bc39-b24e1f17f018","Type":"ContainerStarted","Data":"d3d9f4f5a2dda844b82122400074912130e9e3c390e4766144d7c3c940ea68b7"} Apr 16 16:54:22.475452 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:22.475189 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-cwpbt" Apr 16 16:54:22.492433 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:22.492388 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-cwpbt" podStartSLOduration=2.052689093 podStartE2EDuration="4.492375577s" podCreationTimestamp="2026-04-16 16:54:18 +0000 UTC" firstStartedPulling="2026-04-16 16:54:19.130164507 +0000 UTC m=+341.413190078" lastFinishedPulling="2026-04-16 16:54:21.5698507 +0000 UTC m=+343.852876562" observedRunningTime="2026-04-16 16:54:22.49107465 +0000 UTC m=+344.774100227" watchObservedRunningTime="2026-04-16 16:54:22.492375577 +0000 UTC m=+344.775401157" Apr 16 16:54:32.525106 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:32.525017 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-75f64d5d79-lg9d9"] Apr 16 16:54:32.527564 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:32.527548 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-75f64d5d79-lg9d9" Apr 16 16:54:32.532153 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:32.532105 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 16:54:32.532301 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:32.532105 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-sdzgj\"" Apr 16 16:54:32.532301 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:32.532286 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 16:54:32.532441 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:32.532144 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 16:54:32.532441 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:32.532151 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 16:54:32.532441 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:32.532167 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 16:54:32.532441 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:32.532141 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 16:54:32.532642 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:32.532399 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 16:54:32.537129 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:32.537108 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 16:54:32.541079 ip-10-0-143-10 kubenswrapper[2574]: 
I0416 16:54:32.541056 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-75f64d5d79-lg9d9"] Apr 16 16:54:32.618334 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:32.618297 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fa23b615-dc8a-4c69-9454-d85ee27b097b-console-config\") pod \"console-75f64d5d79-lg9d9\" (UID: \"fa23b615-dc8a-4c69-9454-d85ee27b097b\") " pod="openshift-console/console-75f64d5d79-lg9d9" Apr 16 16:54:32.618522 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:32.618373 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fa23b615-dc8a-4c69-9454-d85ee27b097b-service-ca\") pod \"console-75f64d5d79-lg9d9\" (UID: \"fa23b615-dc8a-4c69-9454-d85ee27b097b\") " pod="openshift-console/console-75f64d5d79-lg9d9" Apr 16 16:54:32.618522 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:32.618408 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa23b615-dc8a-4c69-9454-d85ee27b097b-console-serving-cert\") pod \"console-75f64d5d79-lg9d9\" (UID: \"fa23b615-dc8a-4c69-9454-d85ee27b097b\") " pod="openshift-console/console-75f64d5d79-lg9d9" Apr 16 16:54:32.618522 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:32.618432 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa23b615-dc8a-4c69-9454-d85ee27b097b-trusted-ca-bundle\") pod \"console-75f64d5d79-lg9d9\" (UID: \"fa23b615-dc8a-4c69-9454-d85ee27b097b\") " pod="openshift-console/console-75f64d5d79-lg9d9" Apr 16 16:54:32.618522 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:32.618495 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-7k7n8\" (UniqueName: \"kubernetes.io/projected/fa23b615-dc8a-4c69-9454-d85ee27b097b-kube-api-access-7k7n8\") pod \"console-75f64d5d79-lg9d9\" (UID: \"fa23b615-dc8a-4c69-9454-d85ee27b097b\") " pod="openshift-console/console-75f64d5d79-lg9d9" Apr 16 16:54:32.618762 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:32.618561 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fa23b615-dc8a-4c69-9454-d85ee27b097b-oauth-serving-cert\") pod \"console-75f64d5d79-lg9d9\" (UID: \"fa23b615-dc8a-4c69-9454-d85ee27b097b\") " pod="openshift-console/console-75f64d5d79-lg9d9" Apr 16 16:54:32.618762 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:32.618627 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fa23b615-dc8a-4c69-9454-d85ee27b097b-console-oauth-config\") pod \"console-75f64d5d79-lg9d9\" (UID: \"fa23b615-dc8a-4c69-9454-d85ee27b097b\") " pod="openshift-console/console-75f64d5d79-lg9d9" Apr 16 16:54:32.719718 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:32.719681 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fa23b615-dc8a-4c69-9454-d85ee27b097b-console-config\") pod \"console-75f64d5d79-lg9d9\" (UID: \"fa23b615-dc8a-4c69-9454-d85ee27b097b\") " pod="openshift-console/console-75f64d5d79-lg9d9" Apr 16 16:54:32.719900 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:32.719734 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fa23b615-dc8a-4c69-9454-d85ee27b097b-service-ca\") pod \"console-75f64d5d79-lg9d9\" (UID: \"fa23b615-dc8a-4c69-9454-d85ee27b097b\") " pod="openshift-console/console-75f64d5d79-lg9d9" Apr 16 16:54:32.719900 ip-10-0-143-10 
kubenswrapper[2574]: I0416 16:54:32.719751 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa23b615-dc8a-4c69-9454-d85ee27b097b-console-serving-cert\") pod \"console-75f64d5d79-lg9d9\" (UID: \"fa23b615-dc8a-4c69-9454-d85ee27b097b\") " pod="openshift-console/console-75f64d5d79-lg9d9" Apr 16 16:54:32.719900 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:32.719770 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa23b615-dc8a-4c69-9454-d85ee27b097b-trusted-ca-bundle\") pod \"console-75f64d5d79-lg9d9\" (UID: \"fa23b615-dc8a-4c69-9454-d85ee27b097b\") " pod="openshift-console/console-75f64d5d79-lg9d9" Apr 16 16:54:32.719900 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:32.719793 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7k7n8\" (UniqueName: \"kubernetes.io/projected/fa23b615-dc8a-4c69-9454-d85ee27b097b-kube-api-access-7k7n8\") pod \"console-75f64d5d79-lg9d9\" (UID: \"fa23b615-dc8a-4c69-9454-d85ee27b097b\") " pod="openshift-console/console-75f64d5d79-lg9d9" Apr 16 16:54:32.719900 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:32.719842 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fa23b615-dc8a-4c69-9454-d85ee27b097b-oauth-serving-cert\") pod \"console-75f64d5d79-lg9d9\" (UID: \"fa23b615-dc8a-4c69-9454-d85ee27b097b\") " pod="openshift-console/console-75f64d5d79-lg9d9" Apr 16 16:54:32.720190 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:32.719904 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fa23b615-dc8a-4c69-9454-d85ee27b097b-console-oauth-config\") pod \"console-75f64d5d79-lg9d9\" (UID: \"fa23b615-dc8a-4c69-9454-d85ee27b097b\") " 
pod="openshift-console/console-75f64d5d79-lg9d9" Apr 16 16:54:32.720584 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:32.720552 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fa23b615-dc8a-4c69-9454-d85ee27b097b-service-ca\") pod \"console-75f64d5d79-lg9d9\" (UID: \"fa23b615-dc8a-4c69-9454-d85ee27b097b\") " pod="openshift-console/console-75f64d5d79-lg9d9" Apr 16 16:54:32.720701 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:32.720608 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fa23b615-dc8a-4c69-9454-d85ee27b097b-oauth-serving-cert\") pod \"console-75f64d5d79-lg9d9\" (UID: \"fa23b615-dc8a-4c69-9454-d85ee27b097b\") " pod="openshift-console/console-75f64d5d79-lg9d9" Apr 16 16:54:32.720766 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:32.720744 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fa23b615-dc8a-4c69-9454-d85ee27b097b-console-config\") pod \"console-75f64d5d79-lg9d9\" (UID: \"fa23b615-dc8a-4c69-9454-d85ee27b097b\") " pod="openshift-console/console-75f64d5d79-lg9d9" Apr 16 16:54:32.720806 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:32.720756 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa23b615-dc8a-4c69-9454-d85ee27b097b-trusted-ca-bundle\") pod \"console-75f64d5d79-lg9d9\" (UID: \"fa23b615-dc8a-4c69-9454-d85ee27b097b\") " pod="openshift-console/console-75f64d5d79-lg9d9" Apr 16 16:54:32.722901 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:32.722871 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fa23b615-dc8a-4c69-9454-d85ee27b097b-console-oauth-config\") pod \"console-75f64d5d79-lg9d9\" (UID: 
\"fa23b615-dc8a-4c69-9454-d85ee27b097b\") " pod="openshift-console/console-75f64d5d79-lg9d9" Apr 16 16:54:32.722982 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:32.722881 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa23b615-dc8a-4c69-9454-d85ee27b097b-console-serving-cert\") pod \"console-75f64d5d79-lg9d9\" (UID: \"fa23b615-dc8a-4c69-9454-d85ee27b097b\") " pod="openshift-console/console-75f64d5d79-lg9d9" Apr 16 16:54:32.727855 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:32.727835 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k7n8\" (UniqueName: \"kubernetes.io/projected/fa23b615-dc8a-4c69-9454-d85ee27b097b-kube-api-access-7k7n8\") pod \"console-75f64d5d79-lg9d9\" (UID: \"fa23b615-dc8a-4c69-9454-d85ee27b097b\") " pod="openshift-console/console-75f64d5d79-lg9d9" Apr 16 16:54:32.838800 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:32.838762 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-75f64d5d79-lg9d9" Apr 16 16:54:32.954009 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:32.953986 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-75f64d5d79-lg9d9"] Apr 16 16:54:32.955993 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:54:32.955967 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa23b615_dc8a_4c69_9454_d85ee27b097b.slice/crio-fb2f56f090e68c5e994c1bb9625aad9e6a50cd63c8662135ec840e3d75d84e38 WatchSource:0}: Error finding container fb2f56f090e68c5e994c1bb9625aad9e6a50cd63c8662135ec840e3d75d84e38: Status 404 returned error can't find the container with id fb2f56f090e68c5e994c1bb9625aad9e6a50cd63c8662135ec840e3d75d84e38 Apr 16 16:54:33.480610 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:33.480580 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-cwpbt" Apr 16 16:54:33.507523 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:33.507491 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75f64d5d79-lg9d9" event={"ID":"fa23b615-dc8a-4c69-9454-d85ee27b097b","Type":"ContainerStarted","Data":"a54423200bb33cbabacc3038246436e67300b67fbe3c71f4e751f2ad9ca883a9"} Apr 16 16:54:33.507523 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:33.507526 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75f64d5d79-lg9d9" event={"ID":"fa23b615-dc8a-4c69-9454-d85ee27b097b","Type":"ContainerStarted","Data":"fb2f56f090e68c5e994c1bb9625aad9e6a50cd63c8662135ec840e3d75d84e38"} Apr 16 16:54:33.526698 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:33.526638 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-75f64d5d79-lg9d9" podStartSLOduration=1.526623624 podStartE2EDuration="1.526623624s" 
podCreationTimestamp="2026-04-16 16:54:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:54:33.524717089 +0000 UTC m=+355.807742672" watchObservedRunningTime="2026-04-16 16:54:33.526623624 +0000 UTC m=+355.809649201" Apr 16 16:54:42.839721 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:42.839685 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-75f64d5d79-lg9d9" Apr 16 16:54:42.839721 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:42.839725 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-75f64d5d79-lg9d9" Apr 16 16:54:42.844831 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:42.844805 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-75f64d5d79-lg9d9" Apr 16 16:54:43.539963 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:54:43.539933 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-75f64d5d79-lg9d9" Apr 16 16:55:12.744407 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:12.744372 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-5ndg6"] Apr 16 16:55:12.747789 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:12.747771 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-5ndg6" Apr 16 16:55:12.750678 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:12.750643 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-rzzrl\"" Apr 16 16:55:12.750799 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:12.750645 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 16 16:55:12.754893 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:12.754702 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-5ndg6"] Apr 16 16:55:12.772590 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:12.772561 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-289nx\" (UniqueName: \"kubernetes.io/projected/a3acac4b-3788-42e7-80b6-f911493710b0-kube-api-access-289nx\") pod \"limitador-limitador-64c8f475fb-5ndg6\" (UID: \"a3acac4b-3788-42e7-80b6-f911493710b0\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-5ndg6" Apr 16 16:55:12.772711 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:12.772594 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/a3acac4b-3788-42e7-80b6-f911493710b0-config-file\") pod \"limitador-limitador-64c8f475fb-5ndg6\" (UID: \"a3acac4b-3788-42e7-80b6-f911493710b0\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-5ndg6" Apr 16 16:55:12.834250 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:12.834222 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-5ndg6"] Apr 16 16:55:12.873429 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:12.873399 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-289nx\" 
(UniqueName: \"kubernetes.io/projected/a3acac4b-3788-42e7-80b6-f911493710b0-kube-api-access-289nx\") pod \"limitador-limitador-64c8f475fb-5ndg6\" (UID: \"a3acac4b-3788-42e7-80b6-f911493710b0\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-5ndg6" Apr 16 16:55:12.873590 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:12.873434 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/a3acac4b-3788-42e7-80b6-f911493710b0-config-file\") pod \"limitador-limitador-64c8f475fb-5ndg6\" (UID: \"a3acac4b-3788-42e7-80b6-f911493710b0\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-5ndg6" Apr 16 16:55:12.874006 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:12.873988 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/a3acac4b-3788-42e7-80b6-f911493710b0-config-file\") pod \"limitador-limitador-64c8f475fb-5ndg6\" (UID: \"a3acac4b-3788-42e7-80b6-f911493710b0\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-5ndg6" Apr 16 16:55:12.882524 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:12.882495 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-289nx\" (UniqueName: \"kubernetes.io/projected/a3acac4b-3788-42e7-80b6-f911493710b0-kube-api-access-289nx\") pod \"limitador-limitador-64c8f475fb-5ndg6\" (UID: \"a3acac4b-3788-42e7-80b6-f911493710b0\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-5ndg6" Apr 16 16:55:13.058506 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:13.058479 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-5ndg6" Apr 16 16:55:13.178440 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:13.178419 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-5ndg6"] Apr 16 16:55:13.180938 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:55:13.180911 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3acac4b_3788_42e7_80b6_f911493710b0.slice/crio-b3227638f89113acff281c9a76fde9f6ba4ca9c1b25ed5851cba92cfce2ba3f6 WatchSource:0}: Error finding container b3227638f89113acff281c9a76fde9f6ba4ca9c1b25ed5851cba92cfce2ba3f6: Status 404 returned error can't find the container with id b3227638f89113acff281c9a76fde9f6ba4ca9c1b25ed5851cba92cfce2ba3f6 Apr 16 16:55:13.627717 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:13.627681 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-5ndg6" event={"ID":"a3acac4b-3788-42e7-80b6-f911493710b0","Type":"ContainerStarted","Data":"b3227638f89113acff281c9a76fde9f6ba4ca9c1b25ed5851cba92cfce2ba3f6"} Apr 16 16:55:17.644213 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:17.644178 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-5ndg6" event={"ID":"a3acac4b-3788-42e7-80b6-f911493710b0","Type":"ContainerStarted","Data":"86ad47adcdd79963b93c538f553560a62ea492ede9cb318aff6f37361bdd4eb4"} Apr 16 16:55:17.644587 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:17.644236 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-5ndg6" Apr 16 16:55:17.662759 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:17.662710 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-64c8f475fb-5ndg6" podStartSLOduration=2.029640547 
podStartE2EDuration="5.662696195s" podCreationTimestamp="2026-04-16 16:55:12 +0000 UTC" firstStartedPulling="2026-04-16 16:55:13.182853763 +0000 UTC m=+395.465879324" lastFinishedPulling="2026-04-16 16:55:16.815909417 +0000 UTC m=+399.098934972" observedRunningTime="2026-04-16 16:55:17.660303684 +0000 UTC m=+399.943329282" watchObservedRunningTime="2026-04-16 16:55:17.662696195 +0000 UTC m=+399.945721771" Apr 16 16:55:28.648708 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:28.648678 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-5ndg6" Apr 16 16:55:37.285148 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:37.285113 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-5ndg6"] Apr 16 16:55:37.285518 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:37.285351 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-64c8f475fb-5ndg6" podUID="a3acac4b-3788-42e7-80b6-f911493710b0" containerName="limitador" containerID="cri-o://86ad47adcdd79963b93c538f553560a62ea492ede9cb318aff6f37361bdd4eb4" gracePeriod=30 Apr 16 16:55:37.715290 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:37.715258 2574 generic.go:358] "Generic (PLEG): container finished" podID="a3acac4b-3788-42e7-80b6-f911493710b0" containerID="86ad47adcdd79963b93c538f553560a62ea492ede9cb318aff6f37361bdd4eb4" exitCode=0 Apr 16 16:55:37.715459 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:37.715334 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-5ndg6" event={"ID":"a3acac4b-3788-42e7-80b6-f911493710b0","Type":"ContainerDied","Data":"86ad47adcdd79963b93c538f553560a62ea492ede9cb318aff6f37361bdd4eb4"} Apr 16 16:55:38.221907 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:38.221886 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-5ndg6" Apr 16 16:55:38.301093 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:38.301049 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-289nx\" (UniqueName: \"kubernetes.io/projected/a3acac4b-3788-42e7-80b6-f911493710b0-kube-api-access-289nx\") pod \"a3acac4b-3788-42e7-80b6-f911493710b0\" (UID: \"a3acac4b-3788-42e7-80b6-f911493710b0\") " Apr 16 16:55:38.301513 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:38.301139 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/a3acac4b-3788-42e7-80b6-f911493710b0-config-file\") pod \"a3acac4b-3788-42e7-80b6-f911493710b0\" (UID: \"a3acac4b-3788-42e7-80b6-f911493710b0\") " Apr 16 16:55:38.301513 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:38.301454 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3acac4b-3788-42e7-80b6-f911493710b0-config-file" (OuterVolumeSpecName: "config-file") pod "a3acac4b-3788-42e7-80b6-f911493710b0" (UID: "a3acac4b-3788-42e7-80b6-f911493710b0"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:55:38.303041 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:38.303021 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3acac4b-3788-42e7-80b6-f911493710b0-kube-api-access-289nx" (OuterVolumeSpecName: "kube-api-access-289nx") pod "a3acac4b-3788-42e7-80b6-f911493710b0" (UID: "a3acac4b-3788-42e7-80b6-f911493710b0"). InnerVolumeSpecName "kube-api-access-289nx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:55:38.402619 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:38.402535 2574 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/a3acac4b-3788-42e7-80b6-f911493710b0-config-file\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 16 16:55:38.402619 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:38.402570 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-289nx\" (UniqueName: \"kubernetes.io/projected/a3acac4b-3788-42e7-80b6-f911493710b0-kube-api-access-289nx\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 16 16:55:38.719719 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:38.719625 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-5ndg6" event={"ID":"a3acac4b-3788-42e7-80b6-f911493710b0","Type":"ContainerDied","Data":"b3227638f89113acff281c9a76fde9f6ba4ca9c1b25ed5851cba92cfce2ba3f6"} Apr 16 16:55:38.719719 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:38.719694 2574 scope.go:117] "RemoveContainer" containerID="86ad47adcdd79963b93c538f553560a62ea492ede9cb318aff6f37361bdd4eb4" Apr 16 16:55:38.719922 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:38.719635 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-5ndg6" Apr 16 16:55:38.741235 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:38.741211 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-5ndg6"] Apr 16 16:55:38.746966 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:38.746946 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-5ndg6"] Apr 16 16:55:40.274471 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:40.274437 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3acac4b-3788-42e7-80b6-f911493710b0" path="/var/lib/kubelet/pods/a3acac4b-3788-42e7-80b6-f911493710b0/volumes" Apr 16 16:55:56.566162 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:56.566133 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-tvvcc"] Apr 16 16:55:56.566594 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:56.566458 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a3acac4b-3788-42e7-80b6-f911493710b0" containerName="limitador" Apr 16 16:55:56.566594 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:56.566469 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3acac4b-3788-42e7-80b6-f911493710b0" containerName="limitador" Apr 16 16:55:56.566594 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:56.566551 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="a3acac4b-3788-42e7-80b6-f911493710b0" containerName="limitador" Apr 16 16:55:56.584195 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:56.584163 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-tvvcc"] Apr 16 16:55:56.584340 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:56.584275 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tvvcc" Apr 16 16:55:56.587400 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:56.587378 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 16:55:56.587550 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:56.587508 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 16 16:55:56.587550 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:56.587533 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 16:55:56.587686 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:56.587592 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-2mwsx\"" Apr 16 16:55:56.587736 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:56.587685 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 16 16:55:56.588878 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:56.588854 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 16 16:55:56.588988 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:56.588887 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 16 16:55:56.660460 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:56.660428 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e9dbcc4d-23f9-4e47-bde8-9d690265fbfd-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-tvvcc\" (UID: \"e9dbcc4d-23f9-4e47-bde8-9d690265fbfd\") " 
pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tvvcc" Apr 16 16:55:56.660624 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:56.660495 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/e9dbcc4d-23f9-4e47-bde8-9d690265fbfd-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-tvvcc\" (UID: \"e9dbcc4d-23f9-4e47-bde8-9d690265fbfd\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tvvcc" Apr 16 16:55:56.660624 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:56.660526 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e9dbcc4d-23f9-4e47-bde8-9d690265fbfd-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-tvvcc\" (UID: \"e9dbcc4d-23f9-4e47-bde8-9d690265fbfd\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tvvcc" Apr 16 16:55:56.660624 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:56.660556 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/e9dbcc4d-23f9-4e47-bde8-9d690265fbfd-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-tvvcc\" (UID: \"e9dbcc4d-23f9-4e47-bde8-9d690265fbfd\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tvvcc" Apr 16 16:55:56.660785 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:56.660634 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/e9dbcc4d-23f9-4e47-bde8-9d690265fbfd-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-tvvcc\" (UID: \"e9dbcc4d-23f9-4e47-bde8-9d690265fbfd\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tvvcc" Apr 16 16:55:56.660785 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:56.660745 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clzlp\" (UniqueName: \"kubernetes.io/projected/e9dbcc4d-23f9-4e47-bde8-9d690265fbfd-kube-api-access-clzlp\") pod \"istiod-openshift-gateway-55ff986f96-tvvcc\" (UID: \"e9dbcc4d-23f9-4e47-bde8-9d690265fbfd\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tvvcc" Apr 16 16:55:56.660857 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:56.660789 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/e9dbcc4d-23f9-4e47-bde8-9d690265fbfd-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-tvvcc\" (UID: \"e9dbcc4d-23f9-4e47-bde8-9d690265fbfd\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tvvcc" Apr 16 16:55:56.761787 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:56.761751 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clzlp\" (UniqueName: \"kubernetes.io/projected/e9dbcc4d-23f9-4e47-bde8-9d690265fbfd-kube-api-access-clzlp\") pod \"istiod-openshift-gateway-55ff986f96-tvvcc\" (UID: \"e9dbcc4d-23f9-4e47-bde8-9d690265fbfd\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tvvcc" Apr 16 16:55:56.761961 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:56.761795 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/e9dbcc4d-23f9-4e47-bde8-9d690265fbfd-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-tvvcc\" (UID: \"e9dbcc4d-23f9-4e47-bde8-9d690265fbfd\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tvvcc" Apr 16 16:55:56.761961 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:56.761827 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/e9dbcc4d-23f9-4e47-bde8-9d690265fbfd-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-tvvcc\" (UID: \"e9dbcc4d-23f9-4e47-bde8-9d690265fbfd\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tvvcc" Apr 16 16:55:56.761961 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:56.761864 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/e9dbcc4d-23f9-4e47-bde8-9d690265fbfd-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-tvvcc\" (UID: \"e9dbcc4d-23f9-4e47-bde8-9d690265fbfd\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tvvcc" Apr 16 16:55:56.761961 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:56.761883 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e9dbcc4d-23f9-4e47-bde8-9d690265fbfd-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-tvvcc\" (UID: \"e9dbcc4d-23f9-4e47-bde8-9d690265fbfd\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tvvcc" Apr 16 16:55:56.761961 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:56.761900 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/e9dbcc4d-23f9-4e47-bde8-9d690265fbfd-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-tvvcc\" (UID: \"e9dbcc4d-23f9-4e47-bde8-9d690265fbfd\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tvvcc" Apr 16 16:55:56.762224 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:56.762048 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/e9dbcc4d-23f9-4e47-bde8-9d690265fbfd-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-tvvcc\" (UID: \"e9dbcc4d-23f9-4e47-bde8-9d690265fbfd\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tvvcc" 
Apr 16 16:55:56.762618 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:56.762590 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/e9dbcc4d-23f9-4e47-bde8-9d690265fbfd-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-tvvcc\" (UID: \"e9dbcc4d-23f9-4e47-bde8-9d690265fbfd\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tvvcc" Apr 16 16:55:56.764331 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:56.764302 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e9dbcc4d-23f9-4e47-bde8-9d690265fbfd-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-tvvcc\" (UID: \"e9dbcc4d-23f9-4e47-bde8-9d690265fbfd\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tvvcc" Apr 16 16:55:56.764454 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:56.764394 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/e9dbcc4d-23f9-4e47-bde8-9d690265fbfd-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-tvvcc\" (UID: \"e9dbcc4d-23f9-4e47-bde8-9d690265fbfd\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tvvcc" Apr 16 16:55:56.764531 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:56.764515 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/e9dbcc4d-23f9-4e47-bde8-9d690265fbfd-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-tvvcc\" (UID: \"e9dbcc4d-23f9-4e47-bde8-9d690265fbfd\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tvvcc" Apr 16 16:55:56.764646 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:56.764629 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/e9dbcc4d-23f9-4e47-bde8-9d690265fbfd-cacerts\") 
pod \"istiod-openshift-gateway-55ff986f96-tvvcc\" (UID: \"e9dbcc4d-23f9-4e47-bde8-9d690265fbfd\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tvvcc" Apr 16 16:55:56.774869 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:56.774842 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-clzlp\" (UniqueName: \"kubernetes.io/projected/e9dbcc4d-23f9-4e47-bde8-9d690265fbfd-kube-api-access-clzlp\") pod \"istiod-openshift-gateway-55ff986f96-tvvcc\" (UID: \"e9dbcc4d-23f9-4e47-bde8-9d690265fbfd\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tvvcc" Apr 16 16:55:56.784527 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:56.784505 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e9dbcc4d-23f9-4e47-bde8-9d690265fbfd-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-tvvcc\" (UID: \"e9dbcc4d-23f9-4e47-bde8-9d690265fbfd\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tvvcc" Apr 16 16:55:56.894159 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:56.894080 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tvvcc" Apr 16 16:55:57.039946 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:57.039922 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-tvvcc"] Apr 16 16:55:57.041940 ip-10-0-143-10 kubenswrapper[2574]: W0416 16:55:57.041915 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9dbcc4d_23f9_4e47_bde8_9d690265fbfd.slice/crio-82b70b8e3547ee0000cf5925dc9bb3b1331abe44b7acc773901e4757cb9f4369 WatchSource:0}: Error finding container 82b70b8e3547ee0000cf5925dc9bb3b1331abe44b7acc773901e4757cb9f4369: Status 404 returned error can't find the container with id 82b70b8e3547ee0000cf5925dc9bb3b1331abe44b7acc773901e4757cb9f4369 Apr 16 16:55:57.782612 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:57.782576 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tvvcc" event={"ID":"e9dbcc4d-23f9-4e47-bde8-9d690265fbfd","Type":"ContainerStarted","Data":"82b70b8e3547ee0000cf5925dc9bb3b1331abe44b7acc773901e4757cb9f4369"} Apr 16 16:55:59.532613 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:59.532565 2574 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 16 16:55:59.532858 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:59.532668 2574 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 16 16:55:59.790508 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:59.790422 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tvvcc" 
event={"ID":"e9dbcc4d-23f9-4e47-bde8-9d690265fbfd","Type":"ContainerStarted","Data":"d92d6d94ee7d924c1c26593b95742c3a511ec88fb3d810506a836a371cbdfb53"} Apr 16 16:55:59.790691 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:59.790547 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tvvcc" Apr 16 16:55:59.879504 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:55:59.879450 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tvvcc" podStartSLOduration=1.391012065 podStartE2EDuration="3.879432476s" podCreationTimestamp="2026-04-16 16:55:56 +0000 UTC" firstStartedPulling="2026-04-16 16:55:57.043888684 +0000 UTC m=+439.326914238" lastFinishedPulling="2026-04-16 16:55:59.532309094 +0000 UTC m=+441.815334649" observedRunningTime="2026-04-16 16:55:59.879058429 +0000 UTC m=+442.162084005" watchObservedRunningTime="2026-04-16 16:55:59.879432476 +0000 UTC m=+442.162458053" Apr 16 16:56:00.795966 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:56:00.795936 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tvvcc" Apr 16 16:58:38.178426 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:58:38.178388 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2fwqq_a3db0e84-d73d-4e5b-a7c2-94290b442748/ovn-acl-logging/0.log" Apr 16 16:58:38.179645 ip-10-0-143-10 kubenswrapper[2574]: I0416 16:58:38.179623 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2fwqq_a3db0e84-d73d-4e5b-a7c2-94290b442748/ovn-acl-logging/0.log" Apr 16 17:03:38.202305 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:03:38.202231 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2fwqq_a3db0e84-d73d-4e5b-a7c2-94290b442748/ovn-acl-logging/0.log" Apr 16 17:03:38.204099 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:03:38.204078 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2fwqq_a3db0e84-d73d-4e5b-a7c2-94290b442748/ovn-acl-logging/0.log" Apr 16 17:05:45.017979 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:05:45.017947 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-7b5bf88c88-bd28w"] Apr 16 17:05:45.021129 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:05:45.021112 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-7b5bf88c88-bd28w" Apr 16 17:05:45.025560 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:05:45.025534 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-x6k2x\"" Apr 16 17:05:45.025560 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:05:45.025546 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 16 17:05:45.025762 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:05:45.025577 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 17:05:45.025762 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:05:45.025577 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 17:05:45.033310 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:05:45.033284 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-7b5bf88c88-bd28w"] Apr 16 17:05:45.190339 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:05:45.190298 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsxgm\" 
(UniqueName: \"kubernetes.io/projected/6a3f98cd-72b0-4b09-9c9a-d4c789d1fe59-kube-api-access-gsxgm\") pod \"llmisvc-controller-manager-7b5bf88c88-bd28w\" (UID: \"6a3f98cd-72b0-4b09-9c9a-d4c789d1fe59\") " pod="kserve/llmisvc-controller-manager-7b5bf88c88-bd28w" Apr 16 17:05:45.190497 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:05:45.190401 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6a3f98cd-72b0-4b09-9c9a-d4c789d1fe59-cert\") pod \"llmisvc-controller-manager-7b5bf88c88-bd28w\" (UID: \"6a3f98cd-72b0-4b09-9c9a-d4c789d1fe59\") " pod="kserve/llmisvc-controller-manager-7b5bf88c88-bd28w" Apr 16 17:05:45.291066 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:05:45.290975 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6a3f98cd-72b0-4b09-9c9a-d4c789d1fe59-cert\") pod \"llmisvc-controller-manager-7b5bf88c88-bd28w\" (UID: \"6a3f98cd-72b0-4b09-9c9a-d4c789d1fe59\") " pod="kserve/llmisvc-controller-manager-7b5bf88c88-bd28w" Apr 16 17:05:45.291243 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:05:45.291142 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gsxgm\" (UniqueName: \"kubernetes.io/projected/6a3f98cd-72b0-4b09-9c9a-d4c789d1fe59-kube-api-access-gsxgm\") pod \"llmisvc-controller-manager-7b5bf88c88-bd28w\" (UID: \"6a3f98cd-72b0-4b09-9c9a-d4c789d1fe59\") " pod="kserve/llmisvc-controller-manager-7b5bf88c88-bd28w" Apr 16 17:05:45.293415 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:05:45.293391 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6a3f98cd-72b0-4b09-9c9a-d4c789d1fe59-cert\") pod \"llmisvc-controller-manager-7b5bf88c88-bd28w\" (UID: \"6a3f98cd-72b0-4b09-9c9a-d4c789d1fe59\") " pod="kserve/llmisvc-controller-manager-7b5bf88c88-bd28w" Apr 16 17:05:45.299719 ip-10-0-143-10 
kubenswrapper[2574]: I0416 17:05:45.299698 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsxgm\" (UniqueName: \"kubernetes.io/projected/6a3f98cd-72b0-4b09-9c9a-d4c789d1fe59-kube-api-access-gsxgm\") pod \"llmisvc-controller-manager-7b5bf88c88-bd28w\" (UID: \"6a3f98cd-72b0-4b09-9c9a-d4c789d1fe59\") " pod="kserve/llmisvc-controller-manager-7b5bf88c88-bd28w" Apr 16 17:05:45.337370 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:05:45.337332 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-7b5bf88c88-bd28w" Apr 16 17:05:45.453432 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:05:45.453408 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-7b5bf88c88-bd28w"] Apr 16 17:05:45.455130 ip-10-0-143-10 kubenswrapper[2574]: W0416 17:05:45.455091 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6a3f98cd_72b0_4b09_9c9a_d4c789d1fe59.slice/crio-1016ab0c9d8f1dc8b5777668bd6bc1ff8c117c8aa3ecf7afc98bb56d13f7530c WatchSource:0}: Error finding container 1016ab0c9d8f1dc8b5777668bd6bc1ff8c117c8aa3ecf7afc98bb56d13f7530c: Status 404 returned error can't find the container with id 1016ab0c9d8f1dc8b5777668bd6bc1ff8c117c8aa3ecf7afc98bb56d13f7530c Apr 16 17:05:45.456433 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:05:45.456414 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 17:05:45.723040 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:05:45.723004 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-7b5bf88c88-bd28w" event={"ID":"6a3f98cd-72b0-4b09-9c9a-d4c789d1fe59","Type":"ContainerStarted","Data":"1016ab0c9d8f1dc8b5777668bd6bc1ff8c117c8aa3ecf7afc98bb56d13f7530c"} Apr 16 17:05:49.739498 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:05:49.739465 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve/llmisvc-controller-manager-7b5bf88c88-bd28w" event={"ID":"6a3f98cd-72b0-4b09-9c9a-d4c789d1fe59","Type":"ContainerStarted","Data":"77ff6f9ed00c24601c0e37f6fcea38b646ed6209249eaa967be064b01141b54c"} Apr 16 17:05:49.739984 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:05:49.739552 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-7b5bf88c88-bd28w" Apr 16 17:05:49.757248 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:05:49.757203 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-7b5bf88c88-bd28w" podStartSLOduration=1.347998517 podStartE2EDuration="4.75719207s" podCreationTimestamp="2026-04-16 17:05:45 +0000 UTC" firstStartedPulling="2026-04-16 17:05:45.4565369 +0000 UTC m=+1027.739562455" lastFinishedPulling="2026-04-16 17:05:48.865730449 +0000 UTC m=+1031.148756008" observedRunningTime="2026-04-16 17:05:49.756305449 +0000 UTC m=+1032.039331027" watchObservedRunningTime="2026-04-16 17:05:49.75719207 +0000 UTC m=+1032.040217672" Apr 16 17:06:20.744515 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:06:20.744486 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-7b5bf88c88-bd28w" Apr 16 17:08:38.230447 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:08:38.230419 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2fwqq_a3db0e84-d73d-4e5b-a7c2-94290b442748/ovn-acl-logging/0.log" Apr 16 17:08:38.232148 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:08:38.232125 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2fwqq_a3db0e84-d73d-4e5b-a7c2-94290b442748/ovn-acl-logging/0.log" Apr 16 17:13:24.984090 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:24.984055 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lflmt"] 
Apr 16 17:13:24.987700 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:24.987679 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lflmt" Apr 16 17:13:24.990693 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:24.990670 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\"" Apr 16 17:13:24.990693 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:24.990688 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-2-openshift-default-dockercfg-z2vnv\"" Apr 16 17:13:24.990871 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:24.990671 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 17:13:24.990871 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:24.990707 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 17:13:24.998808 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:24.998788 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lflmt"] Apr 16 17:13:25.087784 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:25.087747 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/e0639f2f-1392-44f1-8a19-0675af572a08-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-lflmt\" (UID: \"e0639f2f-1392-44f1-8a19-0675af572a08\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lflmt" Apr 16 17:13:25.087992 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:25.087790 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" 
(UniqueName: \"kubernetes.io/downward-api/e0639f2f-1392-44f1-8a19-0675af572a08-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-lflmt\" (UID: \"e0639f2f-1392-44f1-8a19-0675af572a08\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lflmt" Apr 16 17:13:25.087992 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:25.087911 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/e0639f2f-1392-44f1-8a19-0675af572a08-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-lflmt\" (UID: \"e0639f2f-1392-44f1-8a19-0675af572a08\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lflmt" Apr 16 17:13:25.087992 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:25.087958 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e0639f2f-1392-44f1-8a19-0675af572a08-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-lflmt\" (UID: \"e0639f2f-1392-44f1-8a19-0675af572a08\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lflmt" Apr 16 17:13:25.087992 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:25.087984 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzq74\" (UniqueName: \"kubernetes.io/projected/e0639f2f-1392-44f1-8a19-0675af572a08-kube-api-access-mzq74\") pod \"router-gateway-2-openshift-default-6866b85949-lflmt\" (UID: \"e0639f2f-1392-44f1-8a19-0675af572a08\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lflmt" Apr 16 17:13:25.088182 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:25.088034 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: 
\"kubernetes.io/configmap/e0639f2f-1392-44f1-8a19-0675af572a08-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-lflmt\" (UID: \"e0639f2f-1392-44f1-8a19-0675af572a08\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lflmt" Apr 16 17:13:25.088182 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:25.088078 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/e0639f2f-1392-44f1-8a19-0675af572a08-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-lflmt\" (UID: \"e0639f2f-1392-44f1-8a19-0675af572a08\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lflmt" Apr 16 17:13:25.088182 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:25.088133 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/e0639f2f-1392-44f1-8a19-0675af572a08-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-lflmt\" (UID: \"e0639f2f-1392-44f1-8a19-0675af572a08\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lflmt" Apr 16 17:13:25.088182 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:25.088175 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/e0639f2f-1392-44f1-8a19-0675af572a08-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-lflmt\" (UID: \"e0639f2f-1392-44f1-8a19-0675af572a08\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lflmt" Apr 16 17:13:25.188783 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:25.188748 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/e0639f2f-1392-44f1-8a19-0675af572a08-workload-socket\") pod 
\"router-gateway-2-openshift-default-6866b85949-lflmt\" (UID: \"e0639f2f-1392-44f1-8a19-0675af572a08\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lflmt" Apr 16 17:13:25.188783 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:25.188791 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e0639f2f-1392-44f1-8a19-0675af572a08-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-lflmt\" (UID: \"e0639f2f-1392-44f1-8a19-0675af572a08\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lflmt" Apr 16 17:13:25.189028 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:25.188808 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mzq74\" (UniqueName: \"kubernetes.io/projected/e0639f2f-1392-44f1-8a19-0675af572a08-kube-api-access-mzq74\") pod \"router-gateway-2-openshift-default-6866b85949-lflmt\" (UID: \"e0639f2f-1392-44f1-8a19-0675af572a08\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lflmt" Apr 16 17:13:25.189028 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:25.188835 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/e0639f2f-1392-44f1-8a19-0675af572a08-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-lflmt\" (UID: \"e0639f2f-1392-44f1-8a19-0675af572a08\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lflmt" Apr 16 17:13:25.189028 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:25.188862 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/e0639f2f-1392-44f1-8a19-0675af572a08-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-lflmt\" (UID: \"e0639f2f-1392-44f1-8a19-0675af572a08\") " 
pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lflmt" Apr 16 17:13:25.189028 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:25.188885 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/e0639f2f-1392-44f1-8a19-0675af572a08-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-lflmt\" (UID: \"e0639f2f-1392-44f1-8a19-0675af572a08\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lflmt" Apr 16 17:13:25.189028 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:25.188916 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/e0639f2f-1392-44f1-8a19-0675af572a08-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-lflmt\" (UID: \"e0639f2f-1392-44f1-8a19-0675af572a08\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lflmt" Apr 16 17:13:25.189028 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:25.188949 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/e0639f2f-1392-44f1-8a19-0675af572a08-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-lflmt\" (UID: \"e0639f2f-1392-44f1-8a19-0675af572a08\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lflmt" Apr 16 17:13:25.189028 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:25.188973 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e0639f2f-1392-44f1-8a19-0675af572a08-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-lflmt\" (UID: \"e0639f2f-1392-44f1-8a19-0675af572a08\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lflmt" Apr 16 17:13:25.189354 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:25.189215 
2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/e0639f2f-1392-44f1-8a19-0675af572a08-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-lflmt\" (UID: \"e0639f2f-1392-44f1-8a19-0675af572a08\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lflmt" Apr 16 17:13:25.189354 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:25.189298 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/e0639f2f-1392-44f1-8a19-0675af572a08-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-lflmt\" (UID: \"e0639f2f-1392-44f1-8a19-0675af572a08\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lflmt" Apr 16 17:13:25.189465 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:25.189436 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/e0639f2f-1392-44f1-8a19-0675af572a08-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-lflmt\" (UID: \"e0639f2f-1392-44f1-8a19-0675af572a08\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lflmt" Apr 16 17:13:25.189528 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:25.189506 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/e0639f2f-1392-44f1-8a19-0675af572a08-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-lflmt\" (UID: \"e0639f2f-1392-44f1-8a19-0675af572a08\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lflmt" Apr 16 17:13:25.189690 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:25.189647 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: 
\"kubernetes.io/configmap/e0639f2f-1392-44f1-8a19-0675af572a08-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-lflmt\" (UID: \"e0639f2f-1392-44f1-8a19-0675af572a08\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lflmt" Apr 16 17:13:25.191372 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:25.191337 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/e0639f2f-1392-44f1-8a19-0675af572a08-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-lflmt\" (UID: \"e0639f2f-1392-44f1-8a19-0675af572a08\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lflmt" Apr 16 17:13:25.191545 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:25.191523 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e0639f2f-1392-44f1-8a19-0675af572a08-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-lflmt\" (UID: \"e0639f2f-1392-44f1-8a19-0675af572a08\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lflmt" Apr 16 17:13:25.196985 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:25.196959 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e0639f2f-1392-44f1-8a19-0675af572a08-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-lflmt\" (UID: \"e0639f2f-1392-44f1-8a19-0675af572a08\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lflmt" Apr 16 17:13:25.197077 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:25.196994 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzq74\" (UniqueName: \"kubernetes.io/projected/e0639f2f-1392-44f1-8a19-0675af572a08-kube-api-access-mzq74\") pod \"router-gateway-2-openshift-default-6866b85949-lflmt\" (UID: 
\"e0639f2f-1392-44f1-8a19-0675af572a08\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lflmt"
Apr 16 17:13:25.298595 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:25.298526 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lflmt"
Apr 16 17:13:25.417235 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:25.417212 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lflmt"]
Apr 16 17:13:25.419707 ip-10-0-143-10 kubenswrapper[2574]: W0416 17:13:25.419678 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0639f2f_1392_44f1_8a19_0675af572a08.slice/crio-1d700437ad51df9bf03206eb893f7b74f87aaf2cf93ff7e68b0814a6825bf1cd WatchSource:0}: Error finding container 1d700437ad51df9bf03206eb893f7b74f87aaf2cf93ff7e68b0814a6825bf1cd: Status 404 returned error can't find the container with id 1d700437ad51df9bf03206eb893f7b74f87aaf2cf93ff7e68b0814a6825bf1cd
Apr 16 17:13:25.421863 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:25.421847 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 17:13:26.259135 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:26.259101 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lflmt" event={"ID":"e0639f2f-1392-44f1-8a19-0675af572a08","Type":"ContainerStarted","Data":"1d700437ad51df9bf03206eb893f7b74f87aaf2cf93ff7e68b0814a6825bf1cd"}
Apr 16 17:13:34.879867 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:34.879830 2574 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
Apr 16 17:13:34.880152 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:34.879904 2574 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
Apr 16 17:13:34.880152 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:34.879934 2574 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
Apr 16 17:13:35.290558 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:35.290479 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lflmt" event={"ID":"e0639f2f-1392-44f1-8a19-0675af572a08","Type":"ContainerStarted","Data":"edaf8151bf758992b0d0e03a38429e03f2c85382803244b731e3334564f68a31"}
Apr 16 17:13:35.298883 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:35.298864 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lflmt"
Apr 16 17:13:35.299941 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:35.299899 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lflmt" podUID="e0639f2f-1392-44f1-8a19-0675af572a08" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.134.0.29:15021/healthz/ready\": dial tcp 10.134.0.29:15021: connect: connection refused"
Apr 16 17:13:35.312026 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:35.311978 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lflmt" podStartSLOduration=1.8544202269999999 podStartE2EDuration="11.311961897s" podCreationTimestamp="2026-04-16 17:13:24 +0000 UTC" firstStartedPulling="2026-04-16 17:13:25.421974106 +0000 UTC m=+1487.704999660" lastFinishedPulling="2026-04-16 17:13:34.879515775 +0000 UTC m=+1497.162541330" observedRunningTime="2026-04-16 17:13:35.309544198 +0000 UTC m=+1497.592569801" watchObservedRunningTime="2026-04-16 17:13:35.311961897 +0000 UTC m=+1497.594987478"
Apr 16 17:13:36.299900 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:36.299864 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lflmt" podUID="e0639f2f-1392-44f1-8a19-0675af572a08" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.134.0.29:15021/healthz/ready\": dial tcp 10.134.0.29:15021: connect: connection refused"
Apr 16 17:13:37.299332 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:37.299296 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lflmt" podUID="e0639f2f-1392-44f1-8a19-0675af572a08" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.134.0.29:15021/healthz/ready\": dial tcp 10.134.0.29:15021: connect: connection refused"
Apr 16 17:13:38.262408 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:38.262376 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2fwqq_a3db0e84-d73d-4e5b-a7c2-94290b442748/ovn-acl-logging/0.log"
Apr 16 17:13:38.266602 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:38.266580 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2fwqq_a3db0e84-d73d-4e5b-a7c2-94290b442748/ovn-acl-logging/0.log"
Apr 16 17:13:38.302998 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:38.302971 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lflmt"
Apr 16 17:13:38.303256 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:38.303235 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lflmt"
Apr 16 17:13:38.304030 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:38.304011 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-lflmt"
Apr 16 17:13:43.766405 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:43.766371 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-587498644-znkwq"]
Apr 16 17:13:43.771277 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:43.771258 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-587498644-znkwq"
Apr 16 17:13:43.776029 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:43.776001 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-g4psd\""
Apr 16 17:13:43.776126 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:43.776043 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\""
Apr 16 17:13:43.780018 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:43.779998 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-587498644-znkwq"]
Apr 16 17:13:43.848507 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:43.848471 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b-dshm\") pod \"scheduler-inline-config-test-kserve-587498644-znkwq\" (UID: \"b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-587498644-znkwq"
Apr 16 17:13:43.848507 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:43.848502 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-587498644-znkwq\" (UID: \"b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-587498644-znkwq"
Apr 16 17:13:43.848755 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:43.848560 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b-home\") pod \"scheduler-inline-config-test-kserve-587498644-znkwq\" (UID: \"b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-587498644-znkwq"
Apr 16 17:13:43.848755 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:43.848596 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdfvd\" (UniqueName: \"kubernetes.io/projected/b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b-kube-api-access-qdfvd\") pod \"scheduler-inline-config-test-kserve-587498644-znkwq\" (UID: \"b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-587498644-znkwq"
Apr 16 17:13:43.848755 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:43.848715 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b-model-cache\") pod \"scheduler-inline-config-test-kserve-587498644-znkwq\" (UID: \"b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-587498644-znkwq"
Apr 16 17:13:43.848755 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:43.848749 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b-tls-certs\") pod \"scheduler-inline-config-test-kserve-587498644-znkwq\" (UID: \"b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-587498644-znkwq"
Apr 16 17:13:43.949434 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:43.949398 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qdfvd\" (UniqueName: \"kubernetes.io/projected/b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b-kube-api-access-qdfvd\") pod \"scheduler-inline-config-test-kserve-587498644-znkwq\" (UID: \"b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-587498644-znkwq"
Apr 16 17:13:43.949612 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:43.949452 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b-model-cache\") pod \"scheduler-inline-config-test-kserve-587498644-znkwq\" (UID: \"b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-587498644-znkwq"
Apr 16 17:13:43.949612 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:43.949483 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b-tls-certs\") pod \"scheduler-inline-config-test-kserve-587498644-znkwq\" (UID: \"b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-587498644-znkwq"
Apr 16 17:13:43.949612 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:43.949513 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b-dshm\") pod \"scheduler-inline-config-test-kserve-587498644-znkwq\" (UID: \"b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-587498644-znkwq"
Apr 16 17:13:43.949612 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:43.949530 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-587498644-znkwq\" (UID: \"b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-587498644-znkwq"
Apr 16 17:13:43.949612 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:43.949585 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b-home\") pod \"scheduler-inline-config-test-kserve-587498644-znkwq\" (UID: \"b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-587498644-znkwq"
Apr 16 17:13:43.949979 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:43.949955 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b-model-cache\") pod \"scheduler-inline-config-test-kserve-587498644-znkwq\" (UID: \"b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-587498644-znkwq"
Apr 16 17:13:43.949979 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:43.949970 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-587498644-znkwq\" (UID: \"b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-587498644-znkwq"
Apr 16 17:13:43.950092 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:43.950021 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b-home\") pod \"scheduler-inline-config-test-kserve-587498644-znkwq\" (UID: \"b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-587498644-znkwq"
Apr 16 17:13:43.951949 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:43.951929 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b-dshm\") pod \"scheduler-inline-config-test-kserve-587498644-znkwq\" (UID: \"b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-587498644-znkwq"
Apr 16 17:13:43.952189 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:43.952154 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b-tls-certs\") pod \"scheduler-inline-config-test-kserve-587498644-znkwq\" (UID: \"b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-587498644-znkwq"
Apr 16 17:13:43.958334 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:43.958308 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdfvd\" (UniqueName: \"kubernetes.io/projected/b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b-kube-api-access-qdfvd\") pod \"scheduler-inline-config-test-kserve-587498644-znkwq\" (UID: \"b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-587498644-znkwq"
Apr 16 17:13:44.082836 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:44.082805 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-587498644-znkwq"
Apr 16 17:13:44.204751 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:44.204561 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-587498644-znkwq"]
Apr 16 17:13:44.208620 ip-10-0-143-10 kubenswrapper[2574]: W0416 17:13:44.208592 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0f2b6f3_4126_492f_bb7f_b5c4680f7d6b.slice/crio-13257a79153a064b9ab4162894b36258251a6e24c28a0fe6fa06d4859b0d1876 WatchSource:0}: Error finding container 13257a79153a064b9ab4162894b36258251a6e24c28a0fe6fa06d4859b0d1876: Status 404 returned error can't find the container with id 13257a79153a064b9ab4162894b36258251a6e24c28a0fe6fa06d4859b0d1876
Apr 16 17:13:44.319479 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:44.319447 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-587498644-znkwq" event={"ID":"b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b","Type":"ContainerStarted","Data":"13257a79153a064b9ab4162894b36258251a6e24c28a0fe6fa06d4859b0d1876"}
Apr 16 17:13:49.338943 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:49.338907 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-587498644-znkwq" event={"ID":"b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b","Type":"ContainerStarted","Data":"1c00dab863f8d5fc1f82580ae341ad43f8d7e77e024469fed8d1dda0f97b2317"}
Apr 16 17:13:53.354254 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:53.354223 2574 generic.go:358] "Generic (PLEG): container finished" podID="b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b" containerID="1c00dab863f8d5fc1f82580ae341ad43f8d7e77e024469fed8d1dda0f97b2317" exitCode=0
Apr 16 17:13:53.354254 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:53.354257 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-587498644-znkwq" event={"ID":"b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b","Type":"ContainerDied","Data":"1c00dab863f8d5fc1f82580ae341ad43f8d7e77e024469fed8d1dda0f97b2317"}
Apr 16 17:13:55.362357 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:55.362320 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-587498644-znkwq" event={"ID":"b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b","Type":"ContainerStarted","Data":"3bab5b68a3e05b5b11775e74be3f9446ded2645d24a49929476e7f991414213e"}
Apr 16 17:13:55.379915 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:13:55.379859 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-587498644-znkwq" podStartSLOduration=2.022797145 podStartE2EDuration="12.379844302s" podCreationTimestamp="2026-04-16 17:13:43 +0000 UTC" firstStartedPulling="2026-04-16 17:13:44.21043689 +0000 UTC m=+1506.493462446" lastFinishedPulling="2026-04-16 17:13:54.567484045 +0000 UTC m=+1516.850509603" observedRunningTime="2026-04-16 17:13:55.379043396 +0000 UTC m=+1517.662068974" watchObservedRunningTime="2026-04-16 17:13:55.379844302 +0000 UTC m=+1517.662869882"
Apr 16 17:14:04.083626 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:14:04.083592 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-587498644-znkwq"
Apr 16 17:14:04.084121 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:14:04.083760 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-587498644-znkwq"
Apr 16 17:14:04.096029 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:14:04.096006 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-587498644-znkwq"
Apr 16 17:14:04.402444 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:14:04.402370 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-587498644-znkwq"
Apr 16 17:14:18.157843 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:14:18.157807 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-587498644-znkwq"]
Apr 16 17:14:18.158311 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:14:18.158101 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-587498644-znkwq" podUID="b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b" containerName="main" containerID="cri-o://3bab5b68a3e05b5b11775e74be3f9446ded2645d24a49929476e7f991414213e" gracePeriod=30
Apr 16 17:14:18.414190 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:14:18.414133 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-587498644-znkwq"
Apr 16 17:14:18.442996 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:14:18.442969 2574 generic.go:358] "Generic (PLEG): container finished" podID="b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b" containerID="3bab5b68a3e05b5b11775e74be3f9446ded2645d24a49929476e7f991414213e" exitCode=0
Apr 16 17:14:18.443145 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:14:18.443042 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-587498644-znkwq"
Apr 16 17:14:18.443145 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:14:18.443042 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-587498644-znkwq" event={"ID":"b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b","Type":"ContainerDied","Data":"3bab5b68a3e05b5b11775e74be3f9446ded2645d24a49929476e7f991414213e"}
Apr 16 17:14:18.443145 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:14:18.443084 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-587498644-znkwq" event={"ID":"b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b","Type":"ContainerDied","Data":"13257a79153a064b9ab4162894b36258251a6e24c28a0fe6fa06d4859b0d1876"}
Apr 16 17:14:18.443145 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:14:18.443103 2574 scope.go:117] "RemoveContainer" containerID="3bab5b68a3e05b5b11775e74be3f9446ded2645d24a49929476e7f991414213e"
Apr 16 17:14:18.453068 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:14:18.453054 2574 scope.go:117] "RemoveContainer" containerID="1c00dab863f8d5fc1f82580ae341ad43f8d7e77e024469fed8d1dda0f97b2317"
Apr 16 17:14:18.465718 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:14:18.465700 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b-home\") pod \"b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b\" (UID: \"b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b\") "
Apr 16 17:14:18.465819 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:14:18.465766 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b-dshm\") pod \"b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b\" (UID: \"b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b\") "
Apr 16 17:14:18.465819 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:14:18.465806 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b-kserve-provision-location\") pod \"b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b\" (UID: \"b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b\") "
Apr 16 17:14:18.465938 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:14:18.465865 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdfvd\" (UniqueName: \"kubernetes.io/projected/b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b-kube-api-access-qdfvd\") pod \"b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b\" (UID: \"b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b\") "
Apr 16 17:14:18.465938 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:14:18.465892 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b-model-cache\") pod \"b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b\" (UID: \"b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b\") "
Apr 16 17:14:18.465938 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:14:18.465920 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b-tls-certs\") pod \"b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b\" (UID: \"b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b\") "
Apr 16 17:14:18.466081 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:14:18.465966 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b-home" (OuterVolumeSpecName: "home") pod "b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b" (UID: "b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:14:18.466170 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:14:18.466138 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b-model-cache" (OuterVolumeSpecName: "model-cache") pod "b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b" (UID: "b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:14:18.466264 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:14:18.466239 2574 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b-model-cache\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\""
Apr 16 17:14:18.466264 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:14:18.466262 2574 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b-home\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\""
Apr 16 17:14:18.467870 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:14:18.467845 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b-dshm" (OuterVolumeSpecName: "dshm") pod "b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b" (UID: "b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:14:18.468016 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:14:18.467997 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b-kube-api-access-qdfvd" (OuterVolumeSpecName: "kube-api-access-qdfvd") pod "b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b" (UID: "b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b"). InnerVolumeSpecName "kube-api-access-qdfvd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 17:14:18.468284 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:14:18.468267 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b" (UID: "b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 17:14:18.520795 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:14:18.520775 2574 scope.go:117] "RemoveContainer" containerID="3bab5b68a3e05b5b11775e74be3f9446ded2645d24a49929476e7f991414213e"
Apr 16 17:14:18.521053 ip-10-0-143-10 kubenswrapper[2574]: E0416 17:14:18.521036 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bab5b68a3e05b5b11775e74be3f9446ded2645d24a49929476e7f991414213e\": container with ID starting with 3bab5b68a3e05b5b11775e74be3f9446ded2645d24a49929476e7f991414213e not found: ID does not exist" containerID="3bab5b68a3e05b5b11775e74be3f9446ded2645d24a49929476e7f991414213e"
Apr 16 17:14:18.521099 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:14:18.521062 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bab5b68a3e05b5b11775e74be3f9446ded2645d24a49929476e7f991414213e"} err="failed to get container status \"3bab5b68a3e05b5b11775e74be3f9446ded2645d24a49929476e7f991414213e\": rpc error: code = NotFound desc = could not find container \"3bab5b68a3e05b5b11775e74be3f9446ded2645d24a49929476e7f991414213e\": container with ID starting with 3bab5b68a3e05b5b11775e74be3f9446ded2645d24a49929476e7f991414213e not found: ID does not exist"
Apr 16 17:14:18.521099 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:14:18.521081 2574 scope.go:117] "RemoveContainer" containerID="1c00dab863f8d5fc1f82580ae341ad43f8d7e77e024469fed8d1dda0f97b2317"
Apr 16 17:14:18.521301 ip-10-0-143-10 kubenswrapper[2574]: E0416 17:14:18.521283 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c00dab863f8d5fc1f82580ae341ad43f8d7e77e024469fed8d1dda0f97b2317\": container with ID starting with 1c00dab863f8d5fc1f82580ae341ad43f8d7e77e024469fed8d1dda0f97b2317 not found: ID does not exist" containerID="1c00dab863f8d5fc1f82580ae341ad43f8d7e77e024469fed8d1dda0f97b2317"
Apr 16 17:14:18.521348 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:14:18.521308 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c00dab863f8d5fc1f82580ae341ad43f8d7e77e024469fed8d1dda0f97b2317"} err="failed to get container status \"1c00dab863f8d5fc1f82580ae341ad43f8d7e77e024469fed8d1dda0f97b2317\": rpc error: code = NotFound desc = could not find container \"1c00dab863f8d5fc1f82580ae341ad43f8d7e77e024469fed8d1dda0f97b2317\": container with ID starting with 1c00dab863f8d5fc1f82580ae341ad43f8d7e77e024469fed8d1dda0f97b2317 not found: ID does not exist"
Apr 16 17:14:18.530243 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:14:18.530219 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b" (UID: "b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:14:18.567427 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:14:18.567406 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qdfvd\" (UniqueName: \"kubernetes.io/projected/b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b-kube-api-access-qdfvd\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\""
Apr 16 17:14:18.567427 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:14:18.567426 2574 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b-tls-certs\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\""
Apr 16 17:14:18.567532 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:14:18.567437 2574 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b-dshm\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\""
Apr 16 17:14:18.567532 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:14:18.567446 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b-kserve-provision-location\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\""
Apr 16 17:14:18.765186 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:14:18.765163 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-587498644-znkwq"]
Apr 16 17:14:18.768803 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:14:18.768769 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-587498644-znkwq"]
Apr 16 17:14:20.274567 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:14:20.274527 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b" path="/var/lib/kubelet/pods/b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b/volumes"
Apr 16 17:15:39.588755 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:15:39.588668 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-lflmt_e0639f2f-1392-44f1-8a19-0675af572a08/istio-proxy/0.log"
Apr 16 17:15:40.614086 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:15:40.614063 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-lflmt_e0639f2f-1392-44f1-8a19-0675af572a08/istio-proxy/0.log"
Apr 16 17:15:41.630793 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:15:41.630763 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-lflmt_e0639f2f-1392-44f1-8a19-0675af572a08/istio-proxy/0.log"
Apr 16 17:15:42.611798 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:15:42.611770 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-lflmt_e0639f2f-1392-44f1-8a19-0675af572a08/istio-proxy/0.log"
Apr 16 17:15:43.586627 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:15:43.586597 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-lflmt_e0639f2f-1392-44f1-8a19-0675af572a08/istio-proxy/0.log"
Apr 16 17:15:44.549280 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:15:44.549248 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-lflmt_e0639f2f-1392-44f1-8a19-0675af572a08/istio-proxy/0.log"
Apr 16 17:15:45.542612 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:15:45.542580 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-lflmt_e0639f2f-1392-44f1-8a19-0675af572a08/istio-proxy/0.log"
Apr 16 17:15:46.521924 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:15:46.521890 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-lflmt_e0639f2f-1392-44f1-8a19-0675af572a08/istio-proxy/0.log"
Apr 16 17:15:47.552916 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:15:47.552885 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-lflmt_e0639f2f-1392-44f1-8a19-0675af572a08/istio-proxy/0.log"
Apr 16 17:15:48.573920 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:15:48.573897 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-lflmt_e0639f2f-1392-44f1-8a19-0675af572a08/istio-proxy/0.log"
Apr 16 17:15:49.565264 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:15:49.565239 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-lflmt_e0639f2f-1392-44f1-8a19-0675af572a08/istio-proxy/0.log"
Apr 16 17:15:50.550343 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:15:50.550315 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-lflmt_e0639f2f-1392-44f1-8a19-0675af572a08/istio-proxy/0.log"
Apr 16 17:15:51.533040 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:15:51.533012 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-lflmt_e0639f2f-1392-44f1-8a19-0675af572a08/istio-proxy/0.log"
Apr 16 17:15:52.514296 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:15:52.514266 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-lflmt_e0639f2f-1392-44f1-8a19-0675af572a08/istio-proxy/0.log"
Apr 16 17:15:53.504966 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:15:53.504937 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-tvvcc_e9dbcc4d-23f9-4e47-bde8-9d690265fbfd/discovery/0.log"
Apr 16 17:15:54.323219 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:15:54.323191 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-tvvcc_e9dbcc4d-23f9-4e47-bde8-9d690265fbfd/discovery/0.log"
Apr 16 17:15:55.245670 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:15:55.245635 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-cwpbt_f01a3ef9-1e51-4dec-bc39-b24e1f17f018/manager/0.log"
Apr 16 17:16:00.398672 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:00.398627 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-f8hkt_68f9da11-93da-44fa-ac64-d1075ee67259/global-pull-secret-syncer/0.log"
Apr 16 17:16:00.482439 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:00.482414 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-89n6j_ef20d3fb-2f3a-4af7-97e4-df2e2773f314/konnectivity-agent/0.log"
Apr 16 17:16:00.582827 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:00.582796 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-143-10.ec2.internal_f1bc04e409baab07763d6ca236ceaf1a/haproxy/0.log"
Apr 16 17:16:04.754727 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:04.754641 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-cwpbt_f01a3ef9-1e51-4dec-bc39-b24e1f17f018/manager/0.log"
Apr 16 17:16:05.661645 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:05.661616 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7/alertmanager/0.log"
Apr 16 17:16:05.682411 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:05.682383 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7/config-reloader/0.log"
Apr 16 17:16:05.701743 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:05.701724 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7/kube-rbac-proxy-web/0.log"
Apr 16 17:16:05.721264 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:05.721241 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7/kube-rbac-proxy/0.log"
Apr 16 17:16:05.743369 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:05.743303 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7/kube-rbac-proxy-metric/0.log"
Apr 16 17:16:05.762949 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:05.762930 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7/prom-label-proxy/0.log"
Apr 16 17:16:05.784302 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:05.784279 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_bc5dcfd6-23d2-4b36-8e6f-37b73cc6dcf7/init-config-reloader/0.log"
Apr 16 17:16:05.850048 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:05.850022 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-4db9b_6484e384-9c84-470e-869d-55dd04279464/kube-state-metrics/0.log"
Apr 16 17:16:05.867798 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:05.867774 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-4db9b_6484e384-9c84-470e-869d-55dd04279464/kube-rbac-proxy-main/0.log"
Apr 16 17:16:05.886873 ip-10-0-143-10
kubenswrapper[2574]: I0416 17:16:05.886851 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-4db9b_6484e384-9c84-470e-869d-55dd04279464/kube-rbac-proxy-self/0.log" Apr 16 17:16:05.909019 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:05.908998 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-79468b9bf5-qh26g_ba3481af-decf-4c56-b396-527c1e009a30/metrics-server/0.log" Apr 16 17:16:05.930873 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:05.930851 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-j58zk_2224a16f-0022-488d-a462-4d5ab10cb7bb/monitoring-plugin/0.log" Apr 16 17:16:05.965258 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:05.965235 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7zrbv_6db3b019-e54c-4966-83dd-cc7619e3f221/node-exporter/0.log" Apr 16 17:16:05.984820 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:05.984798 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7zrbv_6db3b019-e54c-4966-83dd-cc7619e3f221/kube-rbac-proxy/0.log" Apr 16 17:16:06.004250 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:06.004199 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7zrbv_6db3b019-e54c-4966-83dd-cc7619e3f221/init-textfile/0.log" Apr 16 17:16:06.164634 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:06.164605 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-z9p6c_fcc69899-6610-4ce7-803b-eaaaaa23fab8/kube-rbac-proxy-main/0.log" Apr 16 17:16:06.184261 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:06.184237 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-z9p6c_fcc69899-6610-4ce7-803b-eaaaaa23fab8/kube-rbac-proxy-self/0.log" Apr 16 17:16:06.206099 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:06.206077 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-z9p6c_fcc69899-6610-4ce7-803b-eaaaaa23fab8/openshift-state-metrics/0.log" Apr 16 17:16:06.244270 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:06.244246 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e8eacc20-f29a-4f10-81f6-b883b281b28e/prometheus/0.log" Apr 16 17:16:06.262216 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:06.262198 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e8eacc20-f29a-4f10-81f6-b883b281b28e/config-reloader/0.log" Apr 16 17:16:06.283174 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:06.283152 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e8eacc20-f29a-4f10-81f6-b883b281b28e/thanos-sidecar/0.log" Apr 16 17:16:06.304521 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:06.304497 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e8eacc20-f29a-4f10-81f6-b883b281b28e/kube-rbac-proxy-web/0.log" Apr 16 17:16:06.324058 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:06.324033 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e8eacc20-f29a-4f10-81f6-b883b281b28e/kube-rbac-proxy/0.log" Apr 16 17:16:06.344021 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:06.343999 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e8eacc20-f29a-4f10-81f6-b883b281b28e/kube-rbac-proxy-thanos/0.log" Apr 16 17:16:06.363051 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:06.363029 2574 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e8eacc20-f29a-4f10-81f6-b883b281b28e/init-config-reloader/0.log" Apr 16 17:16:06.482576 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:06.482548 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-f9bf64dc9-4xqt7_3dda195a-5e85-4304-a192-8af66801e3e5/telemeter-client/0.log" Apr 16 17:16:06.505512 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:06.505491 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-f9bf64dc9-4xqt7_3dda195a-5e85-4304-a192-8af66801e3e5/reload/0.log" Apr 16 17:16:06.528779 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:06.528723 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-f9bf64dc9-4xqt7_3dda195a-5e85-4304-a192-8af66801e3e5/kube-rbac-proxy/0.log" Apr 16 17:16:06.567795 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:06.567769 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-848496fcb-44nmq_5720f2ab-ce1b-4958-97fb-6899db6301a3/thanos-query/0.log" Apr 16 17:16:06.593572 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:06.593549 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-848496fcb-44nmq_5720f2ab-ce1b-4958-97fb-6899db6301a3/kube-rbac-proxy-web/0.log" Apr 16 17:16:06.616604 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:06.616585 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-848496fcb-44nmq_5720f2ab-ce1b-4958-97fb-6899db6301a3/kube-rbac-proxy/0.log" Apr 16 17:16:06.641353 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:06.641331 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-848496fcb-44nmq_5720f2ab-ce1b-4958-97fb-6899db6301a3/prom-label-proxy/0.log" Apr 16 17:16:06.661642 ip-10-0-143-10 kubenswrapper[2574]: I0416 
17:16:06.661623 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-848496fcb-44nmq_5720f2ab-ce1b-4958-97fb-6899db6301a3/kube-rbac-proxy-rules/0.log" Apr 16 17:16:06.683283 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:06.683258 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-848496fcb-44nmq_5720f2ab-ce1b-4958-97fb-6899db6301a3/kube-rbac-proxy-metrics/0.log" Apr 16 17:16:08.786325 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:08.786300 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-75f64d5d79-lg9d9_fa23b615-dc8a-4c69-9454-d85ee27b097b/console/0.log" Apr 16 17:16:09.214765 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:09.214729 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-msdj2/perf-node-gather-daemonset-jhx6r"] Apr 16 17:16:09.215320 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:09.215292 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b" containerName="storage-initializer" Apr 16 17:16:09.215320 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:09.215312 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b" containerName="storage-initializer" Apr 16 17:16:09.215478 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:09.215354 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b" containerName="main" Apr 16 17:16:09.215478 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:09.215363 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b" containerName="main" Apr 16 17:16:09.215478 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:09.215466 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="b0f2b6f3-4126-492f-bb7f-b5c4680f7d6b" containerName="main" Apr 
16 17:16:09.218762 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:09.218747 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-jhx6r" Apr 16 17:16:09.221776 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:09.221734 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-msdj2\"/\"openshift-service-ca.crt\"" Apr 16 17:16:09.223498 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:09.223437 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-msdj2\"/\"default-dockercfg-4rjqv\"" Apr 16 17:16:09.223671 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:09.223633 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-msdj2\"/\"kube-root-ca.crt\"" Apr 16 17:16:09.226344 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:09.226323 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-msdj2/perf-node-gather-daemonset-jhx6r"] Apr 16 17:16:09.374646 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:09.374613 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fe30b88a-1a03-41fe-8c74-be229ad44d9c-proc\") pod \"perf-node-gather-daemonset-jhx6r\" (UID: \"fe30b88a-1a03-41fe-8c74-be229ad44d9c\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-jhx6r" Apr 16 17:16:09.374815 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:09.374706 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fe30b88a-1a03-41fe-8c74-be229ad44d9c-sys\") pod \"perf-node-gather-daemonset-jhx6r\" (UID: \"fe30b88a-1a03-41fe-8c74-be229ad44d9c\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-jhx6r" Apr 16 17:16:09.374815 ip-10-0-143-10 
kubenswrapper[2574]: I0416 17:16:09.374811 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fe30b88a-1a03-41fe-8c74-be229ad44d9c-podres\") pod \"perf-node-gather-daemonset-jhx6r\" (UID: \"fe30b88a-1a03-41fe-8c74-be229ad44d9c\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-jhx6r" Apr 16 17:16:09.374949 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:09.374881 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fe30b88a-1a03-41fe-8c74-be229ad44d9c-lib-modules\") pod \"perf-node-gather-daemonset-jhx6r\" (UID: \"fe30b88a-1a03-41fe-8c74-be229ad44d9c\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-jhx6r" Apr 16 17:16:09.374949 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:09.374913 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpfrc\" (UniqueName: \"kubernetes.io/projected/fe30b88a-1a03-41fe-8c74-be229ad44d9c-kube-api-access-cpfrc\") pod \"perf-node-gather-daemonset-jhx6r\" (UID: \"fe30b88a-1a03-41fe-8c74-be229ad44d9c\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-jhx6r" Apr 16 17:16:09.475561 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:09.475468 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fe30b88a-1a03-41fe-8c74-be229ad44d9c-podres\") pod \"perf-node-gather-daemonset-jhx6r\" (UID: \"fe30b88a-1a03-41fe-8c74-be229ad44d9c\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-jhx6r" Apr 16 17:16:09.475561 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:09.475516 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fe30b88a-1a03-41fe-8c74-be229ad44d9c-lib-modules\") pod 
\"perf-node-gather-daemonset-jhx6r\" (UID: \"fe30b88a-1a03-41fe-8c74-be229ad44d9c\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-jhx6r" Apr 16 17:16:09.475561 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:09.475537 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cpfrc\" (UniqueName: \"kubernetes.io/projected/fe30b88a-1a03-41fe-8c74-be229ad44d9c-kube-api-access-cpfrc\") pod \"perf-node-gather-daemonset-jhx6r\" (UID: \"fe30b88a-1a03-41fe-8c74-be229ad44d9c\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-jhx6r" Apr 16 17:16:09.475809 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:09.475586 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fe30b88a-1a03-41fe-8c74-be229ad44d9c-proc\") pod \"perf-node-gather-daemonset-jhx6r\" (UID: \"fe30b88a-1a03-41fe-8c74-be229ad44d9c\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-jhx6r" Apr 16 17:16:09.475809 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:09.475616 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fe30b88a-1a03-41fe-8c74-be229ad44d9c-sys\") pod \"perf-node-gather-daemonset-jhx6r\" (UID: \"fe30b88a-1a03-41fe-8c74-be229ad44d9c\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-jhx6r" Apr 16 17:16:09.475809 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:09.475669 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fe30b88a-1a03-41fe-8c74-be229ad44d9c-podres\") pod \"perf-node-gather-daemonset-jhx6r\" (UID: \"fe30b88a-1a03-41fe-8c74-be229ad44d9c\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-jhx6r" Apr 16 17:16:09.475809 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:09.475673 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fe30b88a-1a03-41fe-8c74-be229ad44d9c-lib-modules\") pod \"perf-node-gather-daemonset-jhx6r\" (UID: \"fe30b88a-1a03-41fe-8c74-be229ad44d9c\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-jhx6r" Apr 16 17:16:09.475950 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:09.475846 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fe30b88a-1a03-41fe-8c74-be229ad44d9c-sys\") pod \"perf-node-gather-daemonset-jhx6r\" (UID: \"fe30b88a-1a03-41fe-8c74-be229ad44d9c\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-jhx6r" Apr 16 17:16:09.475950 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:09.475893 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fe30b88a-1a03-41fe-8c74-be229ad44d9c-proc\") pod \"perf-node-gather-daemonset-jhx6r\" (UID: \"fe30b88a-1a03-41fe-8c74-be229ad44d9c\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-jhx6r" Apr 16 17:16:09.483410 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:09.483387 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpfrc\" (UniqueName: \"kubernetes.io/projected/fe30b88a-1a03-41fe-8c74-be229ad44d9c-kube-api-access-cpfrc\") pod \"perf-node-gather-daemonset-jhx6r\" (UID: \"fe30b88a-1a03-41fe-8c74-be229ad44d9c\") " pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-jhx6r" Apr 16 17:16:09.529091 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:09.529065 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-jhx6r" Apr 16 17:16:09.644780 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:09.644758 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-msdj2/perf-node-gather-daemonset-jhx6r"] Apr 16 17:16:09.646433 ip-10-0-143-10 kubenswrapper[2574]: W0416 17:16:09.646402 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podfe30b88a_1a03_41fe_8c74_be229ad44d9c.slice/crio-9019db0f7defcf22ab105ef67bbcc367b2108399bd3c48f1ab852411048ba1d9 WatchSource:0}: Error finding container 9019db0f7defcf22ab105ef67bbcc367b2108399bd3c48f1ab852411048ba1d9: Status 404 returned error can't find the container with id 9019db0f7defcf22ab105ef67bbcc367b2108399bd3c48f1ab852411048ba1d9 Apr 16 17:16:09.824547 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:09.824513 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-jhx6r" event={"ID":"fe30b88a-1a03-41fe-8c74-be229ad44d9c","Type":"ContainerStarted","Data":"51bb99068f95c2c5fd48793a4d18f7b4244c3aae505596fdbf185613e378f01d"} Apr 16 17:16:09.825028 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:09.824553 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-jhx6r" event={"ID":"fe30b88a-1a03-41fe-8c74-be229ad44d9c","Type":"ContainerStarted","Data":"9019db0f7defcf22ab105ef67bbcc367b2108399bd3c48f1ab852411048ba1d9"} Apr 16 17:16:09.825028 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:09.824625 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-jhx6r" Apr 16 17:16:09.841079 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:09.841038 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-jhx6r" 
podStartSLOduration=0.841022404 podStartE2EDuration="841.022404ms" podCreationTimestamp="2026-04-16 17:16:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:16:09.838712759 +0000 UTC m=+1652.121738338" watchObservedRunningTime="2026-04-16 17:16:09.841022404 +0000 UTC m=+1652.124047983" Apr 16 17:16:09.992387 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:09.992358 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-j89fm_75a19bee-ffb4-449a-b349-f0422cebce13/dns/0.log" Apr 16 17:16:10.017411 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:10.017384 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-j89fm_75a19bee-ffb4-449a-b349-f0422cebce13/kube-rbac-proxy/0.log" Apr 16 17:16:10.118555 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:10.118476 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-bmj4j_131ec79c-6a30-4f10-b4b5-c529479f0fe0/dns-node-resolver/0.log" Apr 16 17:16:10.657995 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:10.657967 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jndw7_3a0b88db-3420-4832-821b-dc3272b66858/node-ca/0.log" Apr 16 17:16:11.435979 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:11.435944 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-tvvcc_e9dbcc4d-23f9-4e47-bde8-9d690265fbfd/discovery/0.log" Apr 16 17:16:11.940488 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:11.940461 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-jrwq2_28d3ee6e-fb67-4adb-a55e-7fcafca89c2b/serve-healthcheck-canary/0.log" Apr 16 17:16:12.521864 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:12.521835 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-t9n8c_6b60b842-0962-4590-bd23-8e0739622ebd/kube-rbac-proxy/0.log" Apr 16 17:16:12.539717 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:12.539696 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-t9n8c_6b60b842-0962-4590-bd23-8e0739622ebd/exporter/0.log" Apr 16 17:16:12.558040 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:12.558014 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-t9n8c_6b60b842-0962-4590-bd23-8e0739622ebd/extractor/0.log" Apr 16 17:16:14.953534 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:14.953505 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-65f5d85b79-d446b_1c0b40c6-60f1-4cea-9c1d-ac6b2c78a3ee/manager/0.log" Apr 16 17:16:15.538930 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:15.538897 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-7b5bf88c88-bd28w_6a3f98cd-72b0-4b09-9c9a-d4c789d1fe59/manager/0.log" Apr 16 17:16:15.838198 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:15.838170 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-msdj2/perf-node-gather-daemonset-jhx6r" Apr 16 17:16:21.634521 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:21.634496 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7dxf7_c7b5620d-256f-4b38-ac42-4979da7007a4/kube-multus/0.log" Apr 16 17:16:21.689726 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:21.689704 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-72gts_fabe57c2-0e2b-42e1-9322-9ea7c5a3f719/kube-multus-additional-cni-plugins/0.log" Apr 16 17:16:21.707834 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:21.707810 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-72gts_fabe57c2-0e2b-42e1-9322-9ea7c5a3f719/egress-router-binary-copy/0.log" Apr 16 17:16:21.728114 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:21.728090 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-72gts_fabe57c2-0e2b-42e1-9322-9ea7c5a3f719/cni-plugins/0.log" Apr 16 17:16:21.746644 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:21.746620 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-72gts_fabe57c2-0e2b-42e1-9322-9ea7c5a3f719/bond-cni-plugin/0.log" Apr 16 17:16:21.764624 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:21.764604 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-72gts_fabe57c2-0e2b-42e1-9322-9ea7c5a3f719/routeoverride-cni/0.log" Apr 16 17:16:21.783973 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:21.783953 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-72gts_fabe57c2-0e2b-42e1-9322-9ea7c5a3f719/whereabouts-cni-bincopy/0.log" Apr 16 17:16:21.801534 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:21.801512 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-72gts_fabe57c2-0e2b-42e1-9322-9ea7c5a3f719/whereabouts-cni/0.log" Apr 16 17:16:22.105151 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:22.105124 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-62nv6_7c7749f8-7b64-4062-9bca-90c0826a9692/network-metrics-daemon/0.log" Apr 16 17:16:22.124627 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:22.124606 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-62nv6_7c7749f8-7b64-4062-9bca-90c0826a9692/kube-rbac-proxy/0.log" Apr 16 17:16:22.996297 ip-10-0-143-10 
kubenswrapper[2574]: I0416 17:16:22.996271 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2fwqq_a3db0e84-d73d-4e5b-a7c2-94290b442748/ovn-controller/0.log" Apr 16 17:16:23.028492 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:23.028466 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2fwqq_a3db0e84-d73d-4e5b-a7c2-94290b442748/ovn-acl-logging/0.log" Apr 16 17:16:23.035963 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:23.035942 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2fwqq_a3db0e84-d73d-4e5b-a7c2-94290b442748/ovn-acl-logging/1.log" Apr 16 17:16:23.050882 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:23.050864 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2fwqq_a3db0e84-d73d-4e5b-a7c2-94290b442748/kube-rbac-proxy-node/0.log" Apr 16 17:16:23.069421 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:23.069399 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2fwqq_a3db0e84-d73d-4e5b-a7c2-94290b442748/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 17:16:23.088331 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:23.088307 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2fwqq_a3db0e84-d73d-4e5b-a7c2-94290b442748/northd/0.log" Apr 16 17:16:23.107105 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:23.107086 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2fwqq_a3db0e84-d73d-4e5b-a7c2-94290b442748/nbdb/0.log" Apr 16 17:16:23.127204 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:23.127187 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2fwqq_a3db0e84-d73d-4e5b-a7c2-94290b442748/sbdb/0.log" Apr 16 17:16:23.229538 ip-10-0-143-10 kubenswrapper[2574]: 
I0416 17:16:23.229511 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2fwqq_a3db0e84-d73d-4e5b-a7c2-94290b442748/ovnkube-controller/0.log" Apr 16 17:16:24.902390 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:24.902364 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-cqm4l_0b5fcafd-70a7-4e76-ba7a-022cfee37811/network-check-target-container/0.log" Apr 16 17:16:25.940043 ip-10-0-143-10 kubenswrapper[2574]: I0416 17:16:25.940016 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-zjk6l_75e2551e-500a-44ad-90a9-c6ee9b976f48/iptables-alerter/0.log"