Apr 23 17:41:21.140291 ip-10-0-143-218 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 17:41:21.562646 ip-10-0-143-218 kubenswrapper[2581]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 17:41:21.562646 ip-10-0-143-218 kubenswrapper[2581]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 17:41:21.562646 ip-10-0-143-218 kubenswrapper[2581]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 17:41:21.562646 ip-10-0-143-218 kubenswrapper[2581]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 17:41:21.562646 ip-10-0-143-218 kubenswrapper[2581]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
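The deprecation warnings above all point the same way: move these flags into the file passed via --config. A minimal sketch of what that KubeletConfiguration fragment could look like — the field names are the real KubeletConfiguration fields, but every value below is an illustrative assumption, not read from this node:

```yaml
# Hypothetical /etc/kubernetes/kubelet.conf fragment replacing the deprecated
# CLI flags named in the warnings above. Values are assumptions for illustration.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"   # replaces --container-runtime-endpoint
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec" # replaces --volume-plugin-dir
systemReserved:                                               # replaces --system-reserved
  cpu: "500m"
  memory: "1Gi"
evictionHard:    # the --minimum-container-ttl-duration warning suggests eviction settings instead
  memory.available: "100Mi"
```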
Apr 23 17:41:21.564743 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.564655 2581 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 17:41:21.569351 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569324 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 17:41:21.569351 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569348 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 17:41:21.569351 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569353 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 17:41:21.569351 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569357 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 17:41:21.569625 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569361 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 17:41:21.569625 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569366 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 17:41:21.569625 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569369 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 17:41:21.569625 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569373 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 17:41:21.569625 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569376 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 17:41:21.569625 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569379 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 17:41:21.569625 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569383 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 17:41:21.569625 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569387 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 17:41:21.569625 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569390 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 17:41:21.569625 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569394 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 17:41:21.569625 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569398 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 17:41:21.569625 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569402 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 17:41:21.569625 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569406 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 17:41:21.569625 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569409 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 17:41:21.569625 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569413 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 17:41:21.569625 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569417 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 17:41:21.569625 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569421 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 17:41:21.569625 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569424 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 17:41:21.569625 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569428 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 17:41:21.569625 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569432 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 17:41:21.570450 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569436 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 17:41:21.570450 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569439 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 17:41:21.570450 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569443 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 17:41:21.570450 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569447 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 17:41:21.570450 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569451 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 17:41:21.570450 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569455 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 17:41:21.570450 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569459 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 17:41:21.570450 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569465 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 17:41:21.570450 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569471 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 17:41:21.570450 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569475 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 17:41:21.570450 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569479 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 17:41:21.570450 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569482 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 17:41:21.570450 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569486 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 17:41:21.570450 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569494 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 17:41:21.570450 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569499 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 17:41:21.570450 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569502 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 17:41:21.570450 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569506 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 17:41:21.570450 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569511 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 17:41:21.570450 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569516 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 17:41:21.571419 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569520 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 17:41:21.571419 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569524 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 17:41:21.571419 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569530 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 17:41:21.571419 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569534 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 17:41:21.571419 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569538 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 17:41:21.571419 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569542 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 17:41:21.571419 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569546 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 17:41:21.571419 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569550 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 17:41:21.571419 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569555 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 17:41:21.571419 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569559 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 17:41:21.571419 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569563 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 17:41:21.571419 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569567 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 17:41:21.571419 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569571 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 17:41:21.571419 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569575 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 17:41:21.571419 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569580 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 17:41:21.571419 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569583 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 17:41:21.571419 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569590 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 17:41:21.571419 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569598 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 17:41:21.571419 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569602 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 17:41:21.572244 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569606 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 17:41:21.572244 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569611 2581 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 17:41:21.572244 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569615 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 17:41:21.572244 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569621 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 17:41:21.572244 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569625 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 17:41:21.572244 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569629 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 17:41:21.572244 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569633 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 17:41:21.572244 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569637 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 17:41:21.572244 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569642 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 17:41:21.572244 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569646 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 17:41:21.572244 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569651 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 17:41:21.572244 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569655 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 17:41:21.572244 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569660 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 17:41:21.572244 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569664 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 17:41:21.572244 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569668 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 17:41:21.572244 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569672 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 17:41:21.572244 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569678 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 17:41:21.572244 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569683 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 17:41:21.572244 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569687 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 17:41:21.572244 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569691 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 17:41:21.572778 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569695 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 17:41:21.572778 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569699 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 17:41:21.572778 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569703 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 17:41:21.572778 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.569707 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 17:41:21.572778 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570417 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 17:41:21.572778 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570428 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 17:41:21.572778 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570433 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 17:41:21.572778 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570438 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 17:41:21.572778 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570443 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 17:41:21.572778 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570447 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 17:41:21.572778 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570451 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 17:41:21.572778 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570455 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 17:41:21.572778 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570459 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 17:41:21.572778 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570464 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 17:41:21.572778 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570468 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 17:41:21.572778 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570472 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 17:41:21.572778 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570477 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 17:41:21.572778 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570481 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 17:41:21.572778 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570486 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 17:41:21.572778 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570491 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 17:41:21.573406 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570496 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 17:41:21.573406 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570501 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 17:41:21.573406 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570505 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 17:41:21.573406 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570509 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 17:41:21.573406 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570514 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 17:41:21.573406 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570518 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 17:41:21.573406 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570522 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 17:41:21.573406 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570527 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 17:41:21.573406 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570533 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 17:41:21.573406 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570537 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 17:41:21.573406 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570542 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 17:41:21.573406 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570546 2581 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 17:41:21.573406 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570550 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 17:41:21.573406 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570554 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 17:41:21.573406 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570558 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 17:41:21.573406 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570562 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 17:41:21.573406 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570567 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 17:41:21.573406 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570571 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 17:41:21.573406 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570576 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 17:41:21.573406 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570580 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 17:41:21.574187 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570585 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 17:41:21.574187 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570589 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 17:41:21.574187 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570593 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 17:41:21.574187 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570597 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 17:41:21.574187 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570601 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 17:41:21.574187 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570605 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 17:41:21.574187 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570609 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 17:41:21.574187 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570613 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 17:41:21.574187 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570617 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 17:41:21.574187 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570621 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 17:41:21.574187 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570625 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 17:41:21.574187 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570630 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 17:41:21.574187 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570634 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 17:41:21.574187 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570640 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 17:41:21.574187 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570645 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 17:41:21.574187 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570649 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 17:41:21.574187 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570653 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 17:41:21.574187 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570657 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 17:41:21.574187 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570661 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 17:41:21.574187 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570665 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 17:41:21.574713 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570668 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 17:41:21.574713 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570673 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 17:41:21.574713 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570677 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 17:41:21.574713 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570681 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 17:41:21.574713 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570684 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 17:41:21.574713 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570688 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 17:41:21.574713 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570692 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 17:41:21.574713 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570695 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 17:41:21.574713 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570699 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 17:41:21.574713 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570703 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 17:41:21.574713 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570706 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 17:41:21.574713 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570710 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 17:41:21.574713 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570717 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 17:41:21.574713 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570722 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 17:41:21.574713 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570726 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 17:41:21.574713 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570731 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 17:41:21.574713 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570736 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 17:41:21.574713 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570740 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 17:41:21.574713 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570745 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 17:41:21.575227 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570749 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 17:41:21.575227 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570753 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 17:41:21.575227 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570756 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 17:41:21.575227 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570760 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 17:41:21.575227 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570765 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 17:41:21.575227 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570773 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 17:41:21.575227 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570779 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 17:41:21.575227 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570784 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 17:41:21.575227 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570788 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 17:41:21.575227 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570792 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 17:41:21.575227 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.570796 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 17:41:21.575227 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.570912 2581 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 17:41:21.575227 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.570923 2581 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 17:41:21.575227 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.570932 2581 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 17:41:21.575227 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.570939 2581 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 17:41:21.575227 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.570947 2581 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 17:41:21.575227 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.570952 2581 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 17:41:21.575227 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.570959 2581 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 17:41:21.575227 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.570966 2581 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 17:41:21.575227 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.570971 2581 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 17:41:21.575227 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.570976 2581 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 17:41:21.575806 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571005 2581 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 17:41:21.575806 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571011 2581 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 17:41:21.575806 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571016 2581 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 17:41:21.575806 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571020 2581 flags.go:64] FLAG: --cgroup-root=""
Apr 23 17:41:21.575806 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571025 2581 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 17:41:21.575806 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571030 2581 flags.go:64] FLAG: --client-ca-file=""
Apr 23 17:41:21.575806 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571035 2581 flags.go:64] FLAG: --cloud-config=""
Apr 23 17:41:21.575806 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571040 2581 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 17:41:21.575806 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571045 2581 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 17:41:21.575806 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571053 2581 flags.go:64] FLAG: --cluster-domain=""
Apr 23 17:41:21.575806 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571057 2581 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 17:41:21.575806 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571063 2581 flags.go:64] FLAG: --config-dir=""
Apr 23 17:41:21.575806 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571068 2581 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 17:41:21.575806 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571073 2581 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 17:41:21.575806 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571080 2581 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 17:41:21.575806 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571085 2581 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 17:41:21.575806 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571090 2581 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 17:41:21.575806 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571096 2581 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 17:41:21.575806 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571101 2581 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 17:41:21.575806 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571105 2581 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 17:41:21.575806 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571110 2581 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 17:41:21.575806 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571115 2581 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 17:41:21.575806 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571120 2581 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 17:41:21.575806 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571126 2581 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 17:41:21.575806 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571132 2581 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 17:41:21.576459 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571137 2581 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 17:41:21.576459 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571142 2581 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 17:41:21.576459 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571148 2581 flags.go:64] FLAG: --enable-server="true"
Apr 23 17:41:21.576459 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571153 2581 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 17:41:21.576459 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571160 2581 flags.go:64] FLAG: --event-burst="100"
Apr 23 17:41:21.576459 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571165 2581 flags.go:64] FLAG: --event-qps="50"
Apr 23 17:41:21.576459 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571170 2581 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 17:41:21.576459 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571175 2581 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 17:41:21.576459 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571180 2581 flags.go:64] FLAG: --eviction-hard=""
Apr 23 17:41:21.576459 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571186 2581 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 17:41:21.576459 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571192 2581 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 17:41:21.576459 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571196 2581 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 17:41:21.576459 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571201 2581 flags.go:64] FLAG: --eviction-soft=""
Apr 23 17:41:21.576459 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571206 2581 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 17:41:21.576459 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571211 2581 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 17:41:21.576459 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571215 2581 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 17:41:21.576459 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571220 2581 flags.go:64] FLAG:
--experimental-mounter-path="" Apr 23 17:41:21.576459 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571225 2581 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 23 17:41:21.576459 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571229 2581 flags.go:64] FLAG: --fail-swap-on="true" Apr 23 17:41:21.576459 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571234 2581 flags.go:64] FLAG: --feature-gates="" Apr 23 17:41:21.576459 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571239 2581 flags.go:64] FLAG: --file-check-frequency="20s" Apr 23 17:41:21.576459 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571244 2581 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 23 17:41:21.576459 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571258 2581 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 23 17:41:21.576459 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571264 2581 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 23 17:41:21.576459 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571269 2581 flags.go:64] FLAG: --healthz-port="10248" Apr 23 17:41:21.576459 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571274 2581 flags.go:64] FLAG: --help="false" Apr 23 17:41:21.577127 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571279 2581 flags.go:64] FLAG: --hostname-override="ip-10-0-143-218.ec2.internal" Apr 23 17:41:21.577127 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571284 2581 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 23 17:41:21.577127 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571290 2581 flags.go:64] FLAG: --http-check-frequency="20s" Apr 23 17:41:21.577127 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571294 2581 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 17:41:21.577127 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571300 2581 flags.go:64] FLAG: 
--image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 17:41:21.577127 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571306 2581 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 17:41:21.577127 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571311 2581 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 17:41:21.577127 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571316 2581 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 17:41:21.577127 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571320 2581 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 17:41:21.577127 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571326 2581 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 17:41:21.577127 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571331 2581 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 17:41:21.577127 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571336 2581 flags.go:64] FLAG: --kube-api-qps="50" Apr 23 17:41:21.577127 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571340 2581 flags.go:64] FLAG: --kube-reserved="" Apr 23 17:41:21.577127 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571345 2581 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 17:41:21.577127 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571350 2581 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 17:41:21.577127 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571355 2581 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 17:41:21.577127 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571360 2581 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 17:41:21.577127 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571365 2581 flags.go:64] FLAG: --lock-file="" Apr 23 17:41:21.577127 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571370 2581 flags.go:64] FLAG: 
--log-cadvisor-usage="false" Apr 23 17:41:21.577127 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571375 2581 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 17:41:21.577127 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571379 2581 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 17:41:21.577127 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571389 2581 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 17:41:21.577127 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571394 2581 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 23 17:41:21.577667 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571398 2581 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 17:41:21.577667 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571403 2581 flags.go:64] FLAG: --logging-format="text" Apr 23 17:41:21.577667 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571407 2581 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 17:41:21.577667 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571413 2581 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 17:41:21.577667 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571418 2581 flags.go:64] FLAG: --manifest-url="" Apr 23 17:41:21.577667 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571425 2581 flags.go:64] FLAG: --manifest-url-header="" Apr 23 17:41:21.577667 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571432 2581 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 17:41:21.577667 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571437 2581 flags.go:64] FLAG: --max-open-files="1000000" Apr 23 17:41:21.577667 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571443 2581 flags.go:64] FLAG: --max-pods="110" Apr 23 17:41:21.577667 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571448 2581 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 17:41:21.577667 ip-10-0-143-218 kubenswrapper[2581]: 
I0423 17:41:21.571453 2581 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 17:41:21.577667 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571458 2581 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 17:41:21.577667 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571462 2581 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 17:41:21.577667 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571467 2581 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 17:41:21.577667 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571475 2581 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 17:41:21.577667 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571480 2581 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 17:41:21.577667 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571492 2581 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 17:41:21.577667 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571497 2581 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 23 17:41:21.577667 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571501 2581 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 17:41:21.577667 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571507 2581 flags.go:64] FLAG: --pod-cidr="" Apr 23 17:41:21.577667 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571512 2581 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 17:41:21.577667 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571520 2581 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 17:41:21.577667 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571525 2581 flags.go:64] FLAG: --pod-max-pids="-1" Apr 23 17:41:21.577667 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571530 2581 flags.go:64] FLAG: --pods-per-core="0" Apr 23 
17:41:21.578275 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571534 2581 flags.go:64] FLAG: --port="10250" Apr 23 17:41:21.578275 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571539 2581 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 17:41:21.578275 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571544 2581 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-065b97e80c138ff18" Apr 23 17:41:21.578275 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571549 2581 flags.go:64] FLAG: --qos-reserved="" Apr 23 17:41:21.578275 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571554 2581 flags.go:64] FLAG: --read-only-port="10255" Apr 23 17:41:21.578275 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571559 2581 flags.go:64] FLAG: --register-node="true" Apr 23 17:41:21.578275 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571563 2581 flags.go:64] FLAG: --register-schedulable="true" Apr 23 17:41:21.578275 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571568 2581 flags.go:64] FLAG: --register-with-taints="" Apr 23 17:41:21.578275 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571574 2581 flags.go:64] FLAG: --registry-burst="10" Apr 23 17:41:21.578275 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571579 2581 flags.go:64] FLAG: --registry-qps="5" Apr 23 17:41:21.578275 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571584 2581 flags.go:64] FLAG: --reserved-cpus="" Apr 23 17:41:21.578275 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571589 2581 flags.go:64] FLAG: --reserved-memory="" Apr 23 17:41:21.578275 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571594 2581 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 23 17:41:21.578275 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571599 2581 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 17:41:21.578275 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571605 2581 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 
17:41:21.578275 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571610 2581 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 17:41:21.578275 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571615 2581 flags.go:64] FLAG: --runonce="false" Apr 23 17:41:21.578275 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571620 2581 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 17:41:21.578275 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571625 2581 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 23 17:41:21.578275 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571630 2581 flags.go:64] FLAG: --seccomp-default="false" Apr 23 17:41:21.578275 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571634 2581 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 17:41:21.578275 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571639 2581 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 17:41:21.578275 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571645 2581 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 23 17:41:21.578275 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571650 2581 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 23 17:41:21.578275 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571655 2581 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 17:41:21.578275 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571660 2581 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 17:41:21.579225 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571665 2581 flags.go:64] FLAG: --storage-driver-table="stats" Apr 23 17:41:21.579225 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571670 2581 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 17:41:21.579225 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571676 2581 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 17:41:21.579225 ip-10-0-143-218 
kubenswrapper[2581]: I0423 17:41:21.571682 2581 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 17:41:21.579225 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571686 2581 flags.go:64] FLAG: --system-cgroups="" Apr 23 17:41:21.579225 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571690 2581 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 17:41:21.579225 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571700 2581 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 17:41:21.579225 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571704 2581 flags.go:64] FLAG: --tls-cert-file="" Apr 23 17:41:21.579225 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571709 2581 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 23 17:41:21.579225 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571715 2581 flags.go:64] FLAG: --tls-min-version="" Apr 23 17:41:21.579225 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571720 2581 flags.go:64] FLAG: --tls-private-key-file="" Apr 23 17:41:21.579225 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571725 2581 flags.go:64] FLAG: --topology-manager-policy="none" Apr 23 17:41:21.579225 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571729 2581 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 23 17:41:21.579225 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571734 2581 flags.go:64] FLAG: --topology-manager-scope="container" Apr 23 17:41:21.579225 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571739 2581 flags.go:64] FLAG: --v="2" Apr 23 17:41:21.579225 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571746 2581 flags.go:64] FLAG: --version="false" Apr 23 17:41:21.579225 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571752 2581 flags.go:64] FLAG: --vmodule="" Apr 23 17:41:21.579225 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571759 2581 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 
17:41:21.579225 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.571764 2581 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 17:41:21.579225 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.571914 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 17:41:21.579225 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.571923 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 17:41:21.579225 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.571927 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 17:41:21.579225 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.571932 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 17:41:21.579225 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.571937 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 17:41:21.580243 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.571942 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 17:41:21.580243 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.571946 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 17:41:21.580243 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.571951 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 17:41:21.580243 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.571956 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 17:41:21.580243 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.571961 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 17:41:21.580243 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.571968 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 23 17:41:21.580243 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.571973 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 17:41:21.580243 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.571978 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 17:41:21.580243 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572006 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 17:41:21.580243 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572011 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 17:41:21.580243 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572016 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 17:41:21.580243 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572027 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 17:41:21.580243 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572032 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 17:41:21.580243 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572036 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 17:41:21.580243 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572041 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 17:41:21.580243 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572045 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 17:41:21.580243 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572049 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 17:41:21.580243 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572053 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 17:41:21.580243 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572057 2581 
feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 17:41:21.581162 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572062 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 17:41:21.581162 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572067 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 17:41:21.581162 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572071 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 17:41:21.581162 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572075 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 17:41:21.581162 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572079 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 17:41:21.581162 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572083 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 17:41:21.581162 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572087 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 17:41:21.581162 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572091 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 17:41:21.581162 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572095 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 17:41:21.581162 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572103 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 17:41:21.581162 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572107 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 17:41:21.581162 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572111 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 17:41:21.581162 
ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572115 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 17:41:21.581162 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572120 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 17:41:21.581162 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572124 2581 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 17:41:21.581162 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572128 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 17:41:21.581162 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572132 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 17:41:21.581162 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572138 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 17:41:21.581162 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572142 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 17:41:21.581162 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572146 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 17:41:21.581660 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572151 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 17:41:21.581660 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572157 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 23 17:41:21.581660 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572163 2581 feature_gate.go:328] unrecognized feature gate: Example Apr 23 17:41:21.581660 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572168 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 17:41:21.581660 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572172 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 17:41:21.581660 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572177 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 17:41:21.581660 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572182 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 17:41:21.581660 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572186 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 17:41:21.581660 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572190 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 17:41:21.581660 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572194 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 17:41:21.581660 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572198 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 17:41:21.581660 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572202 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 17:41:21.581660 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572206 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 17:41:21.581660 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572210 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 17:41:21.581660 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572214 2581 
feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 17:41:21.581660 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572218 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 17:41:21.581660 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572222 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 17:41:21.581660 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572226 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 17:41:21.581660 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572230 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 17:41:21.582399 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572234 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 17:41:21.582399 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572238 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 17:41:21.582399 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572246 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 17:41:21.582399 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572249 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 17:41:21.582399 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572253 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 17:41:21.582399 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572257 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 17:41:21.582399 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572261 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 17:41:21.582399 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572265 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 17:41:21.582399 
ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572270 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 17:41:21.582399 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572274 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 17:41:21.582399 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572281 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 17:41:21.582399 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572285 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 17:41:21.582399 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572290 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 17:41:21.582399 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572294 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 17:41:21.582399 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572298 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 17:41:21.582399 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572302 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 17:41:21.582399 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572306 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 17:41:21.582399 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572310 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 17:41:21.582399 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572314 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 17:41:21.582399 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572320 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 17:41:21.583263 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572324 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 17:41:21.583263 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572329 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 17:41:21.583263 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.572333 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 17:41:21.583263 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.572936 2581 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 17:41:21.583263 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.581796 2581 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 23 17:41:21.583263 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.582027 2581 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 23 17:41:21.583263 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582115 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 17:41:21.583263 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582123 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 17:41:21.583263 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582128 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 17:41:21.583263 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582134 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 17:41:21.583263 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582139 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 17:41:21.583263 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582143 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 17:41:21.583263 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582148 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 17:41:21.583263 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582152 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 17:41:21.583263 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582157 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 17:41:21.583662 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582161 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 17:41:21.583662 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582165 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 17:41:21.583662 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582169 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 17:41:21.583662 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582174 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 17:41:21.583662 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582178 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 17:41:21.583662 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582182 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 17:41:21.583662 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582186 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 17:41:21.583662 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582190 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 17:41:21.583662 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582195 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 17:41:21.583662 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582199 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 17:41:21.583662 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582203 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 17:41:21.583662 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582207 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 17:41:21.583662 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582211 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 17:41:21.583662 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582216 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 17:41:21.583662 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582220 2581 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 17:41:21.583662 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582224 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 17:41:21.583662 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582228 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 17:41:21.583662 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582232 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 17:41:21.583662 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582236 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 17:41:21.583662 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582241 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 17:41:21.584267 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582245 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 17:41:21.584267 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582249 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 17:41:21.584267 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582255 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 17:41:21.584267 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582259 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 17:41:21.584267 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582264 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 17:41:21.584267 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582268 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 17:41:21.584267 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582272 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 17:41:21.584267 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582278 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 17:41:21.584267 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582284 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 17:41:21.584267 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582289 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 17:41:21.584267 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582293 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 17:41:21.584267 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582297 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 17:41:21.584267 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582302 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 17:41:21.584267 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582306 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 17:41:21.584267 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582310 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 17:41:21.584267 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582314 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 17:41:21.584267 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582318 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 17:41:21.584267 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582322 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 17:41:21.584267 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582326 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 17:41:21.584949 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582331 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 17:41:21.584949 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582335 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 17:41:21.584949 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582339 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 17:41:21.584949 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582343 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 17:41:21.584949 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582347 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 17:41:21.584949 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582351 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 17:41:21.584949 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582356 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 17:41:21.584949 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582361 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 17:41:21.584949 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582365 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 17:41:21.584949 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582369 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 17:41:21.584949 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582374 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 17:41:21.584949 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582378 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 17:41:21.584949 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582382 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 17:41:21.584949 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582386 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 17:41:21.584949 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582393 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 17:41:21.584949 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582398 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 17:41:21.584949 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582403 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 17:41:21.584949 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582408 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 17:41:21.584949 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582413 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 17:41:21.585483 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582417 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 17:41:21.585483 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582421 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 17:41:21.585483 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582426 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 17:41:21.585483 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582430 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 17:41:21.585483 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582434 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 17:41:21.585483 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582438 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 17:41:21.585483 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582443 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 17:41:21.585483 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582447 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 17:41:21.585483 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582452 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 17:41:21.585483 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582456 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 17:41:21.585483 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582460 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 17:41:21.585483 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582464 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 17:41:21.585483 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582468 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 17:41:21.585483 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582472 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 17:41:21.585483 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582476 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 17:41:21.585483 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582480 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 17:41:21.585483 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582485 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 17:41:21.585483 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582489 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 17:41:21.585483 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582493 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 17:41:21.586033 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.582502 2581 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 17:41:21.586033 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582673 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 17:41:21.586033 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582683 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 17:41:21.586033 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582688 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 17:41:21.586033 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582693 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 17:41:21.586033 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582698 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 17:41:21.586033 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582702 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 17:41:21.586033 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582709 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 17:41:21.586033 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582716 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 17:41:21.586033 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582721 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 17:41:21.586033 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582726 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 17:41:21.586033 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582732 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 17:41:21.586033 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582736 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 17:41:21.586033 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582741 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 17:41:21.586033 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582745 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 17:41:21.586648 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582749 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 17:41:21.586648 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582753 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 17:41:21.586648 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582757 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 17:41:21.586648 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582761 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 17:41:21.586648 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582765 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 17:41:21.586648 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582769 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 17:41:21.586648 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582774 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 17:41:21.586648 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582778 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 17:41:21.586648 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582782 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 17:41:21.586648 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582786 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 17:41:21.586648 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582790 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 17:41:21.586648 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582794 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 17:41:21.586648 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582798 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 17:41:21.586648 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582803 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 17:41:21.586648 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582807 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 17:41:21.586648 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582812 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 17:41:21.586648 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582815 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 17:41:21.586648 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582819 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 17:41:21.586648 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582824 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 17:41:21.586648 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582829 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 17:41:21.587401 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582833 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 17:41:21.587401 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582837 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 17:41:21.587401 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582841 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 17:41:21.587401 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582845 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 17:41:21.587401 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582849 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 17:41:21.587401 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582853 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 17:41:21.587401 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582858 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 17:41:21.587401 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582862 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 17:41:21.587401 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582867 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 17:41:21.587401 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582872 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 17:41:21.587401 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582878 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 17:41:21.587401 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582882 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 17:41:21.587401 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582886 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 17:41:21.587401 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582891 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 17:41:21.587401 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582895 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 17:41:21.587401 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582900 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 17:41:21.587401 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582904 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 17:41:21.587401 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582908 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 17:41:21.587401 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582913 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 17:41:21.588050 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582917 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 17:41:21.588050 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582921 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 17:41:21.588050 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582925 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 17:41:21.588050 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582929 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 17:41:21.588050 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582934 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 17:41:21.588050 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582938 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 17:41:21.588050 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582942 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 17:41:21.588050 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582946 2581 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 17:41:21.588050 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582950 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 17:41:21.588050 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582954 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 17:41:21.588050 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582959 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 17:41:21.588050 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582963 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 17:41:21.588050 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582968 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 17:41:21.588050 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582972 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 17:41:21.588050 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582976 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 17:41:21.588050 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.582999 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 17:41:21.588050 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.583003 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 17:41:21.588050 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.583007 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 17:41:21.588050 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.583011 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 17:41:21.588050 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.583015 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 17:41:21.588544 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.583019 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 17:41:21.588544 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.583023 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 17:41:21.588544 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.583028 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 17:41:21.588544 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.583033 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 17:41:21.588544 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.583037 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 17:41:21.588544 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.583041 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 17:41:21.588544 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.583045 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 17:41:21.588544 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.583049 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 17:41:21.588544 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.583053 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 17:41:21.588544 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.583058 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 17:41:21.588544 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.583062 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 17:41:21.588544 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.583066 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 17:41:21.588544 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:21.583070 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 17:41:21.588544 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.583078 2581 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 17:41:21.588544 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.583900 2581 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 23 17:41:21.589009 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.587679 2581 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 23 17:41:21.589009 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.588607 2581 server.go:1019] "Starting client certificate rotation"
Apr 23 17:41:21.589009 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.588709 2581 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 17:41:21.589009 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.588752 2581 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 17:41:21.612948 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.612918 2581 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 17:41:21.614738 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.614709 2581 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 17:41:21.632390 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.632365 2581 log.go:25] "Validated CRI v1 runtime API"
Apr 23 17:41:21.637428 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.637411 2581 log.go:25] "Validated CRI v1 image API"
Apr 23 17:41:21.638714 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.638698 2581 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 23 17:41:21.641488 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.641472 2581 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 17:41:21.646940 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.646920 2581 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 830606a0-4647-42de-86fd-99f1ac130682:/dev/nvme0n1p4 ee1520eb-235a-4faf-a44f-e8fc0da6e789:/dev/nvme0n1p3]
Apr 23 17:41:21.647017 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.646940 2581 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 23 17:41:21.652645 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.652528 2581 manager.go:217] Machine: {Timestamp:2026-04-23 17:41:21.650724142 +0000 UTC m=+0.400165934 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099380 MemoryCapacity:32812167168 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2d0591b8bde213fc743a9edd23786f SystemUUID:ec2d0591-b8bd-e213-fc74-3a9edd23786f BootID:dfb17b04-927e-44f9-be58-bb7c4a8e499c Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:2b:0f:fe:53:45 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:2b:0f:fe:53:45 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:fa:94:ea:66:cf:19 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812167168 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 23 17:41:21.652645 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.652640 2581 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 23 17:41:21.652769 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.652758 2581 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 23 17:41:21.653677 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.653652 2581 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 23 17:41:21.653818 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.653680 2581 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-143-218.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 23 17:41:21.653863 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.653828 2581 topology_manager.go:138] "Creating topology manager with none policy"
Apr 23 17:41:21.653863 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.653836 2581 container_manager_linux.go:306] "Creating device plugin manager"
Apr 23 17:41:21.653863 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.653854
2581 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 17:41:21.654653 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.654642 2581 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 17:41:21.655910 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.655900 2581 state_mem.go:36] "Initialized new in-memory state store" Apr 23 17:41:21.656031 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.656022 2581 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 23 17:41:21.658298 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.658288 2581 kubelet.go:491] "Attempting to sync node with API server" Apr 23 17:41:21.658339 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.658303 2581 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 23 17:41:21.658339 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.658314 2581 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 23 17:41:21.658339 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.658323 2581 kubelet.go:397] "Adding apiserver pod source" Apr 23 17:41:21.658339 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.658332 2581 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 23 17:41:21.659366 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.659353 2581 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 17:41:21.659429 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.659380 2581 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 17:41:21.662204 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.662189 2581 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 23 17:41:21.664191 ip-10-0-143-218 
kubenswrapper[2581]: I0423 17:41:21.664177 2581 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 23 17:41:21.665344 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.665331 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 23 17:41:21.665388 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.665353 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 23 17:41:21.665388 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.665360 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 23 17:41:21.665388 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.665365 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 23 17:41:21.665388 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.665370 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 23 17:41:21.665388 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.665376 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 23 17:41:21.665388 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.665382 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 23 17:41:21.665388 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.665386 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 23 17:41:21.665571 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.665394 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 23 17:41:21.665571 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.665401 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 23 17:41:21.665571 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.665415 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 23 
17:41:21.665571 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.665423 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 23 17:41:21.666282 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.666272 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 23 17:41:21.666321 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.666285 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 23 17:41:21.670357 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.670344 2581 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 23 17:41:21.670399 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.670389 2581 server.go:1295] "Started kubelet" Apr 23 17:41:21.670494 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.670468 2581 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 23 17:41:21.670597 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.670545 2581 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 23 17:41:21.670653 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.670627 2581 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 23 17:41:21.672212 ip-10-0-143-218 systemd[1]: Started Kubernetes Kubelet. 
Apr 23 17:41:21.673387 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.672935 2581 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-143-218.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:41:21.673387 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:21.673031 2581 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 23 17:41:21.673387 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.673072 2581 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 23 17:41:21.675102 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:21.675062 2581 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-143-218.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 23 17:41:21.675902 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.675436 2581 server.go:317] "Adding debug handlers to kubelet server"
Apr 23 17:41:21.681199 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.681182 2581 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 23 17:41:21.681199 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.681191 2581 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 23 17:41:21.682077 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:21.682062 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-218.ec2.internal\" not found"
Apr 23 17:41:21.682516 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:21.681763 2581 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-218.ec2.internal.18a90d3c8be1272a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-218.ec2.internal,UID:ip-10-0-143-218.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-143-218.ec2.internal,},FirstTimestamp:2026-04-23 17:41:21.670358826 +0000 UTC m=+0.419800815,LastTimestamp:2026-04-23 17:41:21.670358826 +0000 UTC m=+0.419800815,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-218.ec2.internal,}"
Apr 23 17:41:21.682666 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.682636 2581 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 23 17:41:21.682666 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.682667 2581 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 23 17:41:21.682812 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.682756 2581 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 23 17:41:21.682812 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.682809 2581 factory.go:55] Registering systemd factory
Apr 23 17:41:21.682902 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.682820 2581 reconstruct.go:97] "Volume reconstruction finished"
Apr 23 17:41:21.682902 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.682829 2581 reconciler.go:26] "Reconciler: start to sync state"
Apr 23 17:41:21.682902 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.682822 2581 factory.go:223] Registration of the systemd container factory successfully
Apr 23 17:41:21.683074 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.683060 2581 factory.go:153] Registering CRI-O factory
Apr 23 17:41:21.683074 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.683074 2581 factory.go:223] Registration of the crio container factory successfully
Apr 23 17:41:21.683163 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.683127 2581 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 23 17:41:21.683163 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.683152 2581 factory.go:103] Registering Raw factory
Apr 23 17:41:21.683253 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.683168 2581 manager.go:1196] Started watching for new ooms in manager
Apr 23 17:41:21.683449 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:21.683425 2581 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 23 17:41:21.683587 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.683575 2581 manager.go:319] Starting recovery of all containers
Apr 23 17:41:21.694764 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:21.694732 2581 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 23 17:41:21.694764 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:21.694751 2581 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-143-218.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 23 17:41:21.694940 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.694872 2581 manager.go:324] Recovery completed
Apr 23 17:41:21.695032 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.694972 2581 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-mrnph"
Apr 23 17:41:21.699465 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.699451 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:41:21.703059 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.703043 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-218.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:41:21.703116 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.703075 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-218.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:41:21.703116 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.703085 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-218.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:41:21.703566 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.703551 2581 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 23 17:41:21.703566 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.703566 2581 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 23 17:41:21.703667 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.703586 2581 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 17:41:21.703667 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.703584 2581 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-mrnph"
Apr 23 17:41:21.706828 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.706815 2581 policy_none.go:49] "None policy: Start"
Apr 23 17:41:21.706884 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.706831 2581 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 23 17:41:21.706884 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.706841 2581 state_mem.go:35] "Initializing new in-memory state store"
Apr 23 17:41:21.748624 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.748604 2581 manager.go:341] "Starting Device Plugin manager"
Apr 23 17:41:21.755033 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:21.748645 2581 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 23 17:41:21.755033 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.748659 2581 server.go:85] "Starting device plugin registration server"
Apr 23 17:41:21.755033 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.749031 2581 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 23 17:41:21.755033 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.749043 2581 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 23 17:41:21.755033 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.749124 2581 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 23 17:41:21.755033 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.749237 2581 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 23 17:41:21.755033 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.749246 2581 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 23 17:41:21.755033 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:21.749759 2581 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 23 17:41:21.755033 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:21.749791 2581 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-143-218.ec2.internal\" not found"
Apr 23 17:41:21.784386 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.784354 2581 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 23 17:41:21.785679 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.785665 2581 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 23 17:41:21.785749 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.785690 2581 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 23 17:41:21.785749 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.785710 2581 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 23 17:41:21.785749 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.785716 2581 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 23 17:41:21.785749 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:21.785747 2581 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 23 17:41:21.788760 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.788743 2581 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 17:41:21.849665 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.849574 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:41:21.850606 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.850589 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-218.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:41:21.850671 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.850627 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-218.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:41:21.850671 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.850639 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-218.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:41:21.850671 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.850664 2581 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-143-218.ec2.internal"
Apr 23 17:41:21.861196 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.861180 2581 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-143-218.ec2.internal"
Apr 23 17:41:21.861259 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:21.861201 2581 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-143-218.ec2.internal\": node \"ip-10-0-143-218.ec2.internal\" not found"
Apr 23 17:41:21.884177 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:21.884157 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-218.ec2.internal\" not found"
Apr 23 17:41:21.886333 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.886320 2581 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-218.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-143-218.ec2.internal"]
Apr 23 17:41:21.886379 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.886372 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:41:21.887705 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.887691 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-218.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:41:21.887766 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.887718 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-218.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:41:21.887766 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.887734 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-218.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:41:21.888959 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.888946 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:41:21.889124 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.889112 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-218.ec2.internal"
Apr 23 17:41:21.889161 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.889140 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:41:21.889626 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.889612 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-218.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:41:21.889681 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.889633 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-218.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:41:21.889681 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.889645 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-218.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:41:21.889755 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.889633 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-218.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:41:21.889793 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.889753 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-218.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:41:21.889793 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.889768 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-218.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:41:21.890757 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.890744 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-218.ec2.internal"
Apr 23 17:41:21.890800 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.890768 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:41:21.891422 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.891409 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-218.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:41:21.891486 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.891434 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-218.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:41:21.891486 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.891443 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-218.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:41:21.912619 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:21.912595 2581 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-218.ec2.internal\" not found" node="ip-10-0-143-218.ec2.internal"
Apr 23 17:41:21.916768 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:21.916751 2581 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-218.ec2.internal\" not found" node="ip-10-0-143-218.ec2.internal"
Apr 23 17:41:21.983964 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.983939 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4ad0c906f995d60a9073fbd94787bfd9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-218.ec2.internal\" (UID: \"4ad0c906f995d60a9073fbd94787bfd9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-218.ec2.internal"
Apr 23 17:41:21.984065 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.983967 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0686775b0acf60aa77b819482efe0489-config\") pod \"kube-apiserver-proxy-ip-10-0-143-218.ec2.internal\" (UID: \"0686775b0acf60aa77b819482efe0489\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-218.ec2.internal"
Apr 23 17:41:21.984065 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:21.984000 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4ad0c906f995d60a9073fbd94787bfd9-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-218.ec2.internal\" (UID: \"4ad0c906f995d60a9073fbd94787bfd9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-218.ec2.internal"
Apr 23 17:41:21.984967 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:21.984954 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-218.ec2.internal\" not found"
Apr 23 17:41:22.084400 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:22.084371 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4ad0c906f995d60a9073fbd94787bfd9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-218.ec2.internal\" (UID: \"4ad0c906f995d60a9073fbd94787bfd9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-218.ec2.internal"
Apr 23 17:41:22.084476 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:22.084405 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0686775b0acf60aa77b819482efe0489-config\") pod \"kube-apiserver-proxy-ip-10-0-143-218.ec2.internal\" (UID: \"0686775b0acf60aa77b819482efe0489\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-218.ec2.internal"
Apr 23 17:41:22.084476 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:22.084422 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4ad0c906f995d60a9073fbd94787bfd9-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-218.ec2.internal\" (UID: \"4ad0c906f995d60a9073fbd94787bfd9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-218.ec2.internal"
Apr 23 17:41:22.084536 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:22.084477 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0686775b0acf60aa77b819482efe0489-config\") pod \"kube-apiserver-proxy-ip-10-0-143-218.ec2.internal\" (UID: \"0686775b0acf60aa77b819482efe0489\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-218.ec2.internal"
Apr 23 17:41:22.084536 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:22.084495 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4ad0c906f995d60a9073fbd94787bfd9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-218.ec2.internal\" (UID: \"4ad0c906f995d60a9073fbd94787bfd9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-218.ec2.internal"
Apr 23 17:41:22.084536 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:22.084527 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4ad0c906f995d60a9073fbd94787bfd9-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-218.ec2.internal\" (UID: \"4ad0c906f995d60a9073fbd94787bfd9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-218.ec2.internal"
Apr 23 17:41:22.085406 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:22.085391 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-218.ec2.internal\" not found"
Apr 23 17:41:22.186215 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:22.186150 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-218.ec2.internal\" not found"
Apr 23 17:41:22.214337 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:22.214318 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-218.ec2.internal"
Apr 23 17:41:22.219070 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:22.219054 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-218.ec2.internal"
Apr 23 17:41:22.287072 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:22.287035 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-218.ec2.internal\" not found"
Apr 23 17:41:22.387561 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:22.387526 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-218.ec2.internal\" not found"
Apr 23 17:41:22.488099 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:22.488018 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-218.ec2.internal\" not found"
Apr 23 17:41:22.588662 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:22.588623 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-218.ec2.internal\" not found"
Apr 23 17:41:22.588662 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:22.588626 2581 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 23 17:41:22.589138 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:22.588766 2581 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 17:41:22.681726 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:22.681693 2581 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 23 17:41:22.689190 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:22.689169 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-218.ec2.internal\" not found"
Apr 23 17:41:22.699855 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:22.699836 2581 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 17:41:22.706097 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:22.706073 2581 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 17:36:21 +0000 UTC" deadline="2027-11-27 18:09:56.697743226 +0000 UTC"
Apr 23 17:41:22.706097 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:22.706095 2581 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13992h28m33.991651056s"
Apr 23 17:41:22.723979 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:22.723950 2581 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-dkkdw"
Apr 23 17:41:22.732467 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:22.732441 2581 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-dkkdw"
Apr 23 17:41:22.763122 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:22.763100 2581 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 17:41:22.789806 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:22.789784 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-218.ec2.internal\" not found"
Apr 23 17:41:22.852122 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:22.852083 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0686775b0acf60aa77b819482efe0489.slice/crio-5e8255a45d0f131ba72f8fa9326960d6bbd8793354e020b078e049a74f812040 WatchSource:0}: Error finding container 5e8255a45d0f131ba72f8fa9326960d6bbd8793354e020b078e049a74f812040: Status 404 returned error can't find the container with id 5e8255a45d0f131ba72f8fa9326960d6bbd8793354e020b078e049a74f812040
Apr 23 17:41:22.852558 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:22.852534 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ad0c906f995d60a9073fbd94787bfd9.slice/crio-0f1913c14d17166528d31cef52d9bc1da6ddc0e2d2c70602d16719ca1a21a621 WatchSource:0}: Error finding container 0f1913c14d17166528d31cef52d9bc1da6ddc0e2d2c70602d16719ca1a21a621: Status 404 returned error can't find the container with id 0f1913c14d17166528d31cef52d9bc1da6ddc0e2d2c70602d16719ca1a21a621
Apr 23 17:41:22.857179 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:22.857163 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 17:41:22.890151 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:22.890125 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-218.ec2.internal\" not found"
Apr 23 17:41:22.988508 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:22.988484 2581 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 17:41:22.990915
ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:22.990897 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-218.ec2.internal\" not found" Apr 23 17:41:23.091460 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:23.091404 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-218.ec2.internal\" not found" Apr 23 17:41:23.192187 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:23.192155 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-218.ec2.internal\" not found" Apr 23 17:41:23.276296 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.276270 2581 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 17:41:23.281907 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.281881 2581 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-218.ec2.internal" Apr 23 17:41:23.296996 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.296959 2581 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 17:41:23.297837 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.297815 2581 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-218.ec2.internal" Apr 23 17:41:23.330123 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.330091 2581 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 17:41:23.659525 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.659493 2581 apiserver.go:52] "Watching apiserver" Apr 23 17:41:23.669212 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.669181 2581 
reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 23 17:41:23.671259 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.671226 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-6lc2b","openshift-multus/multus-9c58k","openshift-multus/multus-additional-cni-plugins-xcnjl","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-218.ec2.internal","openshift-multus/network-metrics-daemon-dtrg9","openshift-network-diagnostics/network-check-target-9p89h","openshift-network-operator/iptables-alerter-wnkrl","openshift-ovn-kubernetes/ovnkube-node-b27s8","kube-system/konnectivity-agent-w6l25","kube-system/kube-apiserver-proxy-ip-10-0-143-218.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tr4s7","openshift-cluster-node-tuning-operator/tuned-ltb6d","openshift-dns/node-resolver-mtx92"] Apr 23 17:41:23.674557 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.674534 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-wnkrl" Apr 23 17:41:23.676137 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.675701 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-w6l25" Apr 23 17:41:23.677703 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.677325 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-mtx92" Apr 23 17:41:23.678191 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.677965 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 23 17:41:23.678191 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.677997 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 23 17:41:23.678191 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.678112 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-92cd5\"" Apr 23 17:41:23.678821 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.678492 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tr4s7" Apr 23 17:41:23.678821 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.678560 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ltb6d" Apr 23 17:41:23.678821 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.678569 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-kbqfn\"" Apr 23 17:41:23.678821 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.678666 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 23 17:41:23.678821 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.678569 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 23 17:41:23.679373 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.679355 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 23 17:41:23.679854 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.679837 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 23 17:41:23.680159 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.680145 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 23 17:41:23.680237 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.680166 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-s95x9\"" Apr 23 17:41:23.680959 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.680944 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 23 17:41:23.681188 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.681173 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-9c58k" Apr 23 17:41:23.681281 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.681269 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xcnjl" Apr 23 17:41:23.682080 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.682062 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 23 17:41:23.682182 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.682155 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 23 17:41:23.682715 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.682276 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 23 17:41:23.682715 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.682346 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-t7mfq\"" Apr 23 17:41:23.682715 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.682514 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 23 17:41:23.682917 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.682892 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-t6cwm\"" Apr 23 17:41:23.684006 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.683667 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-6lc2b" Apr 23 17:41:23.684953 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.684423 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 23 17:41:23.684953 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.684452 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 23 17:41:23.684953 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.684620 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 23 17:41:23.684953 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.684658 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-pm77z\"" Apr 23 17:41:23.684953 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.684686 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 23 17:41:23.684953 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.684835 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 23 17:41:23.684953 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.684841 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 23 17:41:23.685340 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.684964 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-cf5qv\"" Apr 23 17:41:23.685340 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.685295 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dtrg9" Apr 23 17:41:23.685434 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:23.685371 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dtrg9" podUID="34a08e1b-c531-4efa-80dc-60379820b460" Apr 23 17:41:23.686068 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.686048 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 23 17:41:23.686399 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.686234 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 23 17:41:23.686731 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.686568 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 23 17:41:23.686731 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.686646 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-ktjg4\"" Apr 23 17:41:23.690674 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.689342 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.690674 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.690099 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9p89h" Apr 23 17:41:23.690674 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:23.690149 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9p89h" podUID="93857015-7301-4fa2-b52a-008e134f997b" Apr 23 17:41:23.692297 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.692278 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 23 17:41:23.693901 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.692972 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 23 17:41:23.693901 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.693068 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 23 17:41:23.693901 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.693151 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-wtfc9\"" Apr 23 17:41:23.693901 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.693224 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 23 17:41:23.693901 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.693324 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 23 17:41:23.693901 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.693379 2581 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 23 17:41:23.693901 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.693438 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/cc5f4800-7f3a-482a-8059-189a6072df5a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xcnjl\" (UID: \"cc5f4800-7f3a-482a-8059-189a6072df5a\") " pod="openshift-multus/multus-additional-cni-plugins-xcnjl" Apr 23 17:41:23.693901 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.693469 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99p4t\" (UniqueName: \"kubernetes.io/projected/34a08e1b-c531-4efa-80dc-60379820b460-kube-api-access-99p4t\") pod \"network-metrics-daemon-dtrg9\" (UID: \"34a08e1b-c531-4efa-80dc-60379820b460\") " pod="openshift-multus/network-metrics-daemon-dtrg9" Apr 23 17:41:23.693901 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.693500 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/fa6065cc-b8a5-43b6-b34c-abd80570d3dc-konnectivity-ca\") pod \"konnectivity-agent-w6l25\" (UID: \"fa6065cc-b8a5-43b6-b34c-abd80570d3dc\") " pod="kube-system/konnectivity-agent-w6l25" Apr 23 17:41:23.693901 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.693522 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/dc9e19bd-f4ca-4fa7-8442-d784513fd798-registration-dir\") pod \"aws-ebs-csi-driver-node-tr4s7\" (UID: \"dc9e19bd-f4ca-4fa7-8442-d784513fd798\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tr4s7" Apr 23 17:41:23.693901 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.693547 2581 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whstv\" (UniqueName: \"kubernetes.io/projected/600448d4-6de5-4739-b9f6-281d9613622c-kube-api-access-whstv\") pod \"node-ca-6lc2b\" (UID: \"600448d4-6de5-4739-b9f6-281d9613622c\") " pod="openshift-image-registry/node-ca-6lc2b" Apr 23 17:41:23.693901 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.693577 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/28f30ab5-30f5-42ed-881e-4af2ffff85dc-os-release\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k" Apr 23 17:41:23.693901 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.693608 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/28f30ab5-30f5-42ed-881e-4af2ffff85dc-host-run-k8s-cni-cncf-io\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k" Apr 23 17:41:23.693901 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.693630 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4f98091f-8e93-41c5-8955-8d9bbf57c3bf-etc-tuned\") pod \"tuned-ltb6d\" (UID: \"4f98091f-8e93-41c5-8955-8d9bbf57c3bf\") " pod="openshift-cluster-node-tuning-operator/tuned-ltb6d" Apr 23 17:41:23.693901 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.693644 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/28f30ab5-30f5-42ed-881e-4af2ffff85dc-hostroot\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k" Apr 23 17:41:23.693901 ip-10-0-143-218 
kubenswrapper[2581]: I0423 17:41:23.693718 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/28f30ab5-30f5-42ed-881e-4af2ffff85dc-multus-conf-dir\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k" Apr 23 17:41:23.693901 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.693734 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/28f30ab5-30f5-42ed-881e-4af2ffff85dc-multus-socket-dir-parent\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k" Apr 23 17:41:23.693901 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.693762 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5xtk\" (UniqueName: \"kubernetes.io/projected/cc5f4800-7f3a-482a-8059-189a6072df5a-kube-api-access-n5xtk\") pod \"multus-additional-cni-plugins-xcnjl\" (UID: \"cc5f4800-7f3a-482a-8059-189a6072df5a\") " pod="openshift-multus/multus-additional-cni-plugins-xcnjl" Apr 23 17:41:23.693901 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.693791 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34a08e1b-c531-4efa-80dc-60379820b460-metrics-certs\") pod \"network-metrics-daemon-dtrg9\" (UID: \"34a08e1b-c531-4efa-80dc-60379820b460\") " pod="openshift-multus/network-metrics-daemon-dtrg9" Apr 23 17:41:23.693901 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.693810 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4f98091f-8e93-41c5-8955-8d9bbf57c3bf-lib-modules\") pod \"tuned-ltb6d\" (UID: 
\"4f98091f-8e93-41c5-8955-8d9bbf57c3bf\") " pod="openshift-cluster-node-tuning-operator/tuned-ltb6d" Apr 23 17:41:23.694711 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.693833 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cc5f4800-7f3a-482a-8059-189a6072df5a-system-cni-dir\") pod \"multus-additional-cni-plugins-xcnjl\" (UID: \"cc5f4800-7f3a-482a-8059-189a6072df5a\") " pod="openshift-multus/multus-additional-cni-plugins-xcnjl" Apr 23 17:41:23.694711 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.693861 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/600448d4-6de5-4739-b9f6-281d9613622c-host\") pod \"node-ca-6lc2b\" (UID: \"600448d4-6de5-4739-b9f6-281d9613622c\") " pod="openshift-image-registry/node-ca-6lc2b" Apr 23 17:41:23.694711 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.693886 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8e736d0c-df1a-43ad-8265-6b52f1993b03-tmp-dir\") pod \"node-resolver-mtx92\" (UID: \"8e736d0c-df1a-43ad-8265-6b52f1993b03\") " pod="openshift-dns/node-resolver-mtx92" Apr 23 17:41:23.694711 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.693908 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/dc9e19bd-f4ca-4fa7-8442-d784513fd798-device-dir\") pod \"aws-ebs-csi-driver-node-tr4s7\" (UID: \"dc9e19bd-f4ca-4fa7-8442-d784513fd798\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tr4s7" Apr 23 17:41:23.694711 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.693930 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/4f98091f-8e93-41c5-8955-8d9bbf57c3bf-var-lib-kubelet\") pod \"tuned-ltb6d\" (UID: \"4f98091f-8e93-41c5-8955-8d9bbf57c3bf\") " pod="openshift-cluster-node-tuning-operator/tuned-ltb6d" Apr 23 17:41:23.694711 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.693954 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/28f30ab5-30f5-42ed-881e-4af2ffff85dc-host-run-multus-certs\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k" Apr 23 17:41:23.694711 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.693976 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fthkc\" (UniqueName: \"kubernetes.io/projected/28f30ab5-30f5-42ed-881e-4af2ffff85dc-kube-api-access-fthkc\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k" Apr 23 17:41:23.694711 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.694016 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4f98091f-8e93-41c5-8955-8d9bbf57c3bf-tmp\") pod \"tuned-ltb6d\" (UID: \"4f98091f-8e93-41c5-8955-8d9bbf57c3bf\") " pod="openshift-cluster-node-tuning-operator/tuned-ltb6d" Apr 23 17:41:23.694711 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.694051 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/28f30ab5-30f5-42ed-881e-4af2ffff85dc-etc-kubernetes\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k" Apr 23 17:41:23.694711 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.694118 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cc5f4800-7f3a-482a-8059-189a6072df5a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xcnjl\" (UID: \"cc5f4800-7f3a-482a-8059-189a6072df5a\") " pod="openshift-multus/multus-additional-cni-plugins-xcnjl" Apr 23 17:41:23.694711 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.694165 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27kfv\" (UniqueName: \"kubernetes.io/projected/8e736d0c-df1a-43ad-8265-6b52f1993b03-kube-api-access-27kfv\") pod \"node-resolver-mtx92\" (UID: \"8e736d0c-df1a-43ad-8265-6b52f1993b03\") " pod="openshift-dns/node-resolver-mtx92" Apr 23 17:41:23.694711 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.694188 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/dc9e19bd-f4ca-4fa7-8442-d784513fd798-etc-selinux\") pod \"aws-ebs-csi-driver-node-tr4s7\" (UID: \"dc9e19bd-f4ca-4fa7-8442-d784513fd798\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tr4s7" Apr 23 17:41:23.694711 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.694221 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f98091f-8e93-41c5-8955-8d9bbf57c3bf-host\") pod \"tuned-ltb6d\" (UID: \"4f98091f-8e93-41c5-8955-8d9bbf57c3bf\") " pod="openshift-cluster-node-tuning-operator/tuned-ltb6d" Apr 23 17:41:23.694711 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.694244 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/28f30ab5-30f5-42ed-881e-4af2ffff85dc-host-var-lib-kubelet\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " 
pod="openshift-multus/multus-9c58k"
Apr 23 17:41:23.694711 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.694327 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0c4d96e3-f6a9-46d8-b34c-7b464f1e6ebf-iptables-alerter-script\") pod \"iptables-alerter-wnkrl\" (UID: \"0c4d96e3-f6a9-46d8-b34c-7b464f1e6ebf\") " pod="openshift-network-operator/iptables-alerter-wnkrl"
Apr 23 17:41:23.694711 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.694387 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/dc9e19bd-f4ca-4fa7-8442-d784513fd798-socket-dir\") pod \"aws-ebs-csi-driver-node-tr4s7\" (UID: \"dc9e19bd-f4ca-4fa7-8442-d784513fd798\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tr4s7"
Apr 23 17:41:23.695351 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.694429 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4f98091f-8e93-41c5-8955-8d9bbf57c3bf-etc-systemd\") pod \"tuned-ltb6d\" (UID: \"4f98091f-8e93-41c5-8955-8d9bbf57c3bf\") " pod="openshift-cluster-node-tuning-operator/tuned-ltb6d"
Apr 23 17:41:23.695351 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.694460 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/28f30ab5-30f5-42ed-881e-4af2ffff85dc-host-var-lib-cni-bin\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k"
Apr 23 17:41:23.695351 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.694501 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cc5f4800-7f3a-482a-8059-189a6072df5a-cni-binary-copy\") pod \"multus-additional-cni-plugins-xcnjl\" (UID: \"cc5f4800-7f3a-482a-8059-189a6072df5a\") " pod="openshift-multus/multus-additional-cni-plugins-xcnjl"
Apr 23 17:41:23.695351 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.694530 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/fa6065cc-b8a5-43b6-b34c-abd80570d3dc-agent-certs\") pod \"konnectivity-agent-w6l25\" (UID: \"fa6065cc-b8a5-43b6-b34c-abd80570d3dc\") " pod="kube-system/konnectivity-agent-w6l25"
Apr 23 17:41:23.695351 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.694554 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4f98091f-8e93-41c5-8955-8d9bbf57c3bf-etc-sysconfig\") pod \"tuned-ltb6d\" (UID: \"4f98091f-8e93-41c5-8955-8d9bbf57c3bf\") " pod="openshift-cluster-node-tuning-operator/tuned-ltb6d"
Apr 23 17:41:23.696036 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.696009 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/600448d4-6de5-4739-b9f6-281d9613622c-serviceca\") pod \"node-ca-6lc2b\" (UID: \"600448d4-6de5-4739-b9f6-281d9613622c\") " pod="openshift-image-registry/node-ca-6lc2b"
Apr 23 17:41:23.696120 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.696067 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v27cq\" (UniqueName: \"kubernetes.io/projected/0c4d96e3-f6a9-46d8-b34c-7b464f1e6ebf-kube-api-access-v27cq\") pod \"iptables-alerter-wnkrl\" (UID: \"0c4d96e3-f6a9-46d8-b34c-7b464f1e6ebf\") " pod="openshift-network-operator/iptables-alerter-wnkrl"
Apr 23 17:41:23.696120 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.696108 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8e736d0c-df1a-43ad-8265-6b52f1993b03-hosts-file\") pod \"node-resolver-mtx92\" (UID: \"8e736d0c-df1a-43ad-8265-6b52f1993b03\") " pod="openshift-dns/node-resolver-mtx92"
Apr 23 17:41:23.696213 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.696140 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4f98091f-8e93-41c5-8955-8d9bbf57c3bf-etc-sysctl-conf\") pod \"tuned-ltb6d\" (UID: \"4f98091f-8e93-41c5-8955-8d9bbf57c3bf\") " pod="openshift-cluster-node-tuning-operator/tuned-ltb6d"
Apr 23 17:41:23.696213 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.696177 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4f98091f-8e93-41c5-8955-8d9bbf57c3bf-run\") pod \"tuned-ltb6d\" (UID: \"4f98091f-8e93-41c5-8955-8d9bbf57c3bf\") " pod="openshift-cluster-node-tuning-operator/tuned-ltb6d"
Apr 23 17:41:23.696213 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.696207 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/28f30ab5-30f5-42ed-881e-4af2ffff85dc-multus-cni-dir\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k"
Apr 23 17:41:23.696387 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.696230 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/28f30ab5-30f5-42ed-881e-4af2ffff85dc-host-run-netns\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k"
Apr 23 17:41:23.696387 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.696258 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4f98091f-8e93-41c5-8955-8d9bbf57c3bf-etc-kubernetes\") pod \"tuned-ltb6d\" (UID: \"4f98091f-8e93-41c5-8955-8d9bbf57c3bf\") " pod="openshift-cluster-node-tuning-operator/tuned-ltb6d"
Apr 23 17:41:23.696387 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.696288 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/28f30ab5-30f5-42ed-881e-4af2ffff85dc-multus-daemon-config\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k"
Apr 23 17:41:23.696666 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.696317 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0c4d96e3-f6a9-46d8-b34c-7b464f1e6ebf-host-slash\") pod \"iptables-alerter-wnkrl\" (UID: \"0c4d96e3-f6a9-46d8-b34c-7b464f1e6ebf\") " pod="openshift-network-operator/iptables-alerter-wnkrl"
Apr 23 17:41:23.696724 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.696693 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kkwv\" (UniqueName: \"kubernetes.io/projected/dc9e19bd-f4ca-4fa7-8442-d784513fd798-kube-api-access-9kkwv\") pod \"aws-ebs-csi-driver-node-tr4s7\" (UID: \"dc9e19bd-f4ca-4fa7-8442-d784513fd798\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tr4s7"
Apr 23 17:41:23.696774 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.696729 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/28f30ab5-30f5-42ed-881e-4af2ffff85dc-host-var-lib-cni-multus\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k"
Apr 23 17:41:23.696774 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.696757 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cc5f4800-7f3a-482a-8059-189a6072df5a-os-release\") pod \"multus-additional-cni-plugins-xcnjl\" (UID: \"cc5f4800-7f3a-482a-8059-189a6072df5a\") " pod="openshift-multus/multus-additional-cni-plugins-xcnjl"
Apr 23 17:41:23.696868 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.696783 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dc9e19bd-f4ca-4fa7-8442-d784513fd798-kubelet-dir\") pod \"aws-ebs-csi-driver-node-tr4s7\" (UID: \"dc9e19bd-f4ca-4fa7-8442-d784513fd798\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tr4s7"
Apr 23 17:41:23.696868 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.696811 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/dc9e19bd-f4ca-4fa7-8442-d784513fd798-sys-fs\") pod \"aws-ebs-csi-driver-node-tr4s7\" (UID: \"dc9e19bd-f4ca-4fa7-8442-d784513fd798\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tr4s7"
Apr 23 17:41:23.696868 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.696839 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4f98091f-8e93-41c5-8955-8d9bbf57c3bf-etc-sysctl-d\") pod \"tuned-ltb6d\" (UID: \"4f98091f-8e93-41c5-8955-8d9bbf57c3bf\") " pod="openshift-cluster-node-tuning-operator/tuned-ltb6d"
Apr 23 17:41:23.696868 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.696865 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4f98091f-8e93-41c5-8955-8d9bbf57c3bf-sys\") pod \"tuned-ltb6d\" (UID: \"4f98091f-8e93-41c5-8955-8d9bbf57c3bf\") " pod="openshift-cluster-node-tuning-operator/tuned-ltb6d"
Apr 23 17:41:23.697063 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.696893 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cc5f4800-7f3a-482a-8059-189a6072df5a-cnibin\") pod \"multus-additional-cni-plugins-xcnjl\" (UID: \"cc5f4800-7f3a-482a-8059-189a6072df5a\") " pod="openshift-multus/multus-additional-cni-plugins-xcnjl"
Apr 23 17:41:23.697063 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.696918 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cc5f4800-7f3a-482a-8059-189a6072df5a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xcnjl\" (UID: \"cc5f4800-7f3a-482a-8059-189a6072df5a\") " pod="openshift-multus/multus-additional-cni-plugins-xcnjl"
Apr 23 17:41:23.697063 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.696945 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4f98091f-8e93-41c5-8955-8d9bbf57c3bf-etc-modprobe-d\") pod \"tuned-ltb6d\" (UID: \"4f98091f-8e93-41c5-8955-8d9bbf57c3bf\") " pod="openshift-cluster-node-tuning-operator/tuned-ltb6d"
Apr 23 17:41:23.697063 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.696973 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lv2h\" (UniqueName: \"kubernetes.io/projected/4f98091f-8e93-41c5-8955-8d9bbf57c3bf-kube-api-access-2lv2h\") pod \"tuned-ltb6d\" (UID: \"4f98091f-8e93-41c5-8955-8d9bbf57c3bf\") " pod="openshift-cluster-node-tuning-operator/tuned-ltb6d"
Apr 23 17:41:23.697063 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.697019 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/28f30ab5-30f5-42ed-881e-4af2ffff85dc-system-cni-dir\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k"
Apr 23 17:41:23.697063 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.697058 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/28f30ab5-30f5-42ed-881e-4af2ffff85dc-cnibin\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k"
Apr 23 17:41:23.697311 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.697088 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/28f30ab5-30f5-42ed-881e-4af2ffff85dc-cni-binary-copy\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k"
Apr 23 17:41:23.734968 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.734934 2581 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 17:36:22 +0000 UTC" deadline="2028-02-04 19:15:49.654604924 +0000 UTC"
Apr 23 17:41:23.734968 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.734968 2581 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15649h34m25.919640545s"
Apr 23 17:41:23.783933 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.783903 2581 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 23 17:41:23.790319 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.790270 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-218.ec2.internal" event={"ID":"0686775b0acf60aa77b819482efe0489","Type":"ContainerStarted","Data":"5e8255a45d0f131ba72f8fa9326960d6bbd8793354e020b078e049a74f812040"}
Apr 23 17:41:23.791706 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.791676 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-218.ec2.internal" event={"ID":"4ad0c906f995d60a9073fbd94787bfd9","Type":"ContainerStarted","Data":"0f1913c14d17166528d31cef52d9bc1da6ddc0e2d2c70602d16719ca1a21a621"}
Apr 23 17:41:23.798208 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.798181 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-var-lib-openvswitch\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8"
Apr 23 17:41:23.798306 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.798227 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-etc-openvswitch\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8"
Apr 23 17:41:23.798306 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.798250 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-log-socket\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8"
Apr 23 17:41:23.798306 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.798275 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/fa6065cc-b8a5-43b6-b34c-abd80570d3dc-agent-certs\") pod \"konnectivity-agent-w6l25\" (UID: \"fa6065cc-b8a5-43b6-b34c-abd80570d3dc\") " pod="kube-system/konnectivity-agent-w6l25"
Apr 23 17:41:23.798306 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.798296 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4f98091f-8e93-41c5-8955-8d9bbf57c3bf-etc-sysconfig\") pod \"tuned-ltb6d\" (UID: \"4f98091f-8e93-41c5-8955-8d9bbf57c3bf\") " pod="openshift-cluster-node-tuning-operator/tuned-ltb6d"
Apr 23 17:41:23.798508 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.798321 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/600448d4-6de5-4739-b9f6-281d9613622c-serviceca\") pod \"node-ca-6lc2b\" (UID: \"600448d4-6de5-4739-b9f6-281d9613622c\") " pod="openshift-image-registry/node-ca-6lc2b"
Apr 23 17:41:23.798508 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.798345 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-host-cni-netd\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8"
Apr 23 17:41:23.798508 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.798370 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v27cq\" (UniqueName: \"kubernetes.io/projected/0c4d96e3-f6a9-46d8-b34c-7b464f1e6ebf-kube-api-access-v27cq\") pod \"iptables-alerter-wnkrl\" (UID: \"0c4d96e3-f6a9-46d8-b34c-7b464f1e6ebf\") " pod="openshift-network-operator/iptables-alerter-wnkrl"
Apr 23 17:41:23.798508 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.798391 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8e736d0c-df1a-43ad-8265-6b52f1993b03-hosts-file\") pod \"node-resolver-mtx92\" (UID: \"8e736d0c-df1a-43ad-8265-6b52f1993b03\") " pod="openshift-dns/node-resolver-mtx92"
Apr 23 17:41:23.798508 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.798399 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4f98091f-8e93-41c5-8955-8d9bbf57c3bf-etc-sysconfig\") pod \"tuned-ltb6d\" (UID: \"4f98091f-8e93-41c5-8955-8d9bbf57c3bf\") " pod="openshift-cluster-node-tuning-operator/tuned-ltb6d"
Apr 23 17:41:23.798508 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.798458 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8e736d0c-df1a-43ad-8265-6b52f1993b03-hosts-file\") pod \"node-resolver-mtx92\" (UID: \"8e736d0c-df1a-43ad-8265-6b52f1993b03\") " pod="openshift-dns/node-resolver-mtx92"
Apr 23 17:41:23.798716 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.798517 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4f98091f-8e93-41c5-8955-8d9bbf57c3bf-etc-sysctl-conf\") pod \"tuned-ltb6d\" (UID: \"4f98091f-8e93-41c5-8955-8d9bbf57c3bf\") " pod="openshift-cluster-node-tuning-operator/tuned-ltb6d"
Apr 23 17:41:23.798716 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.798582 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4f98091f-8e93-41c5-8955-8d9bbf57c3bf-run\") pod \"tuned-ltb6d\" (UID: \"4f98091f-8e93-41c5-8955-8d9bbf57c3bf\") " pod="openshift-cluster-node-tuning-operator/tuned-ltb6d"
Apr 23 17:41:23.798716 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.798612 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/28f30ab5-30f5-42ed-881e-4af2ffff85dc-multus-cni-dir\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k"
Apr 23 17:41:23.798716 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.798673 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/28f30ab5-30f5-42ed-881e-4af2ffff85dc-multus-cni-dir\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k"
Apr 23 17:41:23.798716 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.798679 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4f98091f-8e93-41c5-8955-8d9bbf57c3bf-run\") pod \"tuned-ltb6d\" (UID: \"4f98091f-8e93-41c5-8955-8d9bbf57c3bf\") " pod="openshift-cluster-node-tuning-operator/tuned-ltb6d"
Apr 23 17:41:23.798928 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.798727 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/28f30ab5-30f5-42ed-881e-4af2ffff85dc-host-run-netns\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k"
Apr 23 17:41:23.798928 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.798769 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f664q\" (UniqueName: \"kubernetes.io/projected/93857015-7301-4fa2-b52a-008e134f997b-kube-api-access-f664q\") pod \"network-check-target-9p89h\" (UID: \"93857015-7301-4fa2-b52a-008e134f997b\") " pod="openshift-network-diagnostics/network-check-target-9p89h"
Apr 23 17:41:23.798928 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.798775 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/28f30ab5-30f5-42ed-881e-4af2ffff85dc-host-run-netns\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k"
Apr 23 17:41:23.798928 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.798791 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4f98091f-8e93-41c5-8955-8d9bbf57c3bf-etc-sysctl-conf\") pod \"tuned-ltb6d\" (UID: \"4f98091f-8e93-41c5-8955-8d9bbf57c3bf\") " pod="openshift-cluster-node-tuning-operator/tuned-ltb6d"
Apr 23 17:41:23.798928 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.798809 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8"
Apr 23 17:41:23.798928 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.798820 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/600448d4-6de5-4739-b9f6-281d9613622c-serviceca\") pod \"node-ca-6lc2b\" (UID: \"600448d4-6de5-4739-b9f6-281d9613622c\") " pod="openshift-image-registry/node-ca-6lc2b"
Apr 23 17:41:23.798928 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.798838 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4f98091f-8e93-41c5-8955-8d9bbf57c3bf-etc-kubernetes\") pod \"tuned-ltb6d\" (UID: \"4f98091f-8e93-41c5-8955-8d9bbf57c3bf\") " pod="openshift-cluster-node-tuning-operator/tuned-ltb6d"
Apr 23 17:41:23.798928 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.798860 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/28f30ab5-30f5-42ed-881e-4af2ffff85dc-multus-daemon-config\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k"
Apr 23 17:41:23.798928 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.798882 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-run-openvswitch\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8"
Apr 23 17:41:23.798928 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.798923 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7a841a79-6082-4722-b8d8-9b6626187a1f-env-overrides\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8"
Apr 23 17:41:23.799391 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.798947 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7a841a79-6082-4722-b8d8-9b6626187a1f-ovn-node-metrics-cert\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8"
Apr 23 17:41:23.799391 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.799001 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4f98091f-8e93-41c5-8955-8d9bbf57c3bf-etc-kubernetes\") pod \"tuned-ltb6d\" (UID: \"4f98091f-8e93-41c5-8955-8d9bbf57c3bf\") " pod="openshift-cluster-node-tuning-operator/tuned-ltb6d"
Apr 23 17:41:23.799391 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.799013 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgcxq\" (UniqueName: \"kubernetes.io/projected/7a841a79-6082-4722-b8d8-9b6626187a1f-kube-api-access-pgcxq\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8"
Apr 23 17:41:23.799391 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.799039 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0c4d96e3-f6a9-46d8-b34c-7b464f1e6ebf-host-slash\") pod \"iptables-alerter-wnkrl\" (UID: \"0c4d96e3-f6a9-46d8-b34c-7b464f1e6ebf\") " pod="openshift-network-operator/iptables-alerter-wnkrl"
Apr 23 17:41:23.799391 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.799073 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9kkwv\" (UniqueName: \"kubernetes.io/projected/dc9e19bd-f4ca-4fa7-8442-d784513fd798-kube-api-access-9kkwv\") pod \"aws-ebs-csi-driver-node-tr4s7\" (UID: \"dc9e19bd-f4ca-4fa7-8442-d784513fd798\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tr4s7"
Apr 23 17:41:23.799391 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.799095 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0c4d96e3-f6a9-46d8-b34c-7b464f1e6ebf-host-slash\") pod \"iptables-alerter-wnkrl\" (UID: \"0c4d96e3-f6a9-46d8-b34c-7b464f1e6ebf\") " pod="openshift-network-operator/iptables-alerter-wnkrl"
Apr 23 17:41:23.799391 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.799122 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/28f30ab5-30f5-42ed-881e-4af2ffff85dc-host-var-lib-cni-multus\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k"
Apr 23 17:41:23.799391 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.799139 2581 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 23 17:41:23.799391 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.799147 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cc5f4800-7f3a-482a-8059-189a6072df5a-os-release\") pod \"multus-additional-cni-plugins-xcnjl\" (UID: \"cc5f4800-7f3a-482a-8059-189a6072df5a\") " pod="openshift-multus/multus-additional-cni-plugins-xcnjl"
Apr 23 17:41:23.799391 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.799162 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/28f30ab5-30f5-42ed-881e-4af2ffff85dc-host-var-lib-cni-multus\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k"
Apr 23 17:41:23.799391 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.799226 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dc9e19bd-f4ca-4fa7-8442-d784513fd798-kubelet-dir\") pod \"aws-ebs-csi-driver-node-tr4s7\" (UID: \"dc9e19bd-f4ca-4fa7-8442-d784513fd798\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tr4s7"
Apr 23 17:41:23.799391 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.799233 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cc5f4800-7f3a-482a-8059-189a6072df5a-os-release\") pod \"multus-additional-cni-plugins-xcnjl\" (UID: \"cc5f4800-7f3a-482a-8059-189a6072df5a\") " pod="openshift-multus/multus-additional-cni-plugins-xcnjl"
Apr 23 17:41:23.799391 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.799172 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dc9e19bd-f4ca-4fa7-8442-d784513fd798-kubelet-dir\") pod \"aws-ebs-csi-driver-node-tr4s7\" (UID: \"dc9e19bd-f4ca-4fa7-8442-d784513fd798\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tr4s7"
Apr 23 17:41:23.799391 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.799271 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/dc9e19bd-f4ca-4fa7-8442-d784513fd798-sys-fs\") pod \"aws-ebs-csi-driver-node-tr4s7\" (UID: \"dc9e19bd-f4ca-4fa7-8442-d784513fd798\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tr4s7"
Apr 23 17:41:23.799391 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.799296 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4f98091f-8e93-41c5-8955-8d9bbf57c3bf-etc-sysctl-d\") pod \"tuned-ltb6d\" (UID: \"4f98091f-8e93-41c5-8955-8d9bbf57c3bf\") " pod="openshift-cluster-node-tuning-operator/tuned-ltb6d"
Apr 23 17:41:23.799391 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.799320 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/dc9e19bd-f4ca-4fa7-8442-d784513fd798-sys-fs\") pod \"aws-ebs-csi-driver-node-tr4s7\" (UID: \"dc9e19bd-f4ca-4fa7-8442-d784513fd798\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tr4s7"
Apr 23 17:41:23.799391 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.799325 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4f98091f-8e93-41c5-8955-8d9bbf57c3bf-sys\") pod \"tuned-ltb6d\" (UID: \"4f98091f-8e93-41c5-8955-8d9bbf57c3bf\") " pod="openshift-cluster-node-tuning-operator/tuned-ltb6d"
Apr 23 17:41:23.800177 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.799367 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4f98091f-8e93-41c5-8955-8d9bbf57c3bf-sys\") pod \"tuned-ltb6d\" (UID: \"4f98091f-8e93-41c5-8955-8d9bbf57c3bf\") " pod="openshift-cluster-node-tuning-operator/tuned-ltb6d"
Apr 23 17:41:23.800177 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.799369 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cc5f4800-7f3a-482a-8059-189a6072df5a-cnibin\") pod \"multus-additional-cni-plugins-xcnjl\" (UID: \"cc5f4800-7f3a-482a-8059-189a6072df5a\") " pod="openshift-multus/multus-additional-cni-plugins-xcnjl"
Apr 23 17:41:23.800177 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.799382 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/28f30ab5-30f5-42ed-881e-4af2ffff85dc-multus-daemon-config\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k"
Apr 23 17:41:23.800177 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.799395 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cc5f4800-7f3a-482a-8059-189a6072df5a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xcnjl\" (UID: \"cc5f4800-7f3a-482a-8059-189a6072df5a\") " pod="openshift-multus/multus-additional-cni-plugins-xcnjl"
Apr 23 17:41:23.800177 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.799399 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cc5f4800-7f3a-482a-8059-189a6072df5a-cnibin\") pod \"multus-additional-cni-plugins-xcnjl\" (UID: \"cc5f4800-7f3a-482a-8059-189a6072df5a\") " pod="openshift-multus/multus-additional-cni-plugins-xcnjl"
Apr 23 17:41:23.800177 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.799421 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4f98091f-8e93-41c5-8955-8d9bbf57c3bf-etc-sysctl-d\") pod \"tuned-ltb6d\" (UID: \"4f98091f-8e93-41c5-8955-8d9bbf57c3bf\") " pod="openshift-cluster-node-tuning-operator/tuned-ltb6d"
Apr 23 17:41:23.800177 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.799426 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-node-log\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8"
Apr 23 17:41:23.800177 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.799465 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4f98091f-8e93-41c5-8955-8d9bbf57c3bf-etc-modprobe-d\") pod \"tuned-ltb6d\" (UID: \"4f98091f-8e93-41c5-8955-8d9bbf57c3bf\") " pod="openshift-cluster-node-tuning-operator/tuned-ltb6d"
Apr 23 17:41:23.800177 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.799492 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cc5f4800-7f3a-482a-8059-189a6072df5a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xcnjl\" (UID: \"cc5f4800-7f3a-482a-8059-189a6072df5a\") " pod="openshift-multus/multus-additional-cni-plugins-xcnjl"
Apr 23 17:41:23.800177 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.799510 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2lv2h\" (UniqueName: \"kubernetes.io/projected/4f98091f-8e93-41c5-8955-8d9bbf57c3bf-kube-api-access-2lv2h\") pod \"tuned-ltb6d\" (UID: \"4f98091f-8e93-41c5-8955-8d9bbf57c3bf\") " pod="openshift-cluster-node-tuning-operator/tuned-ltb6d"
Apr 23 17:41:23.800177 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.799529 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/28f30ab5-30f5-42ed-881e-4af2ffff85dc-system-cni-dir\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k"
Apr 23 17:41:23.800177 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.799537 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4f98091f-8e93-41c5-8955-8d9bbf57c3bf-etc-modprobe-d\") pod \"tuned-ltb6d\" (UID: \"4f98091f-8e93-41c5-8955-8d9bbf57c3bf\") " pod="openshift-cluster-node-tuning-operator/tuned-ltb6d"
Apr 23 17:41:23.800177 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.799544 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/28f30ab5-30f5-42ed-881e-4af2ffff85dc-cnibin\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k"
Apr 23 17:41:23.800177 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.799562 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/28f30ab5-30f5-42ed-881e-4af2ffff85dc-cni-binary-copy\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k"
Apr 23 17:41:23.800177 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.799587 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/cc5f4800-7f3a-482a-8059-189a6072df5a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xcnjl\" (UID: \"cc5f4800-7f3a-482a-8059-189a6072df5a\") " pod="openshift-multus/multus-additional-cni-plugins-xcnjl"
Apr 23 17:41:23.800177 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.799591 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/28f30ab5-30f5-42ed-881e-4af2ffff85dc-system-cni-dir\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k"
Apr 23 17:41:23.800177 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.799604 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/28f30ab5-30f5-42ed-881e-4af2ffff85dc-cnibin\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k"
Apr 23 17:41:23.800950 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.799716 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-99p4t\" (UniqueName: \"kubernetes.io/projected/34a08e1b-c531-4efa-80dc-60379820b460-kube-api-access-99p4t\") pod \"network-metrics-daemon-dtrg9\" (UID: \"34a08e1b-c531-4efa-80dc-60379820b460\") " pod="openshift-multus/network-metrics-daemon-dtrg9"
Apr 23 17:41:23.800950 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.799878 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-host-cni-bin\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8"
Apr 23 17:41:23.800950 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.799896 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/fa6065cc-b8a5-43b6-b34c-abd80570d3dc-konnectivity-ca\") pod \"konnectivity-agent-w6l25\" (UID: \"fa6065cc-b8a5-43b6-b34c-abd80570d3dc\") " pod="kube-system/konnectivity-agent-w6l25"
Apr 23 17:41:23.800950 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.799913 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/dc9e19bd-f4ca-4fa7-8442-d784513fd798-registration-dir\") pod \"aws-ebs-csi-driver-node-tr4s7\" (UID: \"dc9e19bd-f4ca-4fa7-8442-d784513fd798\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tr4s7"
Apr 23 17:41:23.800950 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.799934 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-whstv\" (UniqueName: \"kubernetes.io/projected/600448d4-6de5-4739-b9f6-281d9613622c-kube-api-access-whstv\") pod \"node-ca-6lc2b\" (UID: \"600448d4-6de5-4739-b9f6-281d9613622c\") " pod="openshift-image-registry/node-ca-6lc2b"
Apr 23 17:41:23.800950 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.799950 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-host-kubelet\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8"
Apr 23 17:41:23.800950 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.799965 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-host-slash\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8"
Apr 23 17:41:23.800950
ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800001 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/28f30ab5-30f5-42ed-881e-4af2ffff85dc-os-release\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k" Apr 23 17:41:23.800950 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800037 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/28f30ab5-30f5-42ed-881e-4af2ffff85dc-host-run-k8s-cni-cncf-io\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k" Apr 23 17:41:23.800950 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800052 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-host-run-ovn-kubernetes\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.800950 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800067 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4f98091f-8e93-41c5-8955-8d9bbf57c3bf-etc-tuned\") pod \"tuned-ltb6d\" (UID: \"4f98091f-8e93-41c5-8955-8d9bbf57c3bf\") " pod="openshift-cluster-node-tuning-operator/tuned-ltb6d" Apr 23 17:41:23.800950 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800069 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/28f30ab5-30f5-42ed-881e-4af2ffff85dc-cni-binary-copy\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k" Apr 23 17:41:23.800950 
ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800089 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/28f30ab5-30f5-42ed-881e-4af2ffff85dc-hostroot\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k" Apr 23 17:41:23.800950 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800105 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/28f30ab5-30f5-42ed-881e-4af2ffff85dc-multus-conf-dir\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k" Apr 23 17:41:23.800950 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800121 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7a841a79-6082-4722-b8d8-9b6626187a1f-ovnkube-config\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.800950 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800138 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/28f30ab5-30f5-42ed-881e-4af2ffff85dc-multus-socket-dir-parent\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k" Apr 23 17:41:23.800950 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800146 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/cc5f4800-7f3a-482a-8059-189a6072df5a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xcnjl\" (UID: \"cc5f4800-7f3a-482a-8059-189a6072df5a\") " 
pod="openshift-multus/multus-additional-cni-plugins-xcnjl" Apr 23 17:41:23.801907 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800158 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n5xtk\" (UniqueName: \"kubernetes.io/projected/cc5f4800-7f3a-482a-8059-189a6072df5a-kube-api-access-n5xtk\") pod \"multus-additional-cni-plugins-xcnjl\" (UID: \"cc5f4800-7f3a-482a-8059-189a6072df5a\") " pod="openshift-multus/multus-additional-cni-plugins-xcnjl" Apr 23 17:41:23.801907 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800160 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/28f30ab5-30f5-42ed-881e-4af2ffff85dc-os-release\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k" Apr 23 17:41:23.801907 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800191 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34a08e1b-c531-4efa-80dc-60379820b460-metrics-certs\") pod \"network-metrics-daemon-dtrg9\" (UID: \"34a08e1b-c531-4efa-80dc-60379820b460\") " pod="openshift-multus/network-metrics-daemon-dtrg9" Apr 23 17:41:23.801907 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800214 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/dc9e19bd-f4ca-4fa7-8442-d784513fd798-registration-dir\") pod \"aws-ebs-csi-driver-node-tr4s7\" (UID: \"dc9e19bd-f4ca-4fa7-8442-d784513fd798\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tr4s7" Apr 23 17:41:23.801907 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800221 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-run-ovn\") pod 
\"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.801907 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800233 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/28f30ab5-30f5-42ed-881e-4af2ffff85dc-hostroot\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k" Apr 23 17:41:23.801907 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800249 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4f98091f-8e93-41c5-8955-8d9bbf57c3bf-lib-modules\") pod \"tuned-ltb6d\" (UID: \"4f98091f-8e93-41c5-8955-8d9bbf57c3bf\") " pod="openshift-cluster-node-tuning-operator/tuned-ltb6d" Apr 23 17:41:23.801907 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800265 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/28f30ab5-30f5-42ed-881e-4af2ffff85dc-multus-conf-dir\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k" Apr 23 17:41:23.801907 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800275 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cc5f4800-7f3a-482a-8059-189a6072df5a-system-cni-dir\") pod \"multus-additional-cni-plugins-xcnjl\" (UID: \"cc5f4800-7f3a-482a-8059-189a6072df5a\") " pod="openshift-multus/multus-additional-cni-plugins-xcnjl" Apr 23 17:41:23.801907 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800299 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/600448d4-6de5-4739-b9f6-281d9613622c-host\") pod \"node-ca-6lc2b\" (UID: 
\"600448d4-6de5-4739-b9f6-281d9613622c\") " pod="openshift-image-registry/node-ca-6lc2b" Apr 23 17:41:23.801907 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800322 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8e736d0c-df1a-43ad-8265-6b52f1993b03-tmp-dir\") pod \"node-resolver-mtx92\" (UID: \"8e736d0c-df1a-43ad-8265-6b52f1993b03\") " pod="openshift-dns/node-resolver-mtx92" Apr 23 17:41:23.801907 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800341 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/28f30ab5-30f5-42ed-881e-4af2ffff85dc-multus-socket-dir-parent\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k" Apr 23 17:41:23.801907 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800347 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/dc9e19bd-f4ca-4fa7-8442-d784513fd798-device-dir\") pod \"aws-ebs-csi-driver-node-tr4s7\" (UID: \"dc9e19bd-f4ca-4fa7-8442-d784513fd798\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tr4s7" Apr 23 17:41:23.801907 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800370 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4f98091f-8e93-41c5-8955-8d9bbf57c3bf-var-lib-kubelet\") pod \"tuned-ltb6d\" (UID: \"4f98091f-8e93-41c5-8955-8d9bbf57c3bf\") " pod="openshift-cluster-node-tuning-operator/tuned-ltb6d" Apr 23 17:41:23.801907 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800394 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/28f30ab5-30f5-42ed-881e-4af2ffff85dc-host-run-multus-certs\") pod 
\"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k" Apr 23 17:41:23.801907 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800421 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fthkc\" (UniqueName: \"kubernetes.io/projected/28f30ab5-30f5-42ed-881e-4af2ffff85dc-kube-api-access-fthkc\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k" Apr 23 17:41:23.801907 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800447 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4f98091f-8e93-41c5-8955-8d9bbf57c3bf-tmp\") pod \"tuned-ltb6d\" (UID: \"4f98091f-8e93-41c5-8955-8d9bbf57c3bf\") " pod="openshift-cluster-node-tuning-operator/tuned-ltb6d" Apr 23 17:41:23.801907 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800448 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/fa6065cc-b8a5-43b6-b34c-abd80570d3dc-konnectivity-ca\") pod \"konnectivity-agent-w6l25\" (UID: \"fa6065cc-b8a5-43b6-b34c-abd80570d3dc\") " pod="kube-system/konnectivity-agent-w6l25" Apr 23 17:41:23.802764 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800472 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/28f30ab5-30f5-42ed-881e-4af2ffff85dc-etc-kubernetes\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k" Apr 23 17:41:23.802764 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800509 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cc5f4800-7f3a-482a-8059-189a6072df5a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xcnjl\" (UID: 
\"cc5f4800-7f3a-482a-8059-189a6072df5a\") " pod="openshift-multus/multus-additional-cni-plugins-xcnjl" Apr 23 17:41:23.802764 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800250 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/28f30ab5-30f5-42ed-881e-4af2ffff85dc-host-run-k8s-cni-cncf-io\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k" Apr 23 17:41:23.802764 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800563 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-27kfv\" (UniqueName: \"kubernetes.io/projected/8e736d0c-df1a-43ad-8265-6b52f1993b03-kube-api-access-27kfv\") pod \"node-resolver-mtx92\" (UID: \"8e736d0c-df1a-43ad-8265-6b52f1993b03\") " pod="openshift-dns/node-resolver-mtx92" Apr 23 17:41:23.802764 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800566 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cc5f4800-7f3a-482a-8059-189a6072df5a-system-cni-dir\") pod \"multus-additional-cni-plugins-xcnjl\" (UID: \"cc5f4800-7f3a-482a-8059-189a6072df5a\") " pod="openshift-multus/multus-additional-cni-plugins-xcnjl" Apr 23 17:41:23.802764 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800582 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/dc9e19bd-f4ca-4fa7-8442-d784513fd798-etc-selinux\") pod \"aws-ebs-csi-driver-node-tr4s7\" (UID: \"dc9e19bd-f4ca-4fa7-8442-d784513fd798\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tr4s7" Apr 23 17:41:23.802764 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800597 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/4f98091f-8e93-41c5-8955-8d9bbf57c3bf-host\") pod \"tuned-ltb6d\" (UID: \"4f98091f-8e93-41c5-8955-8d9bbf57c3bf\") " pod="openshift-cluster-node-tuning-operator/tuned-ltb6d" Apr 23 17:41:23.802764 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800600 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/600448d4-6de5-4739-b9f6-281d9613622c-host\") pod \"node-ca-6lc2b\" (UID: \"600448d4-6de5-4739-b9f6-281d9613622c\") " pod="openshift-image-registry/node-ca-6lc2b" Apr 23 17:41:23.802764 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800612 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/28f30ab5-30f5-42ed-881e-4af2ffff85dc-host-var-lib-kubelet\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k" Apr 23 17:41:23.802764 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800632 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-run-systemd\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.802764 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800649 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7a841a79-6082-4722-b8d8-9b6626187a1f-ovnkube-script-lib\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.802764 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800654 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/dc9e19bd-f4ca-4fa7-8442-d784513fd798-device-dir\") pod \"aws-ebs-csi-driver-node-tr4s7\" (UID: \"dc9e19bd-f4ca-4fa7-8442-d784513fd798\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tr4s7" Apr 23 17:41:23.802764 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800666 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0c4d96e3-f6a9-46d8-b34c-7b464f1e6ebf-iptables-alerter-script\") pod \"iptables-alerter-wnkrl\" (UID: \"0c4d96e3-f6a9-46d8-b34c-7b464f1e6ebf\") " pod="openshift-network-operator/iptables-alerter-wnkrl" Apr 23 17:41:23.802764 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800682 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/dc9e19bd-f4ca-4fa7-8442-d784513fd798-socket-dir\") pod \"aws-ebs-csi-driver-node-tr4s7\" (UID: \"dc9e19bd-f4ca-4fa7-8442-d784513fd798\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tr4s7" Apr 23 17:41:23.802764 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800697 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4f98091f-8e93-41c5-8955-8d9bbf57c3bf-var-lib-kubelet\") pod \"tuned-ltb6d\" (UID: \"4f98091f-8e93-41c5-8955-8d9bbf57c3bf\") " pod="openshift-cluster-node-tuning-operator/tuned-ltb6d" Apr 23 17:41:23.802764 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800700 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4f98091f-8e93-41c5-8955-8d9bbf57c3bf-etc-systemd\") pod \"tuned-ltb6d\" (UID: \"4f98091f-8e93-41c5-8955-8d9bbf57c3bf\") " pod="openshift-cluster-node-tuning-operator/tuned-ltb6d" Apr 23 17:41:23.802764 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800396 2581 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4f98091f-8e93-41c5-8955-8d9bbf57c3bf-lib-modules\") pod \"tuned-ltb6d\" (UID: \"4f98091f-8e93-41c5-8955-8d9bbf57c3bf\") " pod="openshift-cluster-node-tuning-operator/tuned-ltb6d" Apr 23 17:41:23.803484 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800756 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/28f30ab5-30f5-42ed-881e-4af2ffff85dc-host-var-lib-kubelet\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k" Apr 23 17:41:23.803484 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800836 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4f98091f-8e93-41c5-8955-8d9bbf57c3bf-etc-systemd\") pod \"tuned-ltb6d\" (UID: \"4f98091f-8e93-41c5-8955-8d9bbf57c3bf\") " pod="openshift-cluster-node-tuning-operator/tuned-ltb6d" Apr 23 17:41:23.803484 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:23.800840 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:41:23.803484 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800877 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/28f30ab5-30f5-42ed-881e-4af2ffff85dc-host-run-multus-certs\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k" Apr 23 17:41:23.803484 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:23.800905 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34a08e1b-c531-4efa-80dc-60379820b460-metrics-certs podName:34a08e1b-c531-4efa-80dc-60379820b460 nodeName:}" failed. 
No retries permitted until 2026-04-23 17:41:24.300885217 +0000 UTC m=+3.050327014 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/34a08e1b-c531-4efa-80dc-60379820b460-metrics-certs") pod "network-metrics-daemon-dtrg9" (UID: "34a08e1b-c531-4efa-80dc-60379820b460") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:41:23.803484 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800942 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cc5f4800-7f3a-482a-8059-189a6072df5a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xcnjl\" (UID: \"cc5f4800-7f3a-482a-8059-189a6072df5a\") " pod="openshift-multus/multus-additional-cni-plugins-xcnjl" Apr 23 17:41:23.803484 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.800961 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8e736d0c-df1a-43ad-8265-6b52f1993b03-tmp-dir\") pod \"node-resolver-mtx92\" (UID: \"8e736d0c-df1a-43ad-8265-6b52f1993b03\") " pod="openshift-dns/node-resolver-mtx92" Apr 23 17:41:23.803484 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.801009 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f98091f-8e93-41c5-8955-8d9bbf57c3bf-host\") pod \"tuned-ltb6d\" (UID: \"4f98091f-8e93-41c5-8955-8d9bbf57c3bf\") " pod="openshift-cluster-node-tuning-operator/tuned-ltb6d" Apr 23 17:41:23.803484 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.801023 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/28f30ab5-30f5-42ed-881e-4af2ffff85dc-host-var-lib-cni-bin\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k" Apr 23 
17:41:23.803484 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.801048 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/28f30ab5-30f5-42ed-881e-4af2ffff85dc-etc-kubernetes\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k" Apr 23 17:41:23.803484 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.801123 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cc5f4800-7f3a-482a-8059-189a6072df5a-cni-binary-copy\") pod \"multus-additional-cni-plugins-xcnjl\" (UID: \"cc5f4800-7f3a-482a-8059-189a6072df5a\") " pod="openshift-multus/multus-additional-cni-plugins-xcnjl" Apr 23 17:41:23.803484 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.801165 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-systemd-units\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.803484 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.801175 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/dc9e19bd-f4ca-4fa7-8442-d784513fd798-etc-selinux\") pod \"aws-ebs-csi-driver-node-tr4s7\" (UID: \"dc9e19bd-f4ca-4fa7-8442-d784513fd798\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tr4s7" Apr 23 17:41:23.803484 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.801243 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/28f30ab5-30f5-42ed-881e-4af2ffff85dc-host-var-lib-cni-bin\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " 
pod="openshift-multus/multus-9c58k" Apr 23 17:41:23.803484 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.801307 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-host-run-netns\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.803484 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.801317 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/dc9e19bd-f4ca-4fa7-8442-d784513fd798-socket-dir\") pod \"aws-ebs-csi-driver-node-tr4s7\" (UID: \"dc9e19bd-f4ca-4fa7-8442-d784513fd798\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tr4s7" Apr 23 17:41:23.803484 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.801545 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cc5f4800-7f3a-482a-8059-189a6072df5a-cni-binary-copy\") pod \"multus-additional-cni-plugins-xcnjl\" (UID: \"cc5f4800-7f3a-482a-8059-189a6072df5a\") " pod="openshift-multus/multus-additional-cni-plugins-xcnjl" Apr 23 17:41:23.803943 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.801564 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0c4d96e3-f6a9-46d8-b34c-7b464f1e6ebf-iptables-alerter-script\") pod \"iptables-alerter-wnkrl\" (UID: \"0c4d96e3-f6a9-46d8-b34c-7b464f1e6ebf\") " pod="openshift-network-operator/iptables-alerter-wnkrl" Apr 23 17:41:23.803943 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.803077 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4f98091f-8e93-41c5-8955-8d9bbf57c3bf-etc-tuned\") pod \"tuned-ltb6d\" 
(UID: \"4f98091f-8e93-41c5-8955-8d9bbf57c3bf\") " pod="openshift-cluster-node-tuning-operator/tuned-ltb6d" Apr 23 17:41:23.803943 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.803255 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/fa6065cc-b8a5-43b6-b34c-abd80570d3dc-agent-certs\") pod \"konnectivity-agent-w6l25\" (UID: \"fa6065cc-b8a5-43b6-b34c-abd80570d3dc\") " pod="kube-system/konnectivity-agent-w6l25" Apr 23 17:41:23.803943 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.803387 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4f98091f-8e93-41c5-8955-8d9bbf57c3bf-tmp\") pod \"tuned-ltb6d\" (UID: \"4f98091f-8e93-41c5-8955-8d9bbf57c3bf\") " pod="openshift-cluster-node-tuning-operator/tuned-ltb6d" Apr 23 17:41:23.808803 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.808671 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lv2h\" (UniqueName: \"kubernetes.io/projected/4f98091f-8e93-41c5-8955-8d9bbf57c3bf-kube-api-access-2lv2h\") pod \"tuned-ltb6d\" (UID: \"4f98091f-8e93-41c5-8955-8d9bbf57c3bf\") " pod="openshift-cluster-node-tuning-operator/tuned-ltb6d" Apr 23 17:41:23.810479 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.810404 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-99p4t\" (UniqueName: \"kubernetes.io/projected/34a08e1b-c531-4efa-80dc-60379820b460-kube-api-access-99p4t\") pod \"network-metrics-daemon-dtrg9\" (UID: \"34a08e1b-c531-4efa-80dc-60379820b460\") " pod="openshift-multus/network-metrics-daemon-dtrg9" Apr 23 17:41:23.812199 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.811007 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5xtk\" (UniqueName: \"kubernetes.io/projected/cc5f4800-7f3a-482a-8059-189a6072df5a-kube-api-access-n5xtk\") pod 
\"multus-additional-cni-plugins-xcnjl\" (UID: \"cc5f4800-7f3a-482a-8059-189a6072df5a\") " pod="openshift-multus/multus-additional-cni-plugins-xcnjl" Apr 23 17:41:23.812199 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.811064 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kkwv\" (UniqueName: \"kubernetes.io/projected/dc9e19bd-f4ca-4fa7-8442-d784513fd798-kube-api-access-9kkwv\") pod \"aws-ebs-csi-driver-node-tr4s7\" (UID: \"dc9e19bd-f4ca-4fa7-8442-d784513fd798\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tr4s7" Apr 23 17:41:23.812415 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.812391 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fthkc\" (UniqueName: \"kubernetes.io/projected/28f30ab5-30f5-42ed-881e-4af2ffff85dc-kube-api-access-fthkc\") pod \"multus-9c58k\" (UID: \"28f30ab5-30f5-42ed-881e-4af2ffff85dc\") " pod="openshift-multus/multus-9c58k" Apr 23 17:41:23.814737 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.812601 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v27cq\" (UniqueName: \"kubernetes.io/projected/0c4d96e3-f6a9-46d8-b34c-7b464f1e6ebf-kube-api-access-v27cq\") pod \"iptables-alerter-wnkrl\" (UID: \"0c4d96e3-f6a9-46d8-b34c-7b464f1e6ebf\") " pod="openshift-network-operator/iptables-alerter-wnkrl" Apr 23 17:41:23.814737 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.812793 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-27kfv\" (UniqueName: \"kubernetes.io/projected/8e736d0c-df1a-43ad-8265-6b52f1993b03-kube-api-access-27kfv\") pod \"node-resolver-mtx92\" (UID: \"8e736d0c-df1a-43ad-8265-6b52f1993b03\") " pod="openshift-dns/node-resolver-mtx92" Apr 23 17:41:23.815257 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.815233 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-whstv\" (UniqueName: 
\"kubernetes.io/projected/600448d4-6de5-4739-b9f6-281d9613622c-kube-api-access-whstv\") pod \"node-ca-6lc2b\" (UID: \"600448d4-6de5-4739-b9f6-281d9613622c\") " pod="openshift-image-registry/node-ca-6lc2b" Apr 23 17:41:23.884576 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.884548 2581 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 17:41:23.902111 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.902081 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-host-cni-bin\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.902256 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.902123 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-host-kubelet\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.902256 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.902139 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-host-slash\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.902256 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.902162 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-host-run-ovn-kubernetes\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.902256 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.902200 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-host-run-ovn-kubernetes\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.902256 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.902200 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-host-cni-bin\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.902256 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.902210 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-host-slash\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.902256 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.902216 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-host-kubelet\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.902256 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.902245 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7a841a79-6082-4722-b8d8-9b6626187a1f-ovnkube-config\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.902558 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.902315 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-run-ovn\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.902558 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.902344 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-run-systemd\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.902558 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.902363 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7a841a79-6082-4722-b8d8-9b6626187a1f-ovnkube-script-lib\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.902558 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.902380 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-systemd-units\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.902558 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.902403 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-host-run-netns\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.902558 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.902414 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-run-ovn\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.902558 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.902427 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-var-lib-openvswitch\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.902558 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.902432 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-run-systemd\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.902558 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.902450 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-etc-openvswitch\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.902558 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.902461 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-systemd-units\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.902558 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.902471 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-host-run-netns\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.902558 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.902482 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-var-lib-openvswitch\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.902558 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.902473 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-log-socket\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.902558 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.902507 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-log-socket\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.902558 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.902516 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-etc-openvswitch\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.902558 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.902521 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-host-cni-netd\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.902558 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.902544 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-host-cni-netd\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.902558 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.902561 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f664q\" (UniqueName: \"kubernetes.io/projected/93857015-7301-4fa2-b52a-008e134f997b-kube-api-access-f664q\") pod \"network-check-target-9p89h\" (UID: \"93857015-7301-4fa2-b52a-008e134f997b\") " pod="openshift-network-diagnostics/network-check-target-9p89h" Apr 23 17:41:23.903089 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.902591 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.903089 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.902619 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-run-openvswitch\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.903089 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.902643 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7a841a79-6082-4722-b8d8-9b6626187a1f-env-overrides\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.903089 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.902647 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.903089 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.902668 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7a841a79-6082-4722-b8d8-9b6626187a1f-ovn-node-metrics-cert\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.903089 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.902684 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-run-openvswitch\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.903089 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.902692 2581 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pgcxq\" (UniqueName: \"kubernetes.io/projected/7a841a79-6082-4722-b8d8-9b6626187a1f-kube-api-access-pgcxq\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.903089 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.902726 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-node-log\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.903089 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.902799 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7a841a79-6082-4722-b8d8-9b6626187a1f-node-log\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.903089 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.902871 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7a841a79-6082-4722-b8d8-9b6626187a1f-ovnkube-config\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.903089 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.902963 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7a841a79-6082-4722-b8d8-9b6626187a1f-ovnkube-script-lib\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.903089 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.903084 2581 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7a841a79-6082-4722-b8d8-9b6626187a1f-env-overrides\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.905568 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.905542 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7a841a79-6082-4722-b8d8-9b6626187a1f-ovn-node-metrics-cert\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.909885 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:23.909833 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:41:23.909885 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:23.909858 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:41:23.909885 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:23.909871 2581 projected.go:194] Error preparing data for projected volume kube-api-access-f664q for pod openshift-network-diagnostics/network-check-target-9p89h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:41:23.910098 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:23.909929 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/93857015-7301-4fa2-b52a-008e134f997b-kube-api-access-f664q podName:93857015-7301-4fa2-b52a-008e134f997b nodeName:}" failed. No retries permitted until 2026-04-23 17:41:24.409911769 +0000 UTC m=+3.159353561 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-f664q" (UniqueName: "kubernetes.io/projected/93857015-7301-4fa2-b52a-008e134f997b-kube-api-access-f664q") pod "network-check-target-9p89h" (UID: "93857015-7301-4fa2-b52a-008e134f997b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:41:23.911925 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.911906 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgcxq\" (UniqueName: \"kubernetes.io/projected/7a841a79-6082-4722-b8d8-9b6626187a1f-kube-api-access-pgcxq\") pod \"ovnkube-node-b27s8\" (UID: \"7a841a79-6082-4722-b8d8-9b6626187a1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:23.990053 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.990004 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-wnkrl" Apr 23 17:41:23.999771 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:23.999751 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-w6l25" Apr 23 17:41:24.007120 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:24.007100 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-mtx92" Apr 23 17:41:24.014642 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:24.014616 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tr4s7" Apr 23 17:41:24.020235 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:24.020208 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ltb6d" Apr 23 17:41:24.026782 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:24.026765 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-9c58k" Apr 23 17:41:24.034320 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:24.034302 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xcnjl" Apr 23 17:41:24.047837 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:24.047821 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-6lc2b" Apr 23 17:41:24.054452 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:24.054429 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:24.305717 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:24.305625 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34a08e1b-c531-4efa-80dc-60379820b460-metrics-certs\") pod \"network-metrics-daemon-dtrg9\" (UID: \"34a08e1b-c531-4efa-80dc-60379820b460\") " pod="openshift-multus/network-metrics-daemon-dtrg9" Apr 23 17:41:24.305867 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:24.305761 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:41:24.305867 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:24.305835 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34a08e1b-c531-4efa-80dc-60379820b460-metrics-certs podName:34a08e1b-c531-4efa-80dc-60379820b460 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:25.305817065 +0000 UTC m=+4.055258868 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/34a08e1b-c531-4efa-80dc-60379820b460-metrics-certs") pod "network-metrics-daemon-dtrg9" (UID: "34a08e1b-c531-4efa-80dc-60379820b460") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:41:24.410108 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:24.410052 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc9e19bd_f4ca_4fa7_8442_d784513fd798.slice/crio-8e699a0c5d422089c6ade1487fbf4f31caa912883bfac83107f49fe0c460dedb WatchSource:0}: Error finding container 8e699a0c5d422089c6ade1487fbf4f31caa912883bfac83107f49fe0c460dedb: Status 404 returned error can't find the container with id 8e699a0c5d422089c6ade1487fbf4f31caa912883bfac83107f49fe0c460dedb Apr 23 17:41:24.411446 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:24.411398 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a841a79_6082_4722_b8d8_9b6626187a1f.slice/crio-f05b8adac5a73c639392cf6760b041d05907f106d20aaef9f9a909c75fb775c2 WatchSource:0}: Error finding container f05b8adac5a73c639392cf6760b041d05907f106d20aaef9f9a909c75fb775c2: Status 404 returned error can't find the container with id f05b8adac5a73c639392cf6760b041d05907f106d20aaef9f9a909c75fb775c2 Apr 23 17:41:24.415179 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:24.415158 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa6065cc_b8a5_43b6_b34c_abd80570d3dc.slice/crio-163e1d9b019ce886f2b656e5391c5da44b822cd4a4b3d9d21d0025c412391841 WatchSource:0}: Error finding container 163e1d9b019ce886f2b656e5391c5da44b822cd4a4b3d9d21d0025c412391841: Status 404 returned error can't find the container with id 163e1d9b019ce886f2b656e5391c5da44b822cd4a4b3d9d21d0025c412391841 Apr 23 17:41:24.416138 
ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:24.416112 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod600448d4_6de5_4739_b9f6_281d9613622c.slice/crio-c7c06f1fb795fb0956042efe68efcf33dda13cbd7af455e4bf1a805de38782e9 WatchSource:0}: Error finding container c7c06f1fb795fb0956042efe68efcf33dda13cbd7af455e4bf1a805de38782e9: Status 404 returned error can't find the container with id c7c06f1fb795fb0956042efe68efcf33dda13cbd7af455e4bf1a805de38782e9 Apr 23 17:41:24.416868 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:24.416800 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e736d0c_df1a_43ad_8265_6b52f1993b03.slice/crio-9ccf60c83e4f14c1c7050a82a327cb2984256d8516fc55017bf91e8f4351458a WatchSource:0}: Error finding container 9ccf60c83e4f14c1c7050a82a327cb2984256d8516fc55017bf91e8f4351458a: Status 404 returned error can't find the container with id 9ccf60c83e4f14c1c7050a82a327cb2984256d8516fc55017bf91e8f4351458a Apr 23 17:41:24.420931 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:41:24.420908 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c4d96e3_f6a9_46d8_b34c_7b464f1e6ebf.slice/crio-9e777952a1360dee7a17945b71197d9c246dc236c58f2bef92a2a36fd5255433 WatchSource:0}: Error finding container 9e777952a1360dee7a17945b71197d9c246dc236c58f2bef92a2a36fd5255433: Status 404 returned error can't find the container with id 9e777952a1360dee7a17945b71197d9c246dc236c58f2bef92a2a36fd5255433 Apr 23 17:41:24.507771 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:24.507745 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f664q\" (UniqueName: \"kubernetes.io/projected/93857015-7301-4fa2-b52a-008e134f997b-kube-api-access-f664q\") pod \"network-check-target-9p89h\" (UID: 
\"93857015-7301-4fa2-b52a-008e134f997b\") " pod="openshift-network-diagnostics/network-check-target-9p89h" Apr 23 17:41:24.507900 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:24.507879 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:41:24.508009 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:24.507906 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:41:24.508009 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:24.507919 2581 projected.go:194] Error preparing data for projected volume kube-api-access-f664q for pod openshift-network-diagnostics/network-check-target-9p89h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:41:24.508125 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:24.507975 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/93857015-7301-4fa2-b52a-008e134f997b-kube-api-access-f664q podName:93857015-7301-4fa2-b52a-008e134f997b nodeName:}" failed. No retries permitted until 2026-04-23 17:41:25.507956544 +0000 UTC m=+4.257398337 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-f664q" (UniqueName: "kubernetes.io/projected/93857015-7301-4fa2-b52a-008e134f997b-kube-api-access-f664q") pod "network-check-target-9p89h" (UID: "93857015-7301-4fa2-b52a-008e134f997b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:41:24.735434 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:24.735342 2581 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 17:36:22 +0000 UTC" deadline="2028-02-01 22:52:53.656502022 +0000 UTC" Apr 23 17:41:24.735434 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:24.735381 2581 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15581h11m28.921124988s" Apr 23 17:41:24.800510 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:24.800151 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-218.ec2.internal" event={"ID":"0686775b0acf60aa77b819482efe0489","Type":"ContainerStarted","Data":"28bbbf30bd1839efef42a73ae913c14ed43b49bf4ee6af2267e8bab60b83bce6"} Apr 23 17:41:24.804192 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:24.804110 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9c58k" event={"ID":"28f30ab5-30f5-42ed-881e-4af2ffff85dc","Type":"ContainerStarted","Data":"588496d207f1d55eeaac7f260ddc53fdbb59cd5056030d2ae6ae45761ba79c6f"} Apr 23 17:41:24.806399 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:24.806367 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-w6l25" event={"ID":"fa6065cc-b8a5-43b6-b34c-abd80570d3dc","Type":"ContainerStarted","Data":"163e1d9b019ce886f2b656e5391c5da44b822cd4a4b3d9d21d0025c412391841"} Apr 23 17:41:24.809476 ip-10-0-143-218 kubenswrapper[2581]: I0423 
17:41:24.809447 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ltb6d" event={"ID":"4f98091f-8e93-41c5-8955-8d9bbf57c3bf","Type":"ContainerStarted","Data":"16307d64df350d97d570076bbb5b3cfe5739e550d8a77a5b84f822c7a21f6bee"}
Apr 23 17:41:24.811240 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:24.811215 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-wnkrl" event={"ID":"0c4d96e3-f6a9-46d8-b34c-7b464f1e6ebf","Type":"ContainerStarted","Data":"9e777952a1360dee7a17945b71197d9c246dc236c58f2bef92a2a36fd5255433"}
Apr 23 17:41:24.818284 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:24.818254 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xcnjl" event={"ID":"cc5f4800-7f3a-482a-8059-189a6072df5a","Type":"ContainerStarted","Data":"926198ed3e4a4f926e0a20ca3d1a13e0a252362eaae9c749858965db9f493eca"}
Apr 23 17:41:24.818572 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:24.818412 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-218.ec2.internal" podStartSLOduration=1.8183989390000002 podStartE2EDuration="1.818398939s" podCreationTimestamp="2026-04-23 17:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:41:24.817397739 +0000 UTC m=+3.566839543" watchObservedRunningTime="2026-04-23 17:41:24.818398939 +0000 UTC m=+3.567840739"
Apr 23 17:41:24.832312 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:24.832260 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mtx92" event={"ID":"8e736d0c-df1a-43ad-8265-6b52f1993b03","Type":"ContainerStarted","Data":"9ccf60c83e4f14c1c7050a82a327cb2984256d8516fc55017bf91e8f4351458a"}
Apr 23 17:41:24.844615 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:24.844506 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-6lc2b" event={"ID":"600448d4-6de5-4739-b9f6-281d9613622c","Type":"ContainerStarted","Data":"c7c06f1fb795fb0956042efe68efcf33dda13cbd7af455e4bf1a805de38782e9"}
Apr 23 17:41:24.850735 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:24.850697 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" event={"ID":"7a841a79-6082-4722-b8d8-9b6626187a1f","Type":"ContainerStarted","Data":"f05b8adac5a73c639392cf6760b041d05907f106d20aaef9f9a909c75fb775c2"}
Apr 23 17:41:24.854939 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:24.854900 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tr4s7" event={"ID":"dc9e19bd-f4ca-4fa7-8442-d784513fd798","Type":"ContainerStarted","Data":"8e699a0c5d422089c6ade1487fbf4f31caa912883bfac83107f49fe0c460dedb"}
Apr 23 17:41:25.314520 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:25.314485 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34a08e1b-c531-4efa-80dc-60379820b460-metrics-certs\") pod \"network-metrics-daemon-dtrg9\" (UID: \"34a08e1b-c531-4efa-80dc-60379820b460\") " pod="openshift-multus/network-metrics-daemon-dtrg9"
Apr 23 17:41:25.314681 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:25.314663 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:41:25.314752 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:25.314729 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34a08e1b-c531-4efa-80dc-60379820b460-metrics-certs podName:34a08e1b-c531-4efa-80dc-60379820b460 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:27.314710293 +0000 UTC m=+6.064152085 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/34a08e1b-c531-4efa-80dc-60379820b460-metrics-certs") pod "network-metrics-daemon-dtrg9" (UID: "34a08e1b-c531-4efa-80dc-60379820b460") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:41:25.515586 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:25.515501 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f664q\" (UniqueName: \"kubernetes.io/projected/93857015-7301-4fa2-b52a-008e134f997b-kube-api-access-f664q\") pod \"network-check-target-9p89h\" (UID: \"93857015-7301-4fa2-b52a-008e134f997b\") " pod="openshift-network-diagnostics/network-check-target-9p89h"
Apr 23 17:41:25.515745 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:25.515705 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 17:41:25.515745 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:25.515726 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 17:41:25.515745 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:25.515738 2581 projected.go:194] Error preparing data for projected volume kube-api-access-f664q for pod openshift-network-diagnostics/network-check-target-9p89h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:41:25.515897 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:25.515795 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/93857015-7301-4fa2-b52a-008e134f997b-kube-api-access-f664q podName:93857015-7301-4fa2-b52a-008e134f997b nodeName:}" failed. No retries permitted until 2026-04-23 17:41:27.515777199 +0000 UTC m=+6.265218993 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-f664q" (UniqueName: "kubernetes.io/projected/93857015-7301-4fa2-b52a-008e134f997b-kube-api-access-f664q") pod "network-check-target-9p89h" (UID: "93857015-7301-4fa2-b52a-008e134f997b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:41:25.788852 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:25.788373 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dtrg9"
Apr 23 17:41:25.788852 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:25.788471 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dtrg9" podUID="34a08e1b-c531-4efa-80dc-60379820b460"
Apr 23 17:41:25.788852 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:25.788514 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9p89h"
Apr 23 17:41:25.788852 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:25.788610 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9p89h" podUID="93857015-7301-4fa2-b52a-008e134f997b"
Apr 23 17:41:25.900421 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:25.900382 2581 generic.go:358] "Generic (PLEG): container finished" podID="4ad0c906f995d60a9073fbd94787bfd9" containerID="fb5a009b85684628fe0bb08686af08e5220b22ffaad25f4b772dea4e4884341b" exitCode=0
Apr 23 17:41:25.901363 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:25.901337 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-218.ec2.internal" event={"ID":"4ad0c906f995d60a9073fbd94787bfd9","Type":"ContainerDied","Data":"fb5a009b85684628fe0bb08686af08e5220b22ffaad25f4b772dea4e4884341b"}
Apr 23 17:41:26.907009 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:26.906421 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-218.ec2.internal" event={"ID":"4ad0c906f995d60a9073fbd94787bfd9","Type":"ContainerStarted","Data":"ac2138c5ac00cb888681fc4761168d167096c05638c5c848ad39d398f414ad8f"}
Apr 23 17:41:27.332040 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:27.331248 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34a08e1b-c531-4efa-80dc-60379820b460-metrics-certs\") pod \"network-metrics-daemon-dtrg9\" (UID: \"34a08e1b-c531-4efa-80dc-60379820b460\") " pod="openshift-multus/network-metrics-daemon-dtrg9"
Apr 23 17:41:27.332040 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:27.331551 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:41:27.332040 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:27.331635 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34a08e1b-c531-4efa-80dc-60379820b460-metrics-certs podName:34a08e1b-c531-4efa-80dc-60379820b460 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:31.33161569 +0000 UTC m=+10.081057475 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/34a08e1b-c531-4efa-80dc-60379820b460-metrics-certs") pod "network-metrics-daemon-dtrg9" (UID: "34a08e1b-c531-4efa-80dc-60379820b460") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:41:27.533627 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:27.533546 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f664q\" (UniqueName: \"kubernetes.io/projected/93857015-7301-4fa2-b52a-008e134f997b-kube-api-access-f664q\") pod \"network-check-target-9p89h\" (UID: \"93857015-7301-4fa2-b52a-008e134f997b\") " pod="openshift-network-diagnostics/network-check-target-9p89h"
Apr 23 17:41:27.533780 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:27.533762 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 17:41:27.533817 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:27.533790 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 17:41:27.533817 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:27.533804 2581 projected.go:194] Error preparing data for projected volume kube-api-access-f664q for pod openshift-network-diagnostics/network-check-target-9p89h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:41:27.533883 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:27.533866 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/93857015-7301-4fa2-b52a-008e134f997b-kube-api-access-f664q podName:93857015-7301-4fa2-b52a-008e134f997b nodeName:}" failed. No retries permitted until 2026-04-23 17:41:31.533846178 +0000 UTC m=+10.283287982 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-f664q" (UniqueName: "kubernetes.io/projected/93857015-7301-4fa2-b52a-008e134f997b-kube-api-access-f664q") pod "network-check-target-9p89h" (UID: "93857015-7301-4fa2-b52a-008e134f997b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:41:27.786674 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:27.786431 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dtrg9"
Apr 23 17:41:27.786674 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:27.786543 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dtrg9" podUID="34a08e1b-c531-4efa-80dc-60379820b460"
Apr 23 17:41:27.787017 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:27.786960 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9p89h"
Apr 23 17:41:27.787124 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:27.787101 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9p89h" podUID="93857015-7301-4fa2-b52a-008e134f997b"
Apr 23 17:41:29.788405 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:29.787675 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dtrg9"
Apr 23 17:41:29.788405 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:29.787817 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dtrg9" podUID="34a08e1b-c531-4efa-80dc-60379820b460"
Apr 23 17:41:29.788405 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:29.788279 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9p89h"
Apr 23 17:41:29.788405 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:29.788362 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9p89h" podUID="93857015-7301-4fa2-b52a-008e134f997b"
Apr 23 17:41:31.368928 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:31.368435 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34a08e1b-c531-4efa-80dc-60379820b460-metrics-certs\") pod \"network-metrics-daemon-dtrg9\" (UID: \"34a08e1b-c531-4efa-80dc-60379820b460\") " pod="openshift-multus/network-metrics-daemon-dtrg9"
Apr 23 17:41:31.368928 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:31.368566 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:41:31.368928 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:31.368643 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34a08e1b-c531-4efa-80dc-60379820b460-metrics-certs podName:34a08e1b-c531-4efa-80dc-60379820b460 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:39.368625967 +0000 UTC m=+18.118067746 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/34a08e1b-c531-4efa-80dc-60379820b460-metrics-certs") pod "network-metrics-daemon-dtrg9" (UID: "34a08e1b-c531-4efa-80dc-60379820b460") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:41:31.570502 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:31.570471 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f664q\" (UniqueName: \"kubernetes.io/projected/93857015-7301-4fa2-b52a-008e134f997b-kube-api-access-f664q\") pod \"network-check-target-9p89h\" (UID: \"93857015-7301-4fa2-b52a-008e134f997b\") " pod="openshift-network-diagnostics/network-check-target-9p89h"
Apr 23 17:41:31.570691 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:31.570624 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 17:41:31.570691 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:31.570644 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 17:41:31.570691 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:31.570655 2581 projected.go:194] Error preparing data for projected volume kube-api-access-f664q for pod openshift-network-diagnostics/network-check-target-9p89h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:41:31.570851 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:31.570712 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/93857015-7301-4fa2-b52a-008e134f997b-kube-api-access-f664q podName:93857015-7301-4fa2-b52a-008e134f997b nodeName:}" failed. No retries permitted until 2026-04-23 17:41:39.570690998 +0000 UTC m=+18.320132784 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-f664q" (UniqueName: "kubernetes.io/projected/93857015-7301-4fa2-b52a-008e134f997b-kube-api-access-f664q") pod "network-check-target-9p89h" (UID: "93857015-7301-4fa2-b52a-008e134f997b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:41:31.787845 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:31.787114 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dtrg9"
Apr 23 17:41:31.788623 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:31.788232 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dtrg9" podUID="34a08e1b-c531-4efa-80dc-60379820b460"
Apr 23 17:41:31.788623 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:31.788332 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9p89h"
Apr 23 17:41:31.788623 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:31.788584 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9p89h" podUID="93857015-7301-4fa2-b52a-008e134f997b"
Apr 23 17:41:33.785946 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:33.785920 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9p89h"
Apr 23 17:41:33.786311 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:33.786069 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9p89h" podUID="93857015-7301-4fa2-b52a-008e134f997b"
Apr 23 17:41:33.786311 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:33.785923 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dtrg9"
Apr 23 17:41:33.786311 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:33.786304 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dtrg9" podUID="34a08e1b-c531-4efa-80dc-60379820b460"
Apr 23 17:41:35.789186 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:35.789151 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9p89h"
Apr 23 17:41:35.789596 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:35.789164 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dtrg9"
Apr 23 17:41:35.789596 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:35.789262 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9p89h" podUID="93857015-7301-4fa2-b52a-008e134f997b"
Apr 23 17:41:35.789596 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:35.789376 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dtrg9" podUID="34a08e1b-c531-4efa-80dc-60379820b460"
Apr 23 17:41:37.789096 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:37.789058 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dtrg9"
Apr 23 17:41:37.789497 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:37.789058 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9p89h"
Apr 23 17:41:37.789497 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:37.789190 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dtrg9" podUID="34a08e1b-c531-4efa-80dc-60379820b460"
Apr 23 17:41:37.789497 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:37.789219 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9p89h" podUID="93857015-7301-4fa2-b52a-008e134f997b"
Apr 23 17:41:39.426972 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:39.426933 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34a08e1b-c531-4efa-80dc-60379820b460-metrics-certs\") pod \"network-metrics-daemon-dtrg9\" (UID: \"34a08e1b-c531-4efa-80dc-60379820b460\") " pod="openshift-multus/network-metrics-daemon-dtrg9"
Apr 23 17:41:39.427422 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:39.427106 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:41:39.427422 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:39.427182 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34a08e1b-c531-4efa-80dc-60379820b460-metrics-certs podName:34a08e1b-c531-4efa-80dc-60379820b460 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:55.427161061 +0000 UTC m=+34.176602840 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/34a08e1b-c531-4efa-80dc-60379820b460-metrics-certs") pod "network-metrics-daemon-dtrg9" (UID: "34a08e1b-c531-4efa-80dc-60379820b460") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:41:39.628343 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:39.628304 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f664q\" (UniqueName: \"kubernetes.io/projected/93857015-7301-4fa2-b52a-008e134f997b-kube-api-access-f664q\") pod \"network-check-target-9p89h\" (UID: \"93857015-7301-4fa2-b52a-008e134f997b\") " pod="openshift-network-diagnostics/network-check-target-9p89h"
Apr 23 17:41:39.628591 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:39.628443 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 17:41:39.628591 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:39.628467 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 17:41:39.628591 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:39.628482 2581 projected.go:194] Error preparing data for projected volume kube-api-access-f664q for pod openshift-network-diagnostics/network-check-target-9p89h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:41:39.628591 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:39.628544 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/93857015-7301-4fa2-b52a-008e134f997b-kube-api-access-f664q podName:93857015-7301-4fa2-b52a-008e134f997b nodeName:}" failed. No retries permitted until 2026-04-23 17:41:55.628529268 +0000 UTC m=+34.377971053 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-f664q" (UniqueName: "kubernetes.io/projected/93857015-7301-4fa2-b52a-008e134f997b-kube-api-access-f664q") pod "network-check-target-9p89h" (UID: "93857015-7301-4fa2-b52a-008e134f997b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:41:39.786654 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:39.786582 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dtrg9"
Apr 23 17:41:39.786800 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:39.786721 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dtrg9" podUID="34a08e1b-c531-4efa-80dc-60379820b460"
Apr 23 17:41:39.786800 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:39.786784 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9p89h"
Apr 23 17:41:39.786931 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:39.786876 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9p89h" podUID="93857015-7301-4fa2-b52a-008e134f997b"
Apr 23 17:41:41.789668 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:41.789643 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dtrg9"
Apr 23 17:41:41.790114 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:41.789711 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9p89h"
Apr 23 17:41:41.790114 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:41.789804 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9p89h" podUID="93857015-7301-4fa2-b52a-008e134f997b"
Apr 23 17:41:41.790114 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:41.789938 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dtrg9" podUID="34a08e1b-c531-4efa-80dc-60379820b460"
Apr 23 17:41:41.931789 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:41.931742 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xcnjl" event={"ID":"cc5f4800-7f3a-482a-8059-189a6072df5a","Type":"ContainerStarted","Data":"56e879a24387a3db0d878d1f5cb451c67e73b213aae2029845b6283cefd45e65"}
Apr 23 17:41:41.933166 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:41.933133 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mtx92" event={"ID":"8e736d0c-df1a-43ad-8265-6b52f1993b03","Type":"ContainerStarted","Data":"8e0eb4eca58eb98be08bfc5d4adf5bdcd7e28a731b0e52b3e66f1eb117e4e685"}
Apr 23 17:41:41.934494 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:41.934464 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-6lc2b" event={"ID":"600448d4-6de5-4739-b9f6-281d9613622c","Type":"ContainerStarted","Data":"995db5dffdcb47caa2c0d9cec3b2cb20a4f8b760a4ca38f39536b4572fd7fbfe"}
Apr 23 17:41:41.935850 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:41.935828 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tr4s7" event={"ID":"dc9e19bd-f4ca-4fa7-8442-d784513fd798","Type":"ContainerStarted","Data":"d17342afa29b0dd2f3130e1ef2528bc23c8fc97fdd2c3c986a4b12096efd3b1e"}
Apr 23 17:41:41.937177 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:41.937153 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9c58k" event={"ID":"28f30ab5-30f5-42ed-881e-4af2ffff85dc","Type":"ContainerStarted","Data":"42bd97d8bcd27757166e17434e78537c1bc3d2a93b0005051cdecddb5f72ff3a"}
Apr 23 17:41:41.938381 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:41.938356 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-w6l25" event={"ID":"fa6065cc-b8a5-43b6-b34c-abd80570d3dc","Type":"ContainerStarted","Data":"7462288cd5451089dee579a5d2a91b544043eab6a221dc688aae3a1717993428"}
Apr 23 17:41:41.939622 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:41.939593 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ltb6d" event={"ID":"4f98091f-8e93-41c5-8955-8d9bbf57c3bf","Type":"ContainerStarted","Data":"eb7019ae9203cb8b2a938a982d01540cc0b1b087b94ab9c35e0e6044b1ca7c4b"}
Apr 23 17:41:41.953225 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:41.953189 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-218.ec2.internal" podStartSLOduration=18.953178119 podStartE2EDuration="18.953178119s" podCreationTimestamp="2026-04-23 17:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:41:26.924568652 +0000 UTC m=+5.674010453" watchObservedRunningTime="2026-04-23 17:41:41.953178119 +0000 UTC m=+20.702619920"
Apr 23 17:41:41.965412 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:41.965375 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-6lc2b" podStartSLOduration=2.884030615 podStartE2EDuration="19.965361212s" podCreationTimestamp="2026-04-23 17:41:22 +0000 UTC" firstStartedPulling="2026-04-23 17:41:24.418273032 +0000 UTC m=+3.167714810" lastFinishedPulling="2026-04-23 17:41:41.499603621 +0000 UTC m=+20.249045407" observedRunningTime="2026-04-23 17:41:41.965272172 +0000 UTC m=+20.714713973" watchObservedRunningTime="2026-04-23 17:41:41.965361212 +0000 UTC m=+20.714803013"
Apr 23 17:41:41.992297 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:41.992208 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-w6l25" podStartSLOduration=3.910920928 podStartE2EDuration="20.992189892s" podCreationTimestamp="2026-04-23 17:41:21 +0000 UTC" firstStartedPulling="2026-04-23 17:41:24.416650477 +0000 UTC m=+3.166092257" lastFinishedPulling="2026-04-23 17:41:41.497919438 +0000 UTC m=+20.247361221" observedRunningTime="2026-04-23 17:41:41.979076314 +0000 UTC m=+20.728518114" watchObservedRunningTime="2026-04-23 17:41:41.992189892 +0000 UTC m=+20.741631693"
Apr 23 17:41:41.992565 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:41.992528 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-mtx92" podStartSLOduration=3.91396431 podStartE2EDuration="20.992518189s" podCreationTimestamp="2026-04-23 17:41:21 +0000 UTC" firstStartedPulling="2026-04-23 17:41:24.419497764 +0000 UTC m=+3.168939560" lastFinishedPulling="2026-04-23 17:41:41.498051657 +0000 UTC m=+20.247493439" observedRunningTime="2026-04-23 17:41:41.991838019 +0000 UTC m=+20.741279951" watchObservedRunningTime="2026-04-23 17:41:41.992518189 +0000 UTC m=+20.741959990"
Apr 23 17:41:42.010305 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:42.010257 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-9c58k" podStartSLOduration=2.647131123 podStartE2EDuration="20.010244525s" podCreationTimestamp="2026-04-23 17:41:22 +0000 UTC" firstStartedPulling="2026-04-23 17:41:24.425360053 +0000 UTC m=+3.174801848" lastFinishedPulling="2026-04-23 17:41:41.788473459 +0000 UTC m=+20.537915250" observedRunningTime="2026-04-23 17:41:42.009978443 +0000 UTC m=+20.759420245" watchObservedRunningTime="2026-04-23 17:41:42.010244525 +0000 UTC m=+20.759686325"
Apr 23 17:41:42.027792 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:42.027386 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-ltb6d" podStartSLOduration=2.954802722 podStartE2EDuration="20.027368181s" podCreationTimestamp="2026-04-23 17:41:22 +0000 UTC" firstStartedPulling="2026-04-23 17:41:24.425321713 +0000 UTC m=+3.174763493" lastFinishedPulling="2026-04-23 17:41:41.497887159 +0000 UTC m=+20.247328952" observedRunningTime="2026-04-23 17:41:42.027134433 +0000 UTC m=+20.776576234" watchObservedRunningTime="2026-04-23 17:41:42.027368181 +0000 UTC m=+20.776809983"
Apr 23 17:41:42.944397 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:42.943931 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" event={"ID":"7a841a79-6082-4722-b8d8-9b6626187a1f","Type":"ContainerStarted","Data":"a7a5fc288abb55fdc89f9ff4e829943692381dc852d43e6e8874c0f786676411"}
Apr 23 17:41:42.945081 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:42.944416 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" event={"ID":"7a841a79-6082-4722-b8d8-9b6626187a1f","Type":"ContainerStarted","Data":"52e7207bdf93b5d5c8d961fa63a8153ae0a7228a7a78c730dce2cbe300686b09"}
Apr 23 17:41:42.945081 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:42.944439 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" event={"ID":"7a841a79-6082-4722-b8d8-9b6626187a1f","Type":"ContainerStarted","Data":"3b05a915d3168676dc2b34dd7fb2f710cf52230e1ab4775dcd47d7449288f212"}
Apr 23 17:41:42.945458 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:42.945430 2581 generic.go:358] "Generic (PLEG): container finished" podID="cc5f4800-7f3a-482a-8059-189a6072df5a" containerID="56e879a24387a3db0d878d1f5cb451c67e73b213aae2029845b6283cefd45e65" exitCode=0
Apr 23 17:41:42.945613 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:42.945576 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xcnjl" event={"ID":"cc5f4800-7f3a-482a-8059-189a6072df5a","Type":"ContainerDied","Data":"56e879a24387a3db0d878d1f5cb451c67e73b213aae2029845b6283cefd45e65"}
Apr 23 17:41:42.966271 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:42.966251 2581 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 23 17:41:43.763651 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:43.763554 2581 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T17:41:42.966266885Z","UUID":"741a7deb-2e97-4ff2-b6c0-76d3e781e1be","Handler":null,"Name":"","Endpoint":""}
Apr 23 17:41:43.767189 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:43.767158 2581 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 23 17:41:43.767189 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:43.767189 2581 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 23 17:41:43.787916 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:43.787887 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dtrg9"
Apr 23 17:41:43.788055 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:43.788001 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dtrg9" podUID="34a08e1b-c531-4efa-80dc-60379820b460"
Apr 23 17:41:43.788275 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:43.788250 2581 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9p89h" Apr 23 17:41:43.788357 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:43.788336 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9p89h" podUID="93857015-7301-4fa2-b52a-008e134f997b" Apr 23 17:41:43.951452 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:43.951420 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" event={"ID":"7a841a79-6082-4722-b8d8-9b6626187a1f","Type":"ContainerStarted","Data":"d0e2f44866861c8fc30b840f03da7f3260e70d44d70b75d066daa8a3663694e8"} Apr 23 17:41:43.951819 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:43.951462 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" event={"ID":"7a841a79-6082-4722-b8d8-9b6626187a1f","Type":"ContainerStarted","Data":"42569fe847c39b044c3ae9de28ad0eab94dd4194349399d74899e0398247ae44"} Apr 23 17:41:43.951819 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:43.951475 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" event={"ID":"7a841a79-6082-4722-b8d8-9b6626187a1f","Type":"ContainerStarted","Data":"0289a0cb93a7b06e3ae540166e91e815d3e6721ec63e153560ed43bdb13571b7"} Apr 23 17:41:43.953341 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:43.953312 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tr4s7" event={"ID":"dc9e19bd-f4ca-4fa7-8442-d784513fd798","Type":"ContainerStarted","Data":"de3250d5aa8bf1ddbde0d526004d825d88a06e3c1542239f94261ebd0c35211f"} Apr 23 17:41:43.954750 ip-10-0-143-218 
kubenswrapper[2581]: I0423 17:41:43.954725 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-wnkrl" event={"ID":"0c4d96e3-f6a9-46d8-b34c-7b464f1e6ebf","Type":"ContainerStarted","Data":"16c85fe44eae29f698a8d85bc2c34160ab9557e770eec8698244372c4ae330da"} Apr 23 17:41:43.975723 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:43.975683 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-wnkrl" podStartSLOduration=5.903004306 podStartE2EDuration="22.975671352s" podCreationTimestamp="2026-04-23 17:41:21 +0000 UTC" firstStartedPulling="2026-04-23 17:41:24.425310007 +0000 UTC m=+3.174751790" lastFinishedPulling="2026-04-23 17:41:41.497977056 +0000 UTC m=+20.247418836" observedRunningTime="2026-04-23 17:41:43.975475829 +0000 UTC m=+22.724917630" watchObservedRunningTime="2026-04-23 17:41:43.975671352 +0000 UTC m=+22.725113152" Apr 23 17:41:44.958642 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:44.958599 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tr4s7" event={"ID":"dc9e19bd-f4ca-4fa7-8442-d784513fd798","Type":"ContainerStarted","Data":"b2cebf4ab38822a13d0b96791b75c439501c20c3f78ac413363bccca6e9bdfba"} Apr 23 17:41:45.412313 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:45.412276 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-w6l25" Apr 23 17:41:45.412847 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:45.412828 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-w6l25" Apr 23 17:41:45.429139 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:45.429086 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tr4s7" podStartSLOduration=4.890018596 podStartE2EDuration="24.429073825s" 
podCreationTimestamp="2026-04-23 17:41:21 +0000 UTC" firstStartedPulling="2026-04-23 17:41:24.411700766 +0000 UTC m=+3.161142558" lastFinishedPulling="2026-04-23 17:41:43.950755991 +0000 UTC m=+22.700197787" observedRunningTime="2026-04-23 17:41:44.97765503 +0000 UTC m=+23.727096843" watchObservedRunningTime="2026-04-23 17:41:45.429073825 +0000 UTC m=+24.178515680" Apr 23 17:41:45.786471 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:45.786437 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dtrg9" Apr 23 17:41:45.786471 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:45.786456 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9p89h" Apr 23 17:41:45.786687 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:45.786575 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dtrg9" podUID="34a08e1b-c531-4efa-80dc-60379820b460" Apr 23 17:41:45.786741 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:45.786678 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9p89h" podUID="93857015-7301-4fa2-b52a-008e134f997b" Apr 23 17:41:45.963548 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:45.963511 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" event={"ID":"7a841a79-6082-4722-b8d8-9b6626187a1f","Type":"ContainerStarted","Data":"28560c6ac97cfeb72718bd50abe217697649b23bfc30c68a4d5197ed1ad4ac61"} Apr 23 17:41:45.964088 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:45.963909 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-w6l25" Apr 23 17:41:45.964316 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:45.964301 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-w6l25" Apr 23 17:41:47.786172 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:47.785912 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dtrg9" Apr 23 17:41:47.786812 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:47.785948 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9p89h" Apr 23 17:41:47.786812 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:47.786199 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dtrg9" podUID="34a08e1b-c531-4efa-80dc-60379820b460" Apr 23 17:41:47.786812 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:47.786286 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9p89h" podUID="93857015-7301-4fa2-b52a-008e134f997b" Apr 23 17:41:47.968600 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:47.968560 2581 generic.go:358] "Generic (PLEG): container finished" podID="cc5f4800-7f3a-482a-8059-189a6072df5a" containerID="88abfe5bfa6854668d5612ca4046a16212b1d49cf0828ceb8881852a4f749c88" exitCode=0 Apr 23 17:41:47.968764 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:47.968636 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xcnjl" event={"ID":"cc5f4800-7f3a-482a-8059-189a6072df5a","Type":"ContainerDied","Data":"88abfe5bfa6854668d5612ca4046a16212b1d49cf0828ceb8881852a4f749c88"} Apr 23 17:41:47.972043 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:47.972018 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" event={"ID":"7a841a79-6082-4722-b8d8-9b6626187a1f","Type":"ContainerStarted","Data":"18860345da4fd4961e3cdb2440a18d94436c09618e033125d3cbff7e9093f3fd"} Apr 23 17:41:48.024113 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:48.024068 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" podStartSLOduration=8.128337713 podStartE2EDuration="26.024054533s" podCreationTimestamp="2026-04-23 17:41:22 +0000 UTC" firstStartedPulling="2026-04-23 17:41:24.414565191 +0000 UTC m=+3.164006970" lastFinishedPulling="2026-04-23 
17:41:42.310282008 +0000 UTC m=+21.059723790" observedRunningTime="2026-04-23 17:41:48.023664899 +0000 UTC m=+26.773106723" watchObservedRunningTime="2026-04-23 17:41:48.024054533 +0000 UTC m=+26.773496335" Apr 23 17:41:48.976482 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:48.976032 2581 generic.go:358] "Generic (PLEG): container finished" podID="cc5f4800-7f3a-482a-8059-189a6072df5a" containerID="206d9b330bc1539ec1573c3f04ce14efa4270fd5e277d4804b26bdbdd5d6abf0" exitCode=0 Apr 23 17:41:48.976482 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:48.976109 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xcnjl" event={"ID":"cc5f4800-7f3a-482a-8059-189a6072df5a","Type":"ContainerDied","Data":"206d9b330bc1539ec1573c3f04ce14efa4270fd5e277d4804b26bdbdd5d6abf0"} Apr 23 17:41:48.977133 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:48.976830 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:48.977133 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:48.976856 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:48.977133 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:48.976869 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:48.993369 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:48.993345 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:48.993532 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:48.993512 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:41:49.261643 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:49.261539 2581 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["openshift-network-diagnostics/network-check-target-9p89h"] Apr 23 17:41:49.261774 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:49.261657 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9p89h" Apr 23 17:41:49.261774 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:49.261745 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9p89h" podUID="93857015-7301-4fa2-b52a-008e134f997b" Apr 23 17:41:49.264221 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:49.264198 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-dtrg9"] Apr 23 17:41:49.264316 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:49.264299 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dtrg9" Apr 23 17:41:49.264398 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:49.264381 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dtrg9" podUID="34a08e1b-c531-4efa-80dc-60379820b460" Apr 23 17:41:49.979958 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:49.979866 2581 generic.go:358] "Generic (PLEG): container finished" podID="cc5f4800-7f3a-482a-8059-189a6072df5a" containerID="24a34537ada400c644471b55da3efaf3bf60666fa0d931d55300a7cb5c64f15c" exitCode=0 Apr 23 17:41:49.979958 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:49.979938 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xcnjl" event={"ID":"cc5f4800-7f3a-482a-8059-189a6072df5a","Type":"ContainerDied","Data":"24a34537ada400c644471b55da3efaf3bf60666fa0d931d55300a7cb5c64f15c"} Apr 23 17:41:50.786370 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:50.786334 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9p89h" Apr 23 17:41:50.786589 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:50.786337 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dtrg9" Apr 23 17:41:50.786589 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:50.786444 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9p89h" podUID="93857015-7301-4fa2-b52a-008e134f997b" Apr 23 17:41:50.786589 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:50.786548 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dtrg9" podUID="34a08e1b-c531-4efa-80dc-60379820b460" Apr 23 17:41:52.786401 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:52.785944 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9p89h" Apr 23 17:41:52.786401 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:52.786079 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9p89h" podUID="93857015-7301-4fa2-b52a-008e134f997b" Apr 23 17:41:52.786401 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:52.786147 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dtrg9" Apr 23 17:41:52.786401 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:52.786230 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dtrg9" podUID="34a08e1b-c531-4efa-80dc-60379820b460" Apr 23 17:41:54.786167 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:54.786131 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9p89h" Apr 23 17:41:54.786616 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:54.786253 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9p89h" podUID="93857015-7301-4fa2-b52a-008e134f997b" Apr 23 17:41:54.786616 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:54.786297 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dtrg9" Apr 23 17:41:54.786616 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:54.786393 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dtrg9" podUID="34a08e1b-c531-4efa-80dc-60379820b460" Apr 23 17:41:55.442171 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.442093 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34a08e1b-c531-4efa-80dc-60379820b460-metrics-certs\") pod \"network-metrics-daemon-dtrg9\" (UID: \"34a08e1b-c531-4efa-80dc-60379820b460\") " pod="openshift-multus/network-metrics-daemon-dtrg9" Apr 23 17:41:55.442290 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:55.442228 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:41:55.442290 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:55.442284 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34a08e1b-c531-4efa-80dc-60379820b460-metrics-certs podName:34a08e1b-c531-4efa-80dc-60379820b460 nodeName:}" failed. No retries permitted until 2026-04-23 17:42:27.442269547 +0000 UTC m=+66.191711331 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/34a08e1b-c531-4efa-80dc-60379820b460-metrics-certs") pod "network-metrics-daemon-dtrg9" (UID: "34a08e1b-c531-4efa-80dc-60379820b460") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:41:55.628688 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.628660 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-218.ec2.internal" event="NodeReady" Apr 23 17:41:55.628825 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.628777 2581 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 23 17:41:55.643163 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.643140 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f664q\" (UniqueName: \"kubernetes.io/projected/93857015-7301-4fa2-b52a-008e134f997b-kube-api-access-f664q\") pod \"network-check-target-9p89h\" (UID: \"93857015-7301-4fa2-b52a-008e134f997b\") " pod="openshift-network-diagnostics/network-check-target-9p89h" Apr 23 17:41:55.643270 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:55.643249 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:41:55.643270 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:55.643261 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:41:55.643270 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:55.643270 2581 projected.go:194] Error preparing data for projected volume kube-api-access-f664q for pod openshift-network-diagnostics/network-check-target-9p89h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Apr 23 17:41:55.643428 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:55.643310 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/93857015-7301-4fa2-b52a-008e134f997b-kube-api-access-f664q podName:93857015-7301-4fa2-b52a-008e134f997b nodeName:}" failed. No retries permitted until 2026-04-23 17:42:27.643298161 +0000 UTC m=+66.392739939 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-f664q" (UniqueName: "kubernetes.io/projected/93857015-7301-4fa2-b52a-008e134f997b-kube-api-access-f664q") pod "network-check-target-9p89h" (UID: "93857015-7301-4fa2-b52a-008e134f997b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:41:55.666675 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.666646 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-747847cfd6-g5xmr"] Apr 23 17:41:55.683708 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.683684 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-mndcn"] Apr 23 17:41:55.683862 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.683827 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" Apr 23 17:41:55.686493 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.686474 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 23 17:41:55.686680 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.686658 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 23 17:41:55.686779 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.686704 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-9m6lq\"" Apr 23 17:41:55.686956 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.686935 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 23 17:41:55.698430 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.696503 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-mndcn"] Apr 23 17:41:55.698430 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.696931 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-lg22b"] Apr 23 17:41:55.698430 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.696977 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-mndcn" Apr 23 17:41:55.698430 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.697508 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 23 17:41:55.701102 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.701083 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 23 17:41:55.701208 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.701172 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-wmv2k\"" Apr 23 17:41:55.701701 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.701682 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 23 17:41:55.721056 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.721039 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-747847cfd6-g5xmr"] Apr 23 17:41:55.721056 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.721060 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-jw7bs"] Apr 23 17:41:55.721207 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.721171 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-lg22b" Apr 23 17:41:55.724168 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.724140 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 23 17:41:55.724168 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.724159 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-zw6qq\"" Apr 23 17:41:55.724286 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.724142 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 23 17:41:55.739120 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.739102 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lg22b"] Apr 23 17:41:55.739120 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.739122 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jw7bs"] Apr 23 17:41:55.739210 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.739202 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jw7bs" Apr 23 17:41:55.742026 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.741975 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 23 17:41:55.742110 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.742047 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-576jd\"" Apr 23 17:41:55.742110 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.742053 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 23 17:41:55.742110 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.742077 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 23 17:41:55.844713 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.844681 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a8560822-ef3b-47a7-b587-297593afaeeb-image-registry-private-configuration\") pod \"image-registry-747847cfd6-g5xmr\" (UID: \"a8560822-ef3b-47a7-b587-297593afaeeb\") " pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" Apr 23 17:41:55.845041 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.844730 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a8560822-ef3b-47a7-b587-297593afaeeb-installation-pull-secrets\") pod \"image-registry-747847cfd6-g5xmr\" (UID: \"a8560822-ef3b-47a7-b587-297593afaeeb\") " pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" Apr 23 17:41:55.845041 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.844789 2581 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3160553d-dd23-411d-afee-16ff4b1a7ca7-metrics-tls\") pod \"dns-default-lg22b\" (UID: \"3160553d-dd23-411d-afee-16ff4b1a7ca7\") " pod="openshift-dns/dns-default-lg22b" Apr 23 17:41:55.845041 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.844814 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9-cert\") pod \"ingress-canary-jw7bs\" (UID: \"e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9\") " pod="openshift-ingress-canary/ingress-canary-jw7bs" Apr 23 17:41:55.845041 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.844834 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/4c1413a9-cb8c-4052-b6e8-03a3e48b07e4-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-mndcn\" (UID: \"4c1413a9-cb8c-4052-b6e8-03a3e48b07e4\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mndcn" Apr 23 17:41:55.845041 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.844895 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a8560822-ef3b-47a7-b587-297593afaeeb-registry-tls\") pod \"image-registry-747847cfd6-g5xmr\" (UID: \"a8560822-ef3b-47a7-b587-297593afaeeb\") " pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" Apr 23 17:41:55.845041 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.844923 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a8560822-ef3b-47a7-b587-297593afaeeb-trusted-ca\") pod \"image-registry-747847cfd6-g5xmr\" (UID: \"a8560822-ef3b-47a7-b587-297593afaeeb\") " 
pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" Apr 23 17:41:55.845041 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.844947 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3160553d-dd23-411d-afee-16ff4b1a7ca7-tmp-dir\") pod \"dns-default-lg22b\" (UID: \"3160553d-dd23-411d-afee-16ff4b1a7ca7\") " pod="openshift-dns/dns-default-lg22b" Apr 23 17:41:55.845041 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.844972 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxsqt\" (UniqueName: \"kubernetes.io/projected/3160553d-dd23-411d-afee-16ff4b1a7ca7-kube-api-access-pxsqt\") pod \"dns-default-lg22b\" (UID: \"3160553d-dd23-411d-afee-16ff4b1a7ca7\") " pod="openshift-dns/dns-default-lg22b" Apr 23 17:41:55.845041 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.845037 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a8560822-ef3b-47a7-b587-297593afaeeb-registry-certificates\") pod \"image-registry-747847cfd6-g5xmr\" (UID: \"a8560822-ef3b-47a7-b587-297593afaeeb\") " pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" Apr 23 17:41:55.845340 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.845053 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5f8z\" (UniqueName: \"kubernetes.io/projected/a8560822-ef3b-47a7-b587-297593afaeeb-kube-api-access-l5f8z\") pod \"image-registry-747847cfd6-g5xmr\" (UID: \"a8560822-ef3b-47a7-b587-297593afaeeb\") " pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" Apr 23 17:41:55.845340 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.845074 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/3160553d-dd23-411d-afee-16ff4b1a7ca7-config-volume\") pod \"dns-default-lg22b\" (UID: \"3160553d-dd23-411d-afee-16ff4b1a7ca7\") " pod="openshift-dns/dns-default-lg22b" Apr 23 17:41:55.845340 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.845090 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a8560822-ef3b-47a7-b587-297593afaeeb-ca-trust-extracted\") pod \"image-registry-747847cfd6-g5xmr\" (UID: \"a8560822-ef3b-47a7-b587-297593afaeeb\") " pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" Apr 23 17:41:55.845340 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.845103 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a8560822-ef3b-47a7-b587-297593afaeeb-bound-sa-token\") pod \"image-registry-747847cfd6-g5xmr\" (UID: \"a8560822-ef3b-47a7-b587-297593afaeeb\") " pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" Apr 23 17:41:55.845340 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.845119 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4c1413a9-cb8c-4052-b6e8-03a3e48b07e4-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mndcn\" (UID: \"4c1413a9-cb8c-4052-b6e8-03a3e48b07e4\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mndcn" Apr 23 17:41:55.845340 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.845136 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-546sz\" (UniqueName: \"kubernetes.io/projected/e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9-kube-api-access-546sz\") pod \"ingress-canary-jw7bs\" (UID: 
\"e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9\") " pod="openshift-ingress-canary/ingress-canary-jw7bs" Apr 23 17:41:55.945841 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.945816 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a8560822-ef3b-47a7-b587-297593afaeeb-registry-tls\") pod \"image-registry-747847cfd6-g5xmr\" (UID: \"a8560822-ef3b-47a7-b587-297593afaeeb\") " pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" Apr 23 17:41:55.946014 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.945849 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a8560822-ef3b-47a7-b587-297593afaeeb-trusted-ca\") pod \"image-registry-747847cfd6-g5xmr\" (UID: \"a8560822-ef3b-47a7-b587-297593afaeeb\") " pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" Apr 23 17:41:55.946014 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.945875 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3160553d-dd23-411d-afee-16ff4b1a7ca7-tmp-dir\") pod \"dns-default-lg22b\" (UID: \"3160553d-dd23-411d-afee-16ff4b1a7ca7\") " pod="openshift-dns/dns-default-lg22b" Apr 23 17:41:55.946014 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.945896 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxsqt\" (UniqueName: \"kubernetes.io/projected/3160553d-dd23-411d-afee-16ff4b1a7ca7-kube-api-access-pxsqt\") pod \"dns-default-lg22b\" (UID: \"3160553d-dd23-411d-afee-16ff4b1a7ca7\") " pod="openshift-dns/dns-default-lg22b" Apr 23 17:41:55.946014 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.945921 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/a8560822-ef3b-47a7-b587-297593afaeeb-registry-certificates\") pod \"image-registry-747847cfd6-g5xmr\" (UID: \"a8560822-ef3b-47a7-b587-297593afaeeb\") " pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" Apr 23 17:41:55.946014 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.945938 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l5f8z\" (UniqueName: \"kubernetes.io/projected/a8560822-ef3b-47a7-b587-297593afaeeb-kube-api-access-l5f8z\") pod \"image-registry-747847cfd6-g5xmr\" (UID: \"a8560822-ef3b-47a7-b587-297593afaeeb\") " pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" Apr 23 17:41:55.946014 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.945961 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3160553d-dd23-411d-afee-16ff4b1a7ca7-config-volume\") pod \"dns-default-lg22b\" (UID: \"3160553d-dd23-411d-afee-16ff4b1a7ca7\") " pod="openshift-dns/dns-default-lg22b" Apr 23 17:41:55.946014 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:55.945999 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 17:41:55.946370 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:55.946028 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-747847cfd6-g5xmr: secret "image-registry-tls" not found Apr 23 17:41:55.946370 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:55.946078 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a8560822-ef3b-47a7-b587-297593afaeeb-registry-tls podName:a8560822-ef3b-47a7-b587-297593afaeeb nodeName:}" failed. No retries permitted until 2026-04-23 17:41:56.446062823 +0000 UTC m=+35.195504603 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a8560822-ef3b-47a7-b587-297593afaeeb-registry-tls") pod "image-registry-747847cfd6-g5xmr" (UID: "a8560822-ef3b-47a7-b587-297593afaeeb") : secret "image-registry-tls" not found Apr 23 17:41:55.946370 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.946009 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a8560822-ef3b-47a7-b587-297593afaeeb-ca-trust-extracted\") pod \"image-registry-747847cfd6-g5xmr\" (UID: \"a8560822-ef3b-47a7-b587-297593afaeeb\") " pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" Apr 23 17:41:55.946370 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.946119 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a8560822-ef3b-47a7-b587-297593afaeeb-bound-sa-token\") pod \"image-registry-747847cfd6-g5xmr\" (UID: \"a8560822-ef3b-47a7-b587-297593afaeeb\") " pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" Apr 23 17:41:55.946370 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.946146 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4c1413a9-cb8c-4052-b6e8-03a3e48b07e4-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mndcn\" (UID: \"4c1413a9-cb8c-4052-b6e8-03a3e48b07e4\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mndcn" Apr 23 17:41:55.946370 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.946167 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-546sz\" (UniqueName: \"kubernetes.io/projected/e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9-kube-api-access-546sz\") pod \"ingress-canary-jw7bs\" (UID: \"e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9\") " 
pod="openshift-ingress-canary/ingress-canary-jw7bs" Apr 23 17:41:55.946370 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.946204 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a8560822-ef3b-47a7-b587-297593afaeeb-image-registry-private-configuration\") pod \"image-registry-747847cfd6-g5xmr\" (UID: \"a8560822-ef3b-47a7-b587-297593afaeeb\") " pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" Apr 23 17:41:55.946370 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.946226 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a8560822-ef3b-47a7-b587-297593afaeeb-installation-pull-secrets\") pod \"image-registry-747847cfd6-g5xmr\" (UID: \"a8560822-ef3b-47a7-b587-297593afaeeb\") " pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" Apr 23 17:41:55.946370 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.946254 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3160553d-dd23-411d-afee-16ff4b1a7ca7-metrics-tls\") pod \"dns-default-lg22b\" (UID: \"3160553d-dd23-411d-afee-16ff4b1a7ca7\") " pod="openshift-dns/dns-default-lg22b" Apr 23 17:41:55.946370 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.946278 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9-cert\") pod \"ingress-canary-jw7bs\" (UID: \"e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9\") " pod="openshift-ingress-canary/ingress-canary-jw7bs" Apr 23 17:41:55.946370 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.946298 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/4c1413a9-cb8c-4052-b6e8-03a3e48b07e4-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-mndcn\" (UID: \"4c1413a9-cb8c-4052-b6e8-03a3e48b07e4\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mndcn" Apr 23 17:41:55.946878 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:55.946451 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:41:55.946878 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:55.946496 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9-cert podName:e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:56.44648152 +0000 UTC m=+35.195923302 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9-cert") pod "ingress-canary-jw7bs" (UID: "e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9") : secret "canary-serving-cert" not found Apr 23 17:41:55.946878 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.946588 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a8560822-ef3b-47a7-b587-297593afaeeb-ca-trust-extracted\") pod \"image-registry-747847cfd6-g5xmr\" (UID: \"a8560822-ef3b-47a7-b587-297593afaeeb\") " pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" Apr 23 17:41:55.946878 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.946589 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a8560822-ef3b-47a7-b587-297593afaeeb-registry-certificates\") pod \"image-registry-747847cfd6-g5xmr\" (UID: \"a8560822-ef3b-47a7-b587-297593afaeeb\") " pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" Apr 23 17:41:55.946878 ip-10-0-143-218 
kubenswrapper[2581]: I0423 17:41:55.946662 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3160553d-dd23-411d-afee-16ff4b1a7ca7-tmp-dir\") pod \"dns-default-lg22b\" (UID: \"3160553d-dd23-411d-afee-16ff4b1a7ca7\") " pod="openshift-dns/dns-default-lg22b" Apr 23 17:41:55.946878 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.946875 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a8560822-ef3b-47a7-b587-297593afaeeb-trusted-ca\") pod \"image-registry-747847cfd6-g5xmr\" (UID: \"a8560822-ef3b-47a7-b587-297593afaeeb\") " pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" Apr 23 17:41:55.947488 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.947181 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3160553d-dd23-411d-afee-16ff4b1a7ca7-config-volume\") pod \"dns-default-lg22b\" (UID: \"3160553d-dd23-411d-afee-16ff4b1a7ca7\") " pod="openshift-dns/dns-default-lg22b" Apr 23 17:41:55.947488 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:55.947241 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:41:55.947488 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:55.947353 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3160553d-dd23-411d-afee-16ff4b1a7ca7-metrics-tls podName:3160553d-dd23-411d-afee-16ff4b1a7ca7 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:56.447333502 +0000 UTC m=+35.196775281 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3160553d-dd23-411d-afee-16ff4b1a7ca7-metrics-tls") pod "dns-default-lg22b" (UID: "3160553d-dd23-411d-afee-16ff4b1a7ca7") : secret "dns-default-metrics-tls" not found Apr 23 17:41:55.947488 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:55.947358 2581 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 23 17:41:55.947488 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:55.947421 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c1413a9-cb8c-4052-b6e8-03a3e48b07e4-networking-console-plugin-cert podName:4c1413a9-cb8c-4052-b6e8-03a3e48b07e4 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:56.447404249 +0000 UTC m=+35.196846050 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/4c1413a9-cb8c-4052-b6e8-03a3e48b07e4-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-mndcn" (UID: "4c1413a9-cb8c-4052-b6e8-03a3e48b07e4") : secret "networking-console-plugin-cert" not found Apr 23 17:41:55.948494 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.948428 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/4c1413a9-cb8c-4052-b6e8-03a3e48b07e4-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-mndcn\" (UID: \"4c1413a9-cb8c-4052-b6e8-03a3e48b07e4\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mndcn" Apr 23 17:41:55.951256 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.951232 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a8560822-ef3b-47a7-b587-297593afaeeb-image-registry-private-configuration\") pod 
\"image-registry-747847cfd6-g5xmr\" (UID: \"a8560822-ef3b-47a7-b587-297593afaeeb\") " pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" Apr 23 17:41:55.951345 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.951233 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a8560822-ef3b-47a7-b587-297593afaeeb-installation-pull-secrets\") pod \"image-registry-747847cfd6-g5xmr\" (UID: \"a8560822-ef3b-47a7-b587-297593afaeeb\") " pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" Apr 23 17:41:55.955598 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.955576 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxsqt\" (UniqueName: \"kubernetes.io/projected/3160553d-dd23-411d-afee-16ff4b1a7ca7-kube-api-access-pxsqt\") pod \"dns-default-lg22b\" (UID: \"3160553d-dd23-411d-afee-16ff4b1a7ca7\") " pod="openshift-dns/dns-default-lg22b" Apr 23 17:41:55.956461 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.956430 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a8560822-ef3b-47a7-b587-297593afaeeb-bound-sa-token\") pod \"image-registry-747847cfd6-g5xmr\" (UID: \"a8560822-ef3b-47a7-b587-297593afaeeb\") " pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" Apr 23 17:41:55.956461 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.956446 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-546sz\" (UniqueName: \"kubernetes.io/projected/e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9-kube-api-access-546sz\") pod \"ingress-canary-jw7bs\" (UID: \"e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9\") " pod="openshift-ingress-canary/ingress-canary-jw7bs" Apr 23 17:41:55.956593 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.956552 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5f8z\" 
(UniqueName: \"kubernetes.io/projected/a8560822-ef3b-47a7-b587-297593afaeeb-kube-api-access-l5f8z\") pod \"image-registry-747847cfd6-g5xmr\" (UID: \"a8560822-ef3b-47a7-b587-297593afaeeb\") " pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" Apr 23 17:41:55.993898 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:55.993871 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xcnjl" event={"ID":"cc5f4800-7f3a-482a-8059-189a6072df5a","Type":"ContainerStarted","Data":"bf7153e51f564831711cbb55ea6ed45bd9be159825d846ef5fc776810ee98916"} Apr 23 17:41:56.450535 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:56.450501 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4c1413a9-cb8c-4052-b6e8-03a3e48b07e4-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mndcn\" (UID: \"4c1413a9-cb8c-4052-b6e8-03a3e48b07e4\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mndcn" Apr 23 17:41:56.450676 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:56.450561 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3160553d-dd23-411d-afee-16ff4b1a7ca7-metrics-tls\") pod \"dns-default-lg22b\" (UID: \"3160553d-dd23-411d-afee-16ff4b1a7ca7\") " pod="openshift-dns/dns-default-lg22b" Apr 23 17:41:56.450676 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:56.450593 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9-cert\") pod \"ingress-canary-jw7bs\" (UID: \"e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9\") " pod="openshift-ingress-canary/ingress-canary-jw7bs" Apr 23 17:41:56.450676 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:56.450637 2581 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a8560822-ef3b-47a7-b587-297593afaeeb-registry-tls\") pod \"image-registry-747847cfd6-g5xmr\" (UID: \"a8560822-ef3b-47a7-b587-297593afaeeb\") " pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" Apr 23 17:41:56.450676 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:56.450656 2581 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 23 17:41:56.450829 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:56.450723 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:41:56.450829 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:56.450732 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:41:56.450829 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:56.450748 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 17:41:56.450829 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:56.450760 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-747847cfd6-g5xmr: secret "image-registry-tls" not found Apr 23 17:41:56.450829 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:56.450728 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c1413a9-cb8c-4052-b6e8-03a3e48b07e4-networking-console-plugin-cert podName:4c1413a9-cb8c-4052-b6e8-03a3e48b07e4 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:57.450708456 +0000 UTC m=+36.200150255 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/4c1413a9-cb8c-4052-b6e8-03a3e48b07e4-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-mndcn" (UID: "4c1413a9-cb8c-4052-b6e8-03a3e48b07e4") : secret "networking-console-plugin-cert" not found Apr 23 17:41:56.450829 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:56.450790 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3160553d-dd23-411d-afee-16ff4b1a7ca7-metrics-tls podName:3160553d-dd23-411d-afee-16ff4b1a7ca7 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:57.450777745 +0000 UTC m=+36.200219527 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3160553d-dd23-411d-afee-16ff4b1a7ca7-metrics-tls") pod "dns-default-lg22b" (UID: "3160553d-dd23-411d-afee-16ff4b1a7ca7") : secret "dns-default-metrics-tls" not found Apr 23 17:41:56.450829 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:56.450802 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9-cert podName:e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:57.450795862 +0000 UTC m=+36.200237641 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9-cert") pod "ingress-canary-jw7bs" (UID: "e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9") : secret "canary-serving-cert" not found Apr 23 17:41:56.450829 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:56.450812 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a8560822-ef3b-47a7-b587-297593afaeeb-registry-tls podName:a8560822-ef3b-47a7-b587-297593afaeeb nodeName:}" failed. No retries permitted until 2026-04-23 17:41:57.450806226 +0000 UTC m=+36.200248005 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a8560822-ef3b-47a7-b587-297593afaeeb-registry-tls") pod "image-registry-747847cfd6-g5xmr" (UID: "a8560822-ef3b-47a7-b587-297593afaeeb") : secret "image-registry-tls" not found Apr 23 17:41:56.786717 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:56.786638 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dtrg9" Apr 23 17:41:56.786842 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:56.786638 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9p89h" Apr 23 17:41:56.791177 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:56.791154 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 17:41:56.791177 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:56.791173 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-gwf2x\"" Apr 23 17:41:56.791372 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:56.791174 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-wdbsc\"" Apr 23 17:41:56.791372 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:56.791210 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 17:41:56.791372 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:56.791215 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 17:41:56.998177 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:56.998142 2581 generic.go:358] "Generic (PLEG): container finished" podID="cc5f4800-7f3a-482a-8059-189a6072df5a" 
containerID="bf7153e51f564831711cbb55ea6ed45bd9be159825d846ef5fc776810ee98916" exitCode=0 Apr 23 17:41:56.998606 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:56.998202 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xcnjl" event={"ID":"cc5f4800-7f3a-482a-8059-189a6072df5a","Type":"ContainerDied","Data":"bf7153e51f564831711cbb55ea6ed45bd9be159825d846ef5fc776810ee98916"} Apr 23 17:41:57.459513 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:57.459444 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3160553d-dd23-411d-afee-16ff4b1a7ca7-metrics-tls\") pod \"dns-default-lg22b\" (UID: \"3160553d-dd23-411d-afee-16ff4b1a7ca7\") " pod="openshift-dns/dns-default-lg22b" Apr 23 17:41:57.459513 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:57.459480 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9-cert\") pod \"ingress-canary-jw7bs\" (UID: \"e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9\") " pod="openshift-ingress-canary/ingress-canary-jw7bs" Apr 23 17:41:57.459715 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:57.459517 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a8560822-ef3b-47a7-b587-297593afaeeb-registry-tls\") pod \"image-registry-747847cfd6-g5xmr\" (UID: \"a8560822-ef3b-47a7-b587-297593afaeeb\") " pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" Apr 23 17:41:57.459715 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:57.459551 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4c1413a9-cb8c-4052-b6e8-03a3e48b07e4-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mndcn\" (UID: 
\"4c1413a9-cb8c-4052-b6e8-03a3e48b07e4\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mndcn" Apr 23 17:41:57.459715 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:57.459616 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:41:57.459715 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:57.459644 2581 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 23 17:41:57.459715 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:57.459643 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 17:41:57.459715 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:57.459666 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-747847cfd6-g5xmr: secret "image-registry-tls" not found Apr 23 17:41:57.459715 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:57.459616 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:41:57.459715 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:57.459678 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3160553d-dd23-411d-afee-16ff4b1a7ca7-metrics-tls podName:3160553d-dd23-411d-afee-16ff4b1a7ca7 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:59.45966401 +0000 UTC m=+38.209105789 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3160553d-dd23-411d-afee-16ff4b1a7ca7-metrics-tls") pod "dns-default-lg22b" (UID: "3160553d-dd23-411d-afee-16ff4b1a7ca7") : secret "dns-default-metrics-tls" not found Apr 23 17:41:57.459715 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:57.459714 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a8560822-ef3b-47a7-b587-297593afaeeb-registry-tls podName:a8560822-ef3b-47a7-b587-297593afaeeb nodeName:}" failed. No retries permitted until 2026-04-23 17:41:59.459694882 +0000 UTC m=+38.209136676 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a8560822-ef3b-47a7-b587-297593afaeeb-registry-tls") pod "image-registry-747847cfd6-g5xmr" (UID: "a8560822-ef3b-47a7-b587-297593afaeeb") : secret "image-registry-tls" not found Apr 23 17:41:57.460045 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:57.459733 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c1413a9-cb8c-4052-b6e8-03a3e48b07e4-networking-console-plugin-cert podName:4c1413a9-cb8c-4052-b6e8-03a3e48b07e4 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:59.459723538 +0000 UTC m=+38.209165318 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/4c1413a9-cb8c-4052-b6e8-03a3e48b07e4-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-mndcn" (UID: "4c1413a9-cb8c-4052-b6e8-03a3e48b07e4") : secret "networking-console-plugin-cert" not found Apr 23 17:41:57.460045 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:57.459750 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9-cert podName:e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9 nodeName:}" failed. 
No retries permitted until 2026-04-23 17:41:59.459741864 +0000 UTC m=+38.209183674 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9-cert") pod "ingress-canary-jw7bs" (UID: "e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9") : secret "canary-serving-cert" not found Apr 23 17:41:58.003109 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:58.003049 2581 generic.go:358] "Generic (PLEG): container finished" podID="cc5f4800-7f3a-482a-8059-189a6072df5a" containerID="f3a597cb1f3ddb18876bde4e140f013836711d36aea18ac87c49e0e339c3f4ad" exitCode=0 Apr 23 17:41:58.003701 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:58.003130 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xcnjl" event={"ID":"cc5f4800-7f3a-482a-8059-189a6072df5a","Type":"ContainerDied","Data":"f3a597cb1f3ddb18876bde4e140f013836711d36aea18ac87c49e0e339c3f4ad"} Apr 23 17:41:59.007886 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:59.007855 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xcnjl" event={"ID":"cc5f4800-7f3a-482a-8059-189a6072df5a","Type":"ContainerStarted","Data":"18ca90a14e1927fafe32ca95389f5ce6a93b1cb0948bacdb91004feaacb9d368"} Apr 23 17:41:59.038566 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:59.038521 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-xcnjl" podStartSLOduration=5.62125722 podStartE2EDuration="37.038506654s" podCreationTimestamp="2026-04-23 17:41:22 +0000 UTC" firstStartedPulling="2026-04-23 17:41:24.425319114 +0000 UTC m=+3.174760892" lastFinishedPulling="2026-04-23 17:41:55.842568535 +0000 UTC m=+34.592010326" observedRunningTime="2026-04-23 17:41:59.036613857 +0000 UTC m=+37.786055658" watchObservedRunningTime="2026-04-23 17:41:59.038506654 +0000 UTC m=+37.787948455" Apr 23 17:41:59.473597 
ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:59.473564 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4c1413a9-cb8c-4052-b6e8-03a3e48b07e4-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mndcn\" (UID: \"4c1413a9-cb8c-4052-b6e8-03a3e48b07e4\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mndcn" Apr 23 17:41:59.473760 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:59.473619 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3160553d-dd23-411d-afee-16ff4b1a7ca7-metrics-tls\") pod \"dns-default-lg22b\" (UID: \"3160553d-dd23-411d-afee-16ff4b1a7ca7\") " pod="openshift-dns/dns-default-lg22b" Apr 23 17:41:59.473760 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:59.473683 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9-cert\") pod \"ingress-canary-jw7bs\" (UID: \"e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9\") " pod="openshift-ingress-canary/ingress-canary-jw7bs" Apr 23 17:41:59.473760 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:59.473702 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:41:59.473760 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:41:59.473740 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a8560822-ef3b-47a7-b587-297593afaeeb-registry-tls\") pod \"image-registry-747847cfd6-g5xmr\" (UID: \"a8560822-ef3b-47a7-b587-297593afaeeb\") " pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" Apr 23 17:41:59.473946 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:59.473803 2581 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/3160553d-dd23-411d-afee-16ff4b1a7ca7-metrics-tls podName:3160553d-dd23-411d-afee-16ff4b1a7ca7 nodeName:}" failed. No retries permitted until 2026-04-23 17:42:03.473788388 +0000 UTC m=+42.223230167 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3160553d-dd23-411d-afee-16ff4b1a7ca7-metrics-tls") pod "dns-default-lg22b" (UID: "3160553d-dd23-411d-afee-16ff4b1a7ca7") : secret "dns-default-metrics-tls" not found Apr 23 17:41:59.473946 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:59.473845 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:41:59.473946 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:59.473858 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 17:41:59.473946 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:59.473874 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-747847cfd6-g5xmr: secret "image-registry-tls" not found Apr 23 17:41:59.473946 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:59.473875 2581 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 23 17:41:59.473946 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:59.473910 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9-cert podName:e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9 nodeName:}" failed. No retries permitted until 2026-04-23 17:42:03.473895446 +0000 UTC m=+42.223337226 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9-cert") pod "ingress-canary-jw7bs" (UID: "e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9") : secret "canary-serving-cert" not found Apr 23 17:41:59.473946 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:59.473925 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a8560822-ef3b-47a7-b587-297593afaeeb-registry-tls podName:a8560822-ef3b-47a7-b587-297593afaeeb nodeName:}" failed. No retries permitted until 2026-04-23 17:42:03.473917816 +0000 UTC m=+42.223359594 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a8560822-ef3b-47a7-b587-297593afaeeb-registry-tls") pod "image-registry-747847cfd6-g5xmr" (UID: "a8560822-ef3b-47a7-b587-297593afaeeb") : secret "image-registry-tls" not found Apr 23 17:41:59.473946 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:41:59.473947 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c1413a9-cb8c-4052-b6e8-03a3e48b07e4-networking-console-plugin-cert podName:4c1413a9-cb8c-4052-b6e8-03a3e48b07e4 nodeName:}" failed. No retries permitted until 2026-04-23 17:42:03.473930246 +0000 UTC m=+42.223372044 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/4c1413a9-cb8c-4052-b6e8-03a3e48b07e4-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-mndcn" (UID: "4c1413a9-cb8c-4052-b6e8-03a3e48b07e4") : secret "networking-console-plugin-cert" not found Apr 23 17:42:03.502970 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:42:03.502931 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4c1413a9-cb8c-4052-b6e8-03a3e48b07e4-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mndcn\" (UID: \"4c1413a9-cb8c-4052-b6e8-03a3e48b07e4\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mndcn" Apr 23 17:42:03.503355 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:42:03.503000 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3160553d-dd23-411d-afee-16ff4b1a7ca7-metrics-tls\") pod \"dns-default-lg22b\" (UID: \"3160553d-dd23-411d-afee-16ff4b1a7ca7\") " pod="openshift-dns/dns-default-lg22b" Apr 23 17:42:03.503355 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:42:03.503020 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9-cert\") pod \"ingress-canary-jw7bs\" (UID: \"e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9\") " pod="openshift-ingress-canary/ingress-canary-jw7bs" Apr 23 17:42:03.503355 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:42:03.503052 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a8560822-ef3b-47a7-b587-297593afaeeb-registry-tls\") pod \"image-registry-747847cfd6-g5xmr\" (UID: \"a8560822-ef3b-47a7-b587-297593afaeeb\") " pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" Apr 
23 17:42:03.503355 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:42:03.503075 2581 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 23 17:42:03.503355 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:42:03.503142 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c1413a9-cb8c-4052-b6e8-03a3e48b07e4-networking-console-plugin-cert podName:4c1413a9-cb8c-4052-b6e8-03a3e48b07e4 nodeName:}" failed. No retries permitted until 2026-04-23 17:42:11.503126788 +0000 UTC m=+50.252568567 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/4c1413a9-cb8c-4052-b6e8-03a3e48b07e4-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-mndcn" (UID: "4c1413a9-cb8c-4052-b6e8-03a3e48b07e4") : secret "networking-console-plugin-cert" not found Apr 23 17:42:03.503355 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:42:03.503147 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 17:42:03.503355 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:42:03.503147 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:42:03.503355 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:42:03.503158 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-747847cfd6-g5xmr: secret "image-registry-tls" not found Apr 23 17:42:03.503355 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:42:03.503175 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:42:03.503355 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:42:03.503205 2581 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/3160553d-dd23-411d-afee-16ff4b1a7ca7-metrics-tls podName:3160553d-dd23-411d-afee-16ff4b1a7ca7 nodeName:}" failed. No retries permitted until 2026-04-23 17:42:11.503188994 +0000 UTC m=+50.252630794 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3160553d-dd23-411d-afee-16ff4b1a7ca7-metrics-tls") pod "dns-default-lg22b" (UID: "3160553d-dd23-411d-afee-16ff4b1a7ca7") : secret "dns-default-metrics-tls" not found Apr 23 17:42:03.503355 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:42:03.503225 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9-cert podName:e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9 nodeName:}" failed. No retries permitted until 2026-04-23 17:42:11.503214074 +0000 UTC m=+50.252655873 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9-cert") pod "ingress-canary-jw7bs" (UID: "e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9") : secret "canary-serving-cert" not found Apr 23 17:42:03.503355 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:42:03.503241 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a8560822-ef3b-47a7-b587-297593afaeeb-registry-tls podName:a8560822-ef3b-47a7-b587-297593afaeeb nodeName:}" failed. No retries permitted until 2026-04-23 17:42:11.503233353 +0000 UTC m=+50.252675135 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a8560822-ef3b-47a7-b587-297593afaeeb-registry-tls") pod "image-registry-747847cfd6-g5xmr" (UID: "a8560822-ef3b-47a7-b587-297593afaeeb") : secret "image-registry-tls" not found Apr 23 17:42:11.563898 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:42:11.563856 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9-cert\") pod \"ingress-canary-jw7bs\" (UID: \"e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9\") " pod="openshift-ingress-canary/ingress-canary-jw7bs" Apr 23 17:42:11.564424 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:42:11.563914 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a8560822-ef3b-47a7-b587-297593afaeeb-registry-tls\") pod \"image-registry-747847cfd6-g5xmr\" (UID: \"a8560822-ef3b-47a7-b587-297593afaeeb\") " pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" Apr 23 17:42:11.564424 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:42:11.563945 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4c1413a9-cb8c-4052-b6e8-03a3e48b07e4-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mndcn\" (UID: \"4c1413a9-cb8c-4052-b6e8-03a3e48b07e4\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mndcn" Apr 23 17:42:11.564424 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:42:11.563976 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3160553d-dd23-411d-afee-16ff4b1a7ca7-metrics-tls\") pod \"dns-default-lg22b\" (UID: \"3160553d-dd23-411d-afee-16ff4b1a7ca7\") " pod="openshift-dns/dns-default-lg22b" Apr 23 17:42:11.564424 ip-10-0-143-218 
kubenswrapper[2581]: E0423 17:42:11.564083 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 17:42:11.564424 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:42:11.564097 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:42:11.564424 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:42:11.564103 2581 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 23 17:42:11.564424 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:42:11.564104 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-747847cfd6-g5xmr: secret "image-registry-tls" not found Apr 23 17:42:11.564424 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:42:11.564093 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:42:11.564424 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:42:11.564158 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c1413a9-cb8c-4052-b6e8-03a3e48b07e4-networking-console-plugin-cert podName:4c1413a9-cb8c-4052-b6e8-03a3e48b07e4 nodeName:}" failed. No retries permitted until 2026-04-23 17:42:27.564138352 +0000 UTC m=+66.313580131 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/4c1413a9-cb8c-4052-b6e8-03a3e48b07e4-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-mndcn" (UID: "4c1413a9-cb8c-4052-b6e8-03a3e48b07e4") : secret "networking-console-plugin-cert" not found Apr 23 17:42:11.564424 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:42:11.564175 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3160553d-dd23-411d-afee-16ff4b1a7ca7-metrics-tls podName:3160553d-dd23-411d-afee-16ff4b1a7ca7 nodeName:}" failed. No retries permitted until 2026-04-23 17:42:27.564166214 +0000 UTC m=+66.313607993 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3160553d-dd23-411d-afee-16ff4b1a7ca7-metrics-tls") pod "dns-default-lg22b" (UID: "3160553d-dd23-411d-afee-16ff4b1a7ca7") : secret "dns-default-metrics-tls" not found Apr 23 17:42:11.564424 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:42:11.564195 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a8560822-ef3b-47a7-b587-297593afaeeb-registry-tls podName:a8560822-ef3b-47a7-b587-297593afaeeb nodeName:}" failed. No retries permitted until 2026-04-23 17:42:27.56418824 +0000 UTC m=+66.313630019 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a8560822-ef3b-47a7-b587-297593afaeeb-registry-tls") pod "image-registry-747847cfd6-g5xmr" (UID: "a8560822-ef3b-47a7-b587-297593afaeeb") : secret "image-registry-tls" not found Apr 23 17:42:11.564424 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:42:11.564216 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9-cert podName:e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9 nodeName:}" failed. 
No retries permitted until 2026-04-23 17:42:27.564206904 +0000 UTC m=+66.313648682 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9-cert") pod "ingress-canary-jw7bs" (UID: "e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9") : secret "canary-serving-cert" not found Apr 23 17:42:20.992585 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:42:20.992554 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-b27s8" Apr 23 17:42:27.480115 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:42:27.480062 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34a08e1b-c531-4efa-80dc-60379820b460-metrics-certs\") pod \"network-metrics-daemon-dtrg9\" (UID: \"34a08e1b-c531-4efa-80dc-60379820b460\") " pod="openshift-multus/network-metrics-daemon-dtrg9" Apr 23 17:42:27.483184 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:42:27.483163 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 17:42:27.490751 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:42:27.490735 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 17:42:27.490841 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:42:27.490807 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34a08e1b-c531-4efa-80dc-60379820b460-metrics-certs podName:34a08e1b-c531-4efa-80dc-60379820b460 nodeName:}" failed. No retries permitted until 2026-04-23 17:43:31.490783006 +0000 UTC m=+130.240224785 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/34a08e1b-c531-4efa-80dc-60379820b460-metrics-certs") pod "network-metrics-daemon-dtrg9" (UID: "34a08e1b-c531-4efa-80dc-60379820b460") : secret "metrics-daemon-secret" not found Apr 23 17:42:27.580654 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:42:27.580606 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a8560822-ef3b-47a7-b587-297593afaeeb-registry-tls\") pod \"image-registry-747847cfd6-g5xmr\" (UID: \"a8560822-ef3b-47a7-b587-297593afaeeb\") " pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" Apr 23 17:42:27.580800 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:42:27.580669 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4c1413a9-cb8c-4052-b6e8-03a3e48b07e4-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mndcn\" (UID: \"4c1413a9-cb8c-4052-b6e8-03a3e48b07e4\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mndcn" Apr 23 17:42:27.580800 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:42:27.580704 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3160553d-dd23-411d-afee-16ff4b1a7ca7-metrics-tls\") pod \"dns-default-lg22b\" (UID: \"3160553d-dd23-411d-afee-16ff4b1a7ca7\") " pod="openshift-dns/dns-default-lg22b" Apr 23 17:42:27.580800 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:42:27.580727 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9-cert\") pod \"ingress-canary-jw7bs\" (UID: \"e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9\") " pod="openshift-ingress-canary/ingress-canary-jw7bs" Apr 23 17:42:27.580800 ip-10-0-143-218 kubenswrapper[2581]: 
E0423 17:42:27.580744 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 17:42:27.580800 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:42:27.580765 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-747847cfd6-g5xmr: secret "image-registry-tls" not found Apr 23 17:42:27.580955 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:42:27.580805 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:42:27.580955 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:42:27.580806 2581 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 23 17:42:27.580955 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:42:27.580823 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:42:27.580955 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:42:27.580862 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a8560822-ef3b-47a7-b587-297593afaeeb-registry-tls podName:a8560822-ef3b-47a7-b587-297593afaeeb nodeName:}" failed. No retries permitted until 2026-04-23 17:42:59.580841838 +0000 UTC m=+98.330283633 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a8560822-ef3b-47a7-b587-297593afaeeb-registry-tls") pod "image-registry-747847cfd6-g5xmr" (UID: "a8560822-ef3b-47a7-b587-297593afaeeb") : secret "image-registry-tls" not found Apr 23 17:42:27.580955 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:42:27.580880 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c1413a9-cb8c-4052-b6e8-03a3e48b07e4-networking-console-plugin-cert podName:4c1413a9-cb8c-4052-b6e8-03a3e48b07e4 nodeName:}" failed. No retries permitted until 2026-04-23 17:42:59.580871031 +0000 UTC m=+98.330312817 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/4c1413a9-cb8c-4052-b6e8-03a3e48b07e4-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-mndcn" (UID: "4c1413a9-cb8c-4052-b6e8-03a3e48b07e4") : secret "networking-console-plugin-cert" not found Apr 23 17:42:27.580955 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:42:27.580891 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9-cert podName:e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9 nodeName:}" failed. No retries permitted until 2026-04-23 17:42:59.580885625 +0000 UTC m=+98.330327403 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9-cert") pod "ingress-canary-jw7bs" (UID: "e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9") : secret "canary-serving-cert" not found Apr 23 17:42:27.580955 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:42:27.580900 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3160553d-dd23-411d-afee-16ff4b1a7ca7-metrics-tls podName:3160553d-dd23-411d-afee-16ff4b1a7ca7 nodeName:}" failed. 
No retries permitted until 2026-04-23 17:42:59.580895099 +0000 UTC m=+98.330336878 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3160553d-dd23-411d-afee-16ff4b1a7ca7-metrics-tls") pod "dns-default-lg22b" (UID: "3160553d-dd23-411d-afee-16ff4b1a7ca7") : secret "dns-default-metrics-tls" not found Apr 23 17:42:27.681936 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:42:27.681903 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f664q\" (UniqueName: \"kubernetes.io/projected/93857015-7301-4fa2-b52a-008e134f997b-kube-api-access-f664q\") pod \"network-check-target-9p89h\" (UID: \"93857015-7301-4fa2-b52a-008e134f997b\") " pod="openshift-network-diagnostics/network-check-target-9p89h" Apr 23 17:42:27.684819 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:42:27.684798 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 17:42:27.694750 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:42:27.694733 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 17:42:27.704816 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:42:27.704794 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f664q\" (UniqueName: \"kubernetes.io/projected/93857015-7301-4fa2-b52a-008e134f997b-kube-api-access-f664q\") pod \"network-check-target-9p89h\" (UID: \"93857015-7301-4fa2-b52a-008e134f997b\") " pod="openshift-network-diagnostics/network-check-target-9p89h" Apr 23 17:42:28.003570 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:42:28.003539 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-gwf2x\"" Apr 23 17:42:28.011205 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:42:28.011186 2581 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9p89h"
Apr 23 17:42:28.135261 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:42:28.135230 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-9p89h"]
Apr 23 17:42:28.138373 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:42:28.138342 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93857015_7301_4fa2_b52a_008e134f997b.slice/crio-00785acf7a6236f5fe58f4f2e18cdd742b2502e51b09a400412159bb3cde1dd3 WatchSource:0}: Error finding container 00785acf7a6236f5fe58f4f2e18cdd742b2502e51b09a400412159bb3cde1dd3: Status 404 returned error can't find the container with id 00785acf7a6236f5fe58f4f2e18cdd742b2502e51b09a400412159bb3cde1dd3
Apr 23 17:42:29.066925 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:42:29.066886 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-9p89h" event={"ID":"93857015-7301-4fa2-b52a-008e134f997b","Type":"ContainerStarted","Data":"00785acf7a6236f5fe58f4f2e18cdd742b2502e51b09a400412159bb3cde1dd3"}
Apr 23 17:42:32.073625 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:42:32.073588 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-9p89h" event={"ID":"93857015-7301-4fa2-b52a-008e134f997b","Type":"ContainerStarted","Data":"fe9f26042b12e44e516f687cc3e9d2ae0f5aedb5d2386afd0a9025fc486889af"}
Apr 23 17:42:32.074085 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:42:32.073718 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-9p89h"
Apr 23 17:42:32.092178 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:42:32.092136 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-9p89h" podStartSLOduration=66.904147296 podStartE2EDuration="1m10.092122768s" podCreationTimestamp="2026-04-23 17:41:22 +0000 UTC" firstStartedPulling="2026-04-23 17:42:28.140055397 +0000 UTC m=+66.889497179" lastFinishedPulling="2026-04-23 17:42:31.328030871 +0000 UTC m=+70.077472651" observedRunningTime="2026-04-23 17:42:32.091679523 +0000 UTC m=+70.841121324" watchObservedRunningTime="2026-04-23 17:42:32.092122768 +0000 UTC m=+70.841564563"
Apr 23 17:42:59.610288 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:42:59.610230 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4c1413a9-cb8c-4052-b6e8-03a3e48b07e4-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mndcn\" (UID: \"4c1413a9-cb8c-4052-b6e8-03a3e48b07e4\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mndcn"
Apr 23 17:42:59.610792 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:42:59.610314 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3160553d-dd23-411d-afee-16ff4b1a7ca7-metrics-tls\") pod \"dns-default-lg22b\" (UID: \"3160553d-dd23-411d-afee-16ff4b1a7ca7\") " pod="openshift-dns/dns-default-lg22b"
Apr 23 17:42:59.610792 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:42:59.610345 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9-cert\") pod \"ingress-canary-jw7bs\" (UID: \"e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9\") " pod="openshift-ingress-canary/ingress-canary-jw7bs"
Apr 23 17:42:59.610792 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:42:59.610377 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a8560822-ef3b-47a7-b587-297593afaeeb-registry-tls\") pod \"image-registry-747847cfd6-g5xmr\" (UID: \"a8560822-ef3b-47a7-b587-297593afaeeb\") " pod="openshift-image-registry/image-registry-747847cfd6-g5xmr"
Apr 23 17:42:59.610792 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:42:59.610404 2581 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 23 17:42:59.610792 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:42:59.610432 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 17:42:59.610792 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:42:59.610471 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 17:42:59.610792 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:42:59.610484 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 17:42:59.610792 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:42:59.610501 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-747847cfd6-g5xmr: secret "image-registry-tls" not found
Apr 23 17:42:59.610792 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:42:59.610486 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c1413a9-cb8c-4052-b6e8-03a3e48b07e4-networking-console-plugin-cert podName:4c1413a9-cb8c-4052-b6e8-03a3e48b07e4 nodeName:}" failed. No retries permitted until 2026-04-23 17:44:03.610464892 +0000 UTC m=+162.359906694 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/4c1413a9-cb8c-4052-b6e8-03a3e48b07e4-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-mndcn" (UID: "4c1413a9-cb8c-4052-b6e8-03a3e48b07e4") : secret "networking-console-plugin-cert" not found
Apr 23 17:42:59.610792 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:42:59.610563 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3160553d-dd23-411d-afee-16ff4b1a7ca7-metrics-tls podName:3160553d-dd23-411d-afee-16ff4b1a7ca7 nodeName:}" failed. No retries permitted until 2026-04-23 17:44:03.610548021 +0000 UTC m=+162.359989800 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3160553d-dd23-411d-afee-16ff4b1a7ca7-metrics-tls") pod "dns-default-lg22b" (UID: "3160553d-dd23-411d-afee-16ff4b1a7ca7") : secret "dns-default-metrics-tls" not found
Apr 23 17:42:59.610792 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:42:59.610579 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9-cert podName:e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9 nodeName:}" failed. No retries permitted until 2026-04-23 17:44:03.610568757 +0000 UTC m=+162.360010541 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9-cert") pod "ingress-canary-jw7bs" (UID: "e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9") : secret "canary-serving-cert" not found
Apr 23 17:42:59.610792 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:42:59.610594 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a8560822-ef3b-47a7-b587-297593afaeeb-registry-tls podName:a8560822-ef3b-47a7-b587-297593afaeeb nodeName:}" failed. No retries permitted until 2026-04-23 17:44:03.610585361 +0000 UTC m=+162.360027141 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a8560822-ef3b-47a7-b587-297593afaeeb-registry-tls") pod "image-registry-747847cfd6-g5xmr" (UID: "a8560822-ef3b-47a7-b587-297593afaeeb") : secret "image-registry-tls" not found
Apr 23 17:43:03.077520 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:03.077485 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-9p89h"
Apr 23 17:43:27.492563 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:27.492526 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-p5w64"]
Apr 23 17:43:27.494312 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:27.494293 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-p5w64"
Apr 23 17:43:27.497409 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:27.497388 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 23 17:43:27.508112 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:27.508089 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-p5w64"]
Apr 23 17:43:27.613442 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:27.613411 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/89eab3e6-08c4-458c-99c5-72cda238f366-kubelet-config\") pod \"global-pull-secret-syncer-p5w64\" (UID: \"89eab3e6-08c4-458c-99c5-72cda238f366\") " pod="kube-system/global-pull-secret-syncer-p5w64"
Apr 23 17:43:27.613442 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:27.613446 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/89eab3e6-08c4-458c-99c5-72cda238f366-dbus\") pod \"global-pull-secret-syncer-p5w64\" (UID: \"89eab3e6-08c4-458c-99c5-72cda238f366\") " pod="kube-system/global-pull-secret-syncer-p5w64"
Apr 23 17:43:27.613676 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:27.613560 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/89eab3e6-08c4-458c-99c5-72cda238f366-original-pull-secret\") pod \"global-pull-secret-syncer-p5w64\" (UID: \"89eab3e6-08c4-458c-99c5-72cda238f366\") " pod="kube-system/global-pull-secret-syncer-p5w64"
Apr 23 17:43:27.713939 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:27.713906 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/89eab3e6-08c4-458c-99c5-72cda238f366-kubelet-config\") pod \"global-pull-secret-syncer-p5w64\" (UID: \"89eab3e6-08c4-458c-99c5-72cda238f366\") " pod="kube-system/global-pull-secret-syncer-p5w64"
Apr 23 17:43:27.713939 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:27.713947 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/89eab3e6-08c4-458c-99c5-72cda238f366-dbus\") pod \"global-pull-secret-syncer-p5w64\" (UID: \"89eab3e6-08c4-458c-99c5-72cda238f366\") " pod="kube-system/global-pull-secret-syncer-p5w64"
Apr 23 17:43:27.714232 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:27.714062 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/89eab3e6-08c4-458c-99c5-72cda238f366-kubelet-config\") pod \"global-pull-secret-syncer-p5w64\" (UID: \"89eab3e6-08c4-458c-99c5-72cda238f366\") " pod="kube-system/global-pull-secret-syncer-p5w64"
Apr 23 17:43:27.714232 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:27.714103 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/89eab3e6-08c4-458c-99c5-72cda238f366-original-pull-secret\") pod \"global-pull-secret-syncer-p5w64\" (UID: \"89eab3e6-08c4-458c-99c5-72cda238f366\") " pod="kube-system/global-pull-secret-syncer-p5w64"
Apr 23 17:43:27.714232 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:27.714148 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/89eab3e6-08c4-458c-99c5-72cda238f366-dbus\") pod \"global-pull-secret-syncer-p5w64\" (UID: \"89eab3e6-08c4-458c-99c5-72cda238f366\") " pod="kube-system/global-pull-secret-syncer-p5w64"
Apr 23 17:43:27.716482 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:27.716466 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/89eab3e6-08c4-458c-99c5-72cda238f366-original-pull-secret\") pod \"global-pull-secret-syncer-p5w64\" (UID: \"89eab3e6-08c4-458c-99c5-72cda238f366\") " pod="kube-system/global-pull-secret-syncer-p5w64"
Apr 23 17:43:27.803424 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:27.803340 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-p5w64"
Apr 23 17:43:27.930193 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:27.928867 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-p5w64"]
Apr 23 17:43:28.176093 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:28.176055 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-p5w64" event={"ID":"89eab3e6-08c4-458c-99c5-72cda238f366","Type":"ContainerStarted","Data":"f6bc976f6cd5812417c587a0f0aabba721dcc1c31bb656504e003de8106592c3"}
Apr 23 17:43:31.543602 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:31.543559 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34a08e1b-c531-4efa-80dc-60379820b460-metrics-certs\") pod \"network-metrics-daemon-dtrg9\" (UID: \"34a08e1b-c531-4efa-80dc-60379820b460\") " pod="openshift-multus/network-metrics-daemon-dtrg9"
Apr 23 17:43:31.544017 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:43:31.543676 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 23 17:43:31.544017 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:43:31.543730 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34a08e1b-c531-4efa-80dc-60379820b460-metrics-certs podName:34a08e1b-c531-4efa-80dc-60379820b460 nodeName:}" failed. No retries permitted until 2026-04-23 17:45:33.543716273 +0000 UTC m=+252.293158057 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/34a08e1b-c531-4efa-80dc-60379820b460-metrics-certs") pod "network-metrics-daemon-dtrg9" (UID: "34a08e1b-c531-4efa-80dc-60379820b460") : secret "metrics-daemon-secret" not found
Apr 23 17:43:32.185046 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:32.185007 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-p5w64" event={"ID":"89eab3e6-08c4-458c-99c5-72cda238f366","Type":"ContainerStarted","Data":"d1bef67d43e971e71f0baaaa466b223bd636a505b5ee94574ee9e989821839e6"}
Apr 23 17:43:40.595133 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:40.595080 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-p5w64" podStartSLOduration=10.198069979 podStartE2EDuration="13.595063511s" podCreationTimestamp="2026-04-23 17:43:27 +0000 UTC" firstStartedPulling="2026-04-23 17:43:27.931961538 +0000 UTC m=+126.681403322" lastFinishedPulling="2026-04-23 17:43:31.328955074 +0000 UTC m=+130.078396854" observedRunningTime="2026-04-23 17:43:32.206517844 +0000 UTC m=+130.955959645" watchObservedRunningTime="2026-04-23 17:43:40.595063511 +0000 UTC m=+139.344505311"
Apr 23 17:43:40.595584 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:40.595242 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-6gf74"]
Apr 23 17:43:40.597052 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:40.597038 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-6gf74"
Apr 23 17:43:40.600588 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:40.600565 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 23 17:43:40.601318 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:40.600741 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 23 17:43:40.601318 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:40.601137 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-8h68b\""
Apr 23 17:43:40.603131 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:40.602929 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 23 17:43:40.603131 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:40.603022 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 23 17:43:40.607528 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:40.607509 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 23 17:43:40.624243 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:40.624194 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-6gf74"]
Apr 23 17:43:40.708568 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:40.708530 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/58c83603-7870-44e5-ab1f-6751e80f8562-snapshots\") pod \"insights-operator-585dfdc468-6gf74\" (UID: \"58c83603-7870-44e5-ab1f-6751e80f8562\") " pod="openshift-insights/insights-operator-585dfdc468-6gf74"
Apr 23 17:43:40.708733 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:40.708597 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58c83603-7870-44e5-ab1f-6751e80f8562-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-6gf74\" (UID: \"58c83603-7870-44e5-ab1f-6751e80f8562\") " pod="openshift-insights/insights-operator-585dfdc468-6gf74"
Apr 23 17:43:40.708733 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:40.708624 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7lbk\" (UniqueName: \"kubernetes.io/projected/58c83603-7870-44e5-ab1f-6751e80f8562-kube-api-access-d7lbk\") pod \"insights-operator-585dfdc468-6gf74\" (UID: \"58c83603-7870-44e5-ab1f-6751e80f8562\") " pod="openshift-insights/insights-operator-585dfdc468-6gf74"
Apr 23 17:43:40.708733 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:40.708656 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58c83603-7870-44e5-ab1f-6751e80f8562-serving-cert\") pod \"insights-operator-585dfdc468-6gf74\" (UID: \"58c83603-7870-44e5-ab1f-6751e80f8562\") " pod="openshift-insights/insights-operator-585dfdc468-6gf74"
Apr 23 17:43:40.708733 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:40.708681 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58c83603-7870-44e5-ab1f-6751e80f8562-service-ca-bundle\") pod \"insights-operator-585dfdc468-6gf74\" (UID: \"58c83603-7870-44e5-ab1f-6751e80f8562\") " pod="openshift-insights/insights-operator-585dfdc468-6gf74"
Apr 23 17:43:40.708733 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:40.708726 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/58c83603-7870-44e5-ab1f-6751e80f8562-tmp\") pod \"insights-operator-585dfdc468-6gf74\" (UID: \"58c83603-7870-44e5-ab1f-6751e80f8562\") " pod="openshift-insights/insights-operator-585dfdc468-6gf74"
Apr 23 17:43:40.809969 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:40.809926 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58c83603-7870-44e5-ab1f-6751e80f8562-serving-cert\") pod \"insights-operator-585dfdc468-6gf74\" (UID: \"58c83603-7870-44e5-ab1f-6751e80f8562\") " pod="openshift-insights/insights-operator-585dfdc468-6gf74"
Apr 23 17:43:40.810150 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:40.809977 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58c83603-7870-44e5-ab1f-6751e80f8562-service-ca-bundle\") pod \"insights-operator-585dfdc468-6gf74\" (UID: \"58c83603-7870-44e5-ab1f-6751e80f8562\") " pod="openshift-insights/insights-operator-585dfdc468-6gf74"
Apr 23 17:43:40.810150 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:40.810063 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/58c83603-7870-44e5-ab1f-6751e80f8562-tmp\") pod \"insights-operator-585dfdc468-6gf74\" (UID: \"58c83603-7870-44e5-ab1f-6751e80f8562\") " pod="openshift-insights/insights-operator-585dfdc468-6gf74"
Apr 23 17:43:40.810220 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:40.810161 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/58c83603-7870-44e5-ab1f-6751e80f8562-snapshots\") pod \"insights-operator-585dfdc468-6gf74\" (UID: \"58c83603-7870-44e5-ab1f-6751e80f8562\") " pod="openshift-insights/insights-operator-585dfdc468-6gf74"
Apr 23 17:43:40.810255 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:40.810235 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58c83603-7870-44e5-ab1f-6751e80f8562-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-6gf74\" (UID: \"58c83603-7870-44e5-ab1f-6751e80f8562\") " pod="openshift-insights/insights-operator-585dfdc468-6gf74"
Apr 23 17:43:40.810303 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:40.810262 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d7lbk\" (UniqueName: \"kubernetes.io/projected/58c83603-7870-44e5-ab1f-6751e80f8562-kube-api-access-d7lbk\") pod \"insights-operator-585dfdc468-6gf74\" (UID: \"58c83603-7870-44e5-ab1f-6751e80f8562\") " pod="openshift-insights/insights-operator-585dfdc468-6gf74"
Apr 23 17:43:40.810546 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:40.810526 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/58c83603-7870-44e5-ab1f-6751e80f8562-tmp\") pod \"insights-operator-585dfdc468-6gf74\" (UID: \"58c83603-7870-44e5-ab1f-6751e80f8562\") " pod="openshift-insights/insights-operator-585dfdc468-6gf74"
Apr 23 17:43:40.810768 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:40.810753 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58c83603-7870-44e5-ab1f-6751e80f8562-service-ca-bundle\") pod \"insights-operator-585dfdc468-6gf74\" (UID: \"58c83603-7870-44e5-ab1f-6751e80f8562\") " pod="openshift-insights/insights-operator-585dfdc468-6gf74"
Apr 23 17:43:40.810818 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:40.810792 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/58c83603-7870-44e5-ab1f-6751e80f8562-snapshots\") pod \"insights-operator-585dfdc468-6gf74\" (UID: \"58c83603-7870-44e5-ab1f-6751e80f8562\") " pod="openshift-insights/insights-operator-585dfdc468-6gf74"
Apr 23 17:43:40.812524 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:40.812503 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58c83603-7870-44e5-ab1f-6751e80f8562-serving-cert\") pod \"insights-operator-585dfdc468-6gf74\" (UID: \"58c83603-7870-44e5-ab1f-6751e80f8562\") " pod="openshift-insights/insights-operator-585dfdc468-6gf74"
Apr 23 17:43:40.812694 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:40.812674 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58c83603-7870-44e5-ab1f-6751e80f8562-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-6gf74\" (UID: \"58c83603-7870-44e5-ab1f-6751e80f8562\") " pod="openshift-insights/insights-operator-585dfdc468-6gf74"
Apr 23 17:43:40.819426 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:40.819405 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7lbk\" (UniqueName: \"kubernetes.io/projected/58c83603-7870-44e5-ab1f-6751e80f8562-kube-api-access-d7lbk\") pod \"insights-operator-585dfdc468-6gf74\" (UID: \"58c83603-7870-44e5-ab1f-6751e80f8562\") " pod="openshift-insights/insights-operator-585dfdc468-6gf74"
Apr 23 17:43:40.907696 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:40.907648 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-6gf74"
Apr 23 17:43:41.022429 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:41.022398 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-6gf74"]
Apr 23 17:43:41.025172 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:43:41.025143 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58c83603_7870_44e5_ab1f_6751e80f8562.slice/crio-3cf99271a4c28d21bb61979a80768aeb349b63d7edcbe91b9865163722754b9d WatchSource:0}: Error finding container 3cf99271a4c28d21bb61979a80768aeb349b63d7edcbe91b9865163722754b9d: Status 404 returned error can't find the container with id 3cf99271a4c28d21bb61979a80768aeb349b63d7edcbe91b9865163722754b9d
Apr 23 17:43:41.200649 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:41.200566 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-6gf74" event={"ID":"58c83603-7870-44e5-ab1f-6751e80f8562","Type":"ContainerStarted","Data":"3cf99271a4c28d21bb61979a80768aeb349b63d7edcbe91b9865163722754b9d"}
Apr 23 17:43:43.205825 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:43.205791 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-6gf74" event={"ID":"58c83603-7870-44e5-ab1f-6751e80f8562","Type":"ContainerStarted","Data":"ef955a90cc32daec8f9cd148fd6f58b4e323f16868ace5abf9c09f16c4d60a28"}
Apr 23 17:43:43.225310 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:43.225263 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-6gf74" podStartSLOduration=1.600524864 podStartE2EDuration="3.225249893s" podCreationTimestamp="2026-04-23 17:43:40 +0000 UTC" firstStartedPulling="2026-04-23 17:43:41.027121938 +0000 UTC m=+139.776563718" lastFinishedPulling="2026-04-23 17:43:42.651846969 +0000 UTC m=+141.401288747" observedRunningTime="2026-04-23 17:43:43.224515613 +0000 UTC m=+141.973957415" watchObservedRunningTime="2026-04-23 17:43:43.225249893 +0000 UTC m=+141.974691693"
Apr 23 17:43:46.525578 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:46.525551 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-mtx92_8e736d0c-df1a-43ad-8265-6b52f1993b03/dns-node-resolver/0.log"
Apr 23 17:43:47.526711 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:47.526684 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-6lc2b_600448d4-6de5-4739-b9f6-281d9613622c/node-ca/0.log"
Apr 23 17:43:50.125435 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:50.125400 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-tr4s7_dc9e19bd-f4ca-4fa7-8442-d784513fd798/csi-driver/0.log"
Apr 23 17:43:50.325204 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:50.325180 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-tr4s7_dc9e19bd-f4ca-4fa7-8442-d784513fd798/csi-node-driver-registrar/0.log"
Apr 23 17:43:50.524874 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:50.524844 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-tr4s7_dc9e19bd-f4ca-4fa7-8442-d784513fd798/csi-liveness-probe/0.log"
Apr 23 17:43:58.696177 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:43:58.696135 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" podUID="a8560822-ef3b-47a7-b587-297593afaeeb"
Apr 23 17:43:58.709254 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:43:58.709223 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-mndcn" podUID="4c1413a9-cb8c-4052-b6e8-03a3e48b07e4"
Apr 23 17:43:58.729433 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:43:58.729404 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-lg22b" podUID="3160553d-dd23-411d-afee-16ff4b1a7ca7"
Apr 23 17:43:58.746695 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:43:58.746673 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-jw7bs" podUID="e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9"
Apr 23 17:43:59.235588 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:59.235562 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-mndcn"
Apr 23 17:43:59.235757 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:59.235562 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lg22b"
Apr 23 17:43:59.235757 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:43:59.235562 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr"
Apr 23 17:43:59.796042 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:43:59.796007 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-dtrg9" podUID="34a08e1b-c531-4efa-80dc-60379820b460"
Apr 23 17:44:03.687323 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:03.687280 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4c1413a9-cb8c-4052-b6e8-03a3e48b07e4-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mndcn\" (UID: \"4c1413a9-cb8c-4052-b6e8-03a3e48b07e4\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mndcn"
Apr 23 17:44:03.687818 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:03.687346 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3160553d-dd23-411d-afee-16ff4b1a7ca7-metrics-tls\") pod \"dns-default-lg22b\" (UID: \"3160553d-dd23-411d-afee-16ff4b1a7ca7\") " pod="openshift-dns/dns-default-lg22b"
Apr 23 17:44:03.687818 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:03.687369 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9-cert\") pod \"ingress-canary-jw7bs\" (UID: \"e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9\") " pod="openshift-ingress-canary/ingress-canary-jw7bs"
Apr 23 17:44:03.687818 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:03.687396 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a8560822-ef3b-47a7-b587-297593afaeeb-registry-tls\") pod \"image-registry-747847cfd6-g5xmr\" (UID: \"a8560822-ef3b-47a7-b587-297593afaeeb\") " pod="openshift-image-registry/image-registry-747847cfd6-g5xmr"
Apr 23 17:44:03.689920 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:03.689897 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9-cert\") pod \"ingress-canary-jw7bs\" (UID: \"e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9\") " pod="openshift-ingress-canary/ingress-canary-jw7bs"
Apr 23 17:44:03.690045 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:03.689956 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a8560822-ef3b-47a7-b587-297593afaeeb-registry-tls\") pod \"image-registry-747847cfd6-g5xmr\" (UID: \"a8560822-ef3b-47a7-b587-297593afaeeb\") " pod="openshift-image-registry/image-registry-747847cfd6-g5xmr"
Apr 23 17:44:03.690263 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:03.690246 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3160553d-dd23-411d-afee-16ff4b1a7ca7-metrics-tls\") pod \"dns-default-lg22b\" (UID: \"3160553d-dd23-411d-afee-16ff4b1a7ca7\") " pod="openshift-dns/dns-default-lg22b"
Apr 23 17:44:03.690352 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:03.690334 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4c1413a9-cb8c-4052-b6e8-03a3e48b07e4-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mndcn\" (UID: \"4c1413a9-cb8c-4052-b6e8-03a3e48b07e4\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mndcn"
Apr 23 17:44:03.740961 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:03.740937 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-wmv2k\""
Apr 23 17:44:03.741095 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:03.740972 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-zw6qq\""
Apr 23 17:44:03.741095 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:03.740976 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-9m6lq\""
Apr 23 17:44:03.747383 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:03.747362 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr"
Apr 23 17:44:03.747383 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:03.747382 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-mndcn"
Apr 23 17:44:03.747514 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:03.747371 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lg22b"
Apr 23 17:44:03.888186 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:03.888159 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-mndcn"]
Apr 23 17:44:03.890228 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:44:03.890201 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c1413a9_cb8c_4052_b6e8_03a3e48b07e4.slice/crio-87a6fdd9453a318542d6632b7aba4a321b520f2b8fe2ea2fb4cac5149a72fe9b WatchSource:0}: Error finding container 87a6fdd9453a318542d6632b7aba4a321b520f2b8fe2ea2fb4cac5149a72fe9b: Status 404 returned error can't find the container with id 87a6fdd9453a318542d6632b7aba4a321b520f2b8fe2ea2fb4cac5149a72fe9b
Apr 23 17:44:03.907706 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:03.907684 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-747847cfd6-g5xmr"]
Apr 23 17:44:03.910316 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:44:03.910288 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8560822_ef3b_47a7_b587_297593afaeeb.slice/crio-49538decf0c47c508dc21301230c42459f488465c8cf81649a5346c250c80262 WatchSource:0}: Error finding container 49538decf0c47c508dc21301230c42459f488465c8cf81649a5346c250c80262: Status 404 returned error can't find the container with id 49538decf0c47c508dc21301230c42459f488465c8cf81649a5346c250c80262
Apr 23 17:44:03.921225 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:03.921202 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lg22b"]
Apr 23 17:44:03.924471 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:44:03.924450 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3160553d_dd23_411d_afee_16ff4b1a7ca7.slice/crio-be8ce7fec843fb22dc8e860597097aa18f1b20c8743d5a18027c529a6912ee3e WatchSource:0}: Error finding container be8ce7fec843fb22dc8e860597097aa18f1b20c8743d5a18027c529a6912ee3e: Status 404 returned error can't find the container with id be8ce7fec843fb22dc8e860597097aa18f1b20c8743d5a18027c529a6912ee3e
Apr 23 17:44:04.246677 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:04.246594 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" event={"ID":"a8560822-ef3b-47a7-b587-297593afaeeb","Type":"ContainerStarted","Data":"476b18a61bdf32780aa536bcb020f2edfadad27366172df40c6465b3f642f77a"}
Apr 23 17:44:04.246677 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:04.246631 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" event={"ID":"a8560822-ef3b-47a7-b587-297593afaeeb","Type":"ContainerStarted","Data":"49538decf0c47c508dc21301230c42459f488465c8cf81649a5346c250c80262"}
Apr 23 17:44:04.246677 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:04.246665 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr"
Apr 23 17:44:04.247760 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:04.247730 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-mndcn" event={"ID":"4c1413a9-cb8c-4052-b6e8-03a3e48b07e4","Type":"ContainerStarted","Data":"87a6fdd9453a318542d6632b7aba4a321b520f2b8fe2ea2fb4cac5149a72fe9b"}
Apr 23 17:44:04.248665 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:04.248638 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lg22b" event={"ID":"3160553d-dd23-411d-afee-16ff4b1a7ca7","Type":"ContainerStarted","Data":"be8ce7fec843fb22dc8e860597097aa18f1b20c8743d5a18027c529a6912ee3e"}
Apr 23 17:44:04.275610 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:04.275563 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" podStartSLOduration=162.275548415 podStartE2EDuration="2m42.275548415s" podCreationTimestamp="2026-04-23 17:41:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:44:04.274578535 +0000 UTC m=+163.024020335" watchObservedRunningTime="2026-04-23 17:44:04.275548415 +0000 UTC m=+163.024990215"
Apr 23 17:44:06.255123 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:06.255029 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lg22b"
event={"ID":"3160553d-dd23-411d-afee-16ff4b1a7ca7","Type":"ContainerStarted","Data":"4b5b1b3b7eb572170bb3c9ecbc3bfe16c660e639fcc53e02b2c62c359d418cf1"} Apr 23 17:44:06.255123 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:06.255072 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lg22b" event={"ID":"3160553d-dd23-411d-afee-16ff4b1a7ca7","Type":"ContainerStarted","Data":"b3367e487259def4f2b06dc6b6481c7ce9c5cc350667abf4cb75ef9540a8da5d"} Apr 23 17:44:06.255562 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:06.255297 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-lg22b" Apr 23 17:44:06.256265 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:06.256245 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-mndcn" event={"ID":"4c1413a9-cb8c-4052-b6e8-03a3e48b07e4","Type":"ContainerStarted","Data":"f2505f5f09dd13fea6f36a7b34d53f36a4c392e1ebec3d7e9e9c0e0b82b53c55"} Apr 23 17:44:06.279933 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:06.279894 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-lg22b" podStartSLOduration=129.454120339 podStartE2EDuration="2m11.279881559s" podCreationTimestamp="2026-04-23 17:41:55 +0000 UTC" firstStartedPulling="2026-04-23 17:44:03.9261257 +0000 UTC m=+162.675567481" lastFinishedPulling="2026-04-23 17:44:05.751886921 +0000 UTC m=+164.501328701" observedRunningTime="2026-04-23 17:44:06.278643675 +0000 UTC m=+165.028085478" watchObservedRunningTime="2026-04-23 17:44:06.279881559 +0000 UTC m=+165.029323361" Apr 23 17:44:06.298301 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:06.298263 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-mndcn" podStartSLOduration=157.44322404 podStartE2EDuration="2m39.298251394s" 
podCreationTimestamp="2026-04-23 17:41:27 +0000 UTC" firstStartedPulling="2026-04-23 17:44:03.893363438 +0000 UTC m=+162.642805220" lastFinishedPulling="2026-04-23 17:44:05.748390795 +0000 UTC m=+164.497832574" observedRunningTime="2026-04-23 17:44:06.297866494 +0000 UTC m=+165.047308298" watchObservedRunningTime="2026-04-23 17:44:06.298251394 +0000 UTC m=+165.047693196" Apr 23 17:44:10.786960 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:10.786868 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dtrg9" Apr 23 17:44:12.786509 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:12.786457 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jw7bs" Apr 23 17:44:12.789978 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:12.789958 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-576jd\"" Apr 23 17:44:12.797763 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:12.797744 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jw7bs" Apr 23 17:44:12.913695 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:12.913661 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jw7bs"] Apr 23 17:44:12.917031 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:44:12.916965 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode21a3d10_f4dc_4d03_8e8e_3718d1cb9ef9.slice/crio-c58f05abbe5b077242ddb3444ee0a6982d313aec7d44554ea01cb71153e688f4 WatchSource:0}: Error finding container c58f05abbe5b077242ddb3444ee0a6982d313aec7d44554ea01cb71153e688f4: Status 404 returned error can't find the container with id c58f05abbe5b077242ddb3444ee0a6982d313aec7d44554ea01cb71153e688f4 Apr 23 17:44:13.275508 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:13.275469 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jw7bs" event={"ID":"e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9","Type":"ContainerStarted","Data":"c58f05abbe5b077242ddb3444ee0a6982d313aec7d44554ea01cb71153e688f4"} Apr 23 17:44:15.281019 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:15.280971 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jw7bs" event={"ID":"e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9","Type":"ContainerStarted","Data":"e25fe14d613e1ead7dd2bad065fa506bdf179c781971199afc858d9a3a016c44"} Apr 23 17:44:15.302563 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:15.302520 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-jw7bs" podStartSLOduration=138.761586177 podStartE2EDuration="2m20.302507313s" podCreationTimestamp="2026-04-23 17:41:55 +0000 UTC" firstStartedPulling="2026-04-23 17:44:12.918796522 +0000 UTC m=+171.668238300" lastFinishedPulling="2026-04-23 17:44:14.459717656 +0000 UTC m=+173.209159436" 
observedRunningTime="2026-04-23 17:44:15.302138861 +0000 UTC m=+174.051580662" watchObservedRunningTime="2026-04-23 17:44:15.302507313 +0000 UTC m=+174.051949155" Apr 23 17:44:16.261552 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:16.261520 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-lg22b" Apr 23 17:44:23.751940 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:23.751905 2581 patch_prober.go:28] interesting pod/image-registry-747847cfd6-g5xmr container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:44:23.752351 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:23.751957 2581 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" podUID="a8560822-ef3b-47a7-b587-297593afaeeb" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:44:25.255545 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:25.255514 2581 patch_prober.go:28] interesting pod/image-registry-747847cfd6-g5xmr container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:44:25.255895 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:25.255562 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" podUID="a8560822-ef3b-47a7-b587-297593afaeeb" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:44:33.751492 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:33.751460 2581 
patch_prober.go:28] interesting pod/image-registry-747847cfd6-g5xmr container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:44:33.751867 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:33.751519 2581 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" podUID="a8560822-ef3b-47a7-b587-297593afaeeb" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:44:35.256883 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:35.256850 2581 patch_prober.go:28] interesting pod/image-registry-747847cfd6-g5xmr container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:44:35.257298 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:35.256911 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" podUID="a8560822-ef3b-47a7-b587-297593afaeeb" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:44:43.751620 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:43.751581 2581 patch_prober.go:28] interesting pod/image-registry-747847cfd6-g5xmr container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:44:43.752095 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:43.751637 2581 prober.go:120] "Probe failed" 
probeType="Liveness" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" podUID="a8560822-ef3b-47a7-b587-297593afaeeb" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:44:43.752095 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:43.751675 2581 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" Apr 23 17:44:43.752203 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:43.752083 2581 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="registry" containerStatusID={"Type":"cri-o","ID":"476b18a61bdf32780aa536bcb020f2edfadad27366172df40c6465b3f642f77a"} pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" containerMessage="Container registry failed liveness probe, will be restarted" Apr 23 17:44:43.755311 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:43.755283 2581 patch_prober.go:28] interesting pod/image-registry-747847cfd6-g5xmr container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:44:43.755427 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:43.755324 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" podUID="a8560822-ef3b-47a7-b587-297593afaeeb" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:44:53.755789 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:53.755753 2581 patch_prober.go:28] interesting pod/image-registry-747847cfd6-g5xmr container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service 
unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:44:53.756250 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:53.755820 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" podUID="a8560822-ef3b-47a7-b587-297593afaeeb" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:44:58.387699 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:58.387662 2581 generic.go:358] "Generic (PLEG): container finished" podID="58c83603-7870-44e5-ab1f-6751e80f8562" containerID="ef955a90cc32daec8f9cd148fd6f58b4e323f16868ace5abf9c09f16c4d60a28" exitCode=0 Apr 23 17:44:58.388114 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:58.387704 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-6gf74" event={"ID":"58c83603-7870-44e5-ab1f-6751e80f8562","Type":"ContainerDied","Data":"ef955a90cc32daec8f9cd148fd6f58b4e323f16868ace5abf9c09f16c4d60a28"} Apr 23 17:44:58.388114 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:58.388017 2581 scope.go:117] "RemoveContainer" containerID="ef955a90cc32daec8f9cd148fd6f58b4e323f16868ace5abf9c09f16c4d60a28" Apr 23 17:44:59.393839 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:44:59.393803 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-6gf74" event={"ID":"58c83603-7870-44e5-ab1f-6751e80f8562","Type":"ContainerStarted","Data":"bcc657dea51176af51e1ddd6bf45679c2dd105d64aede389b1fa799881b961af"} Apr 23 17:45:02.017650 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:45:02.017610 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-mndcn_4c1413a9-cb8c-4052-b6e8-03a3e48b07e4/networking-console-plugin/0.log" Apr 23 17:45:03.418529 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:45:03.418504 2581 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_dns-default-lg22b_3160553d-dd23-411d-afee-16ff4b1a7ca7/dns/0.log" Apr 23 17:45:03.618545 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:45:03.618507 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lg22b_3160553d-dd23-411d-afee-16ff4b1a7ca7/kube-rbac-proxy/0.log" Apr 23 17:45:03.756162 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:45:03.756093 2581 patch_prober.go:28] interesting pod/image-registry-747847cfd6-g5xmr container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:45:03.756162 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:45:03.756143 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" podUID="a8560822-ef3b-47a7-b587-297593afaeeb" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:45:04.017536 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:45:04.017463 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-mtx92_8e736d0c-df1a-43ad-8265-6b52f1993b03/dns-node-resolver/0.log" Apr 23 17:45:04.618630 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:45:04.618592 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-747847cfd6-g5xmr_a8560822-ef3b-47a7-b587-297593afaeeb/registry/0.log" Apr 23 17:45:05.017913 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:45:05.017878 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-6lc2b_600448d4-6de5-4739-b9f6-281d9613622c/node-ca/0.log" Apr 23 17:45:05.816727 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:45:05.816700 2581 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-canary_ingress-canary-jw7bs_e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9/serve-healthcheck-canary/0.log" Apr 23 17:45:07.618187 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:45:07.618134 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-tr4s7_dc9e19bd-f4ca-4fa7-8442-d784513fd798/csi-driver/0.log" Apr 23 17:45:07.823915 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:45:07.823876 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-tr4s7_dc9e19bd-f4ca-4fa7-8442-d784513fd798/csi-node-driver-registrar/0.log" Apr 23 17:45:08.017857 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:45:08.017823 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-tr4s7_dc9e19bd-f4ca-4fa7-8442-d784513fd798/csi-liveness-probe/0.log" Apr 23 17:45:08.770427 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:45:08.770379 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" podUID="a8560822-ef3b-47a7-b587-297593afaeeb" containerName="registry" containerID="cri-o://476b18a61bdf32780aa536bcb020f2edfadad27366172df40c6465b3f642f77a" gracePeriod=30 Apr 23 17:45:09.419666 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:45:09.419630 2581 generic.go:358] "Generic (PLEG): container finished" podID="a8560822-ef3b-47a7-b587-297593afaeeb" containerID="476b18a61bdf32780aa536bcb020f2edfadad27366172df40c6465b3f642f77a" exitCode=0 Apr 23 17:45:09.419935 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:45:09.419698 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" event={"ID":"a8560822-ef3b-47a7-b587-297593afaeeb","Type":"ContainerDied","Data":"476b18a61bdf32780aa536bcb020f2edfadad27366172df40c6465b3f642f77a"} Apr 23 17:45:09.419935 ip-10-0-143-218 
kubenswrapper[2581]: I0423 17:45:09.419730 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" event={"ID":"a8560822-ef3b-47a7-b587-297593afaeeb","Type":"ContainerStarted","Data":"8a0502c66652199189f884ac2bd19b0d46ea53ef223d4da7ca4814be59529d04"} Apr 23 17:45:09.419935 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:45:09.419904 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" Apr 23 17:45:23.751321 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:45:23.751289 2581 patch_prober.go:28] interesting pod/image-registry-747847cfd6-g5xmr container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:45:23.751699 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:45:23.751345 2581 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" podUID="a8560822-ef3b-47a7-b587-297593afaeeb" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:45:30.426777 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:45:30.426740 2581 patch_prober.go:28] interesting pod/image-registry-747847cfd6-g5xmr container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:45:30.427243 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:45:30.426802 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" podUID="a8560822-ef3b-47a7-b587-297593afaeeb" 
containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:45:33.613842 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:45:33.613796 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34a08e1b-c531-4efa-80dc-60379820b460-metrics-certs\") pod \"network-metrics-daemon-dtrg9\" (UID: \"34a08e1b-c531-4efa-80dc-60379820b460\") " pod="openshift-multus/network-metrics-daemon-dtrg9" Apr 23 17:45:33.616181 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:45:33.616163 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34a08e1b-c531-4efa-80dc-60379820b460-metrics-certs\") pod \"network-metrics-daemon-dtrg9\" (UID: \"34a08e1b-c531-4efa-80dc-60379820b460\") " pod="openshift-multus/network-metrics-daemon-dtrg9" Apr 23 17:45:33.751031 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:45:33.750996 2581 patch_prober.go:28] interesting pod/image-registry-747847cfd6-g5xmr container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:45:33.751172 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:45:33.751059 2581 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" podUID="a8560822-ef3b-47a7-b587-297593afaeeb" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:45:33.891667 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:45:33.891642 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-wdbsc\"" Apr 23 17:45:33.898481 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:45:33.898456 2581 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dtrg9" Apr 23 17:45:34.017071 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:45:34.017039 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-dtrg9"] Apr 23 17:45:34.019735 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:45:34.019696 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34a08e1b_c531_4efa_80dc_60379820b460.slice/crio-b4e1fb568de57e99567d7051c5434d6bb6e144d011d425f7df4f96baa3c40e02 WatchSource:0}: Error finding container b4e1fb568de57e99567d7051c5434d6bb6e144d011d425f7df4f96baa3c40e02: Status 404 returned error can't find the container with id b4e1fb568de57e99567d7051c5434d6bb6e144d011d425f7df4f96baa3c40e02 Apr 23 17:45:34.479360 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:45:34.479325 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dtrg9" event={"ID":"34a08e1b-c531-4efa-80dc-60379820b460","Type":"ContainerStarted","Data":"b4e1fb568de57e99567d7051c5434d6bb6e144d011d425f7df4f96baa3c40e02"} Apr 23 17:45:35.483730 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:45:35.483693 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dtrg9" event={"ID":"34a08e1b-c531-4efa-80dc-60379820b460","Type":"ContainerStarted","Data":"b1b1da1ae89f21d72d30c92c8f5bd93df891b334e9912ba033a05af36a6f8d7c"} Apr 23 17:45:35.483730 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:45:35.483730 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dtrg9" event={"ID":"34a08e1b-c531-4efa-80dc-60379820b460","Type":"ContainerStarted","Data":"1300448d6eb4398058e9258d4d96dad0590f1a040b7aa28555758c0c4b35b639"} Apr 23 17:45:35.504558 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:45:35.504512 2581 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-dtrg9" podStartSLOduration=252.462608448 podStartE2EDuration="4m13.50450009s" podCreationTimestamp="2026-04-23 17:41:22 +0000 UTC" firstStartedPulling="2026-04-23 17:45:34.021480595 +0000 UTC m=+252.770922374" lastFinishedPulling="2026-04-23 17:45:35.063372223 +0000 UTC m=+253.812814016" observedRunningTime="2026-04-23 17:45:35.503401607 +0000 UTC m=+254.252843403" watchObservedRunningTime="2026-04-23 17:45:35.50450009 +0000 UTC m=+254.253941873" Apr 23 17:45:40.426441 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:45:40.426364 2581 patch_prober.go:28] interesting pod/image-registry-747847cfd6-g5xmr container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:45:40.426441 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:45:40.426415 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" podUID="a8560822-ef3b-47a7-b587-297593afaeeb" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:45:43.752031 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:45:43.751976 2581 patch_prober.go:28] interesting pod/image-registry-747847cfd6-g5xmr container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:45:43.752548 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:45:43.752047 2581 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" 
podUID="a8560822-ef3b-47a7-b587-297593afaeeb" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:45:43.752548 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:45:43.752085 2581 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" Apr 23 17:45:43.752548 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:45:43.752523 2581 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="registry" containerStatusID={"Type":"cri-o","ID":"8a0502c66652199189f884ac2bd19b0d46ea53ef223d4da7ca4814be59529d04"} pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" containerMessage="Container registry failed liveness probe, will be restarted" Apr 23 17:45:43.755767 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:45:43.755734 2581 patch_prober.go:28] interesting pod/image-registry-747847cfd6-g5xmr container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:45:43.755879 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:45:43.755784 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" podUID="a8560822-ef3b-47a7-b587-297593afaeeb" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:45:53.756730 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:45:53.756695 2581 patch_prober.go:28] interesting pod/image-registry-747847cfd6-g5xmr container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 
17:45:53.757131 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:45:53.756754 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" podUID="a8560822-ef3b-47a7-b587-297593afaeeb" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:46:03.756329 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:03.756288 2581 patch_prober.go:28] interesting pod/image-registry-747847cfd6-g5xmr container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:46:03.756696 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:03.756356 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" podUID="a8560822-ef3b-47a7-b587-297593afaeeb" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:46:08.770709 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:08.770668 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" podUID="a8560822-ef3b-47a7-b587-297593afaeeb" containerName="registry" containerID="cri-o://8a0502c66652199189f884ac2bd19b0d46ea53ef223d4da7ca4814be59529d04" gracePeriod=30 Apr 23 17:46:09.565312 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:09.565277 2581 generic.go:358] "Generic (PLEG): container finished" podID="a8560822-ef3b-47a7-b587-297593afaeeb" containerID="8a0502c66652199189f884ac2bd19b0d46ea53ef223d4da7ca4814be59529d04" exitCode=0 Apr 23 17:46:09.565491 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:09.565353 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" 
event={"ID":"a8560822-ef3b-47a7-b587-297593afaeeb","Type":"ContainerDied","Data":"8a0502c66652199189f884ac2bd19b0d46ea53ef223d4da7ca4814be59529d04"} Apr 23 17:46:09.565491 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:09.565390 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" event={"ID":"a8560822-ef3b-47a7-b587-297593afaeeb","Type":"ContainerStarted","Data":"eb68734b136ab069a5fb71c9a98855afa47a9d5e2b3d9f0b5fa712b3fa32bdc2"} Apr 23 17:46:09.565491 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:09.565405 2581 scope.go:117] "RemoveContainer" containerID="476b18a61bdf32780aa536bcb020f2edfadad27366172df40c6465b3f642f77a" Apr 23 17:46:09.565652 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:09.565616 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" Apr 23 17:46:21.699226 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:21.699194 2581 kubelet.go:1628] "Image garbage collection succeeded" Apr 23 17:46:23.751894 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:23.751863 2581 patch_prober.go:28] interesting pod/image-registry-747847cfd6-g5xmr container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:46:23.753609 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:23.751918 2581 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" podUID="a8560822-ef3b-47a7-b587-297593afaeeb" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:46:30.574103 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:30.574055 2581 patch_prober.go:28] interesting pod/image-registry-747847cfd6-g5xmr 
container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:46:30.574457 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:30.574123 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" podUID="a8560822-ef3b-47a7-b587-297593afaeeb" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:46:33.752303 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:33.752273 2581 patch_prober.go:28] interesting pod/image-registry-747847cfd6-g5xmr container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:46:33.752657 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:33.752321 2581 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" podUID="a8560822-ef3b-47a7-b587-297593afaeeb" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:46:40.574210 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:40.574177 2581 patch_prober.go:28] interesting pod/image-registry-747847cfd6-g5xmr container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:46:40.574665 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:40.574229 2581 prober.go:120] "Probe failed" probeType="Readiness" 
pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" podUID="a8560822-ef3b-47a7-b587-297593afaeeb" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:46:42.497407 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.497371 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-8cdd5bc7-hl5n8"] Apr 23 17:46:42.499571 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.499547 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8cdd5bc7-hl5n8" Apr 23 17:46:42.502887 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.502864 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 23 17:46:42.504004 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.503962 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 23 17:46:42.504004 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.503995 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 23 17:46:42.504185 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.503978 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 23 17:46:42.504254 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.504237 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 23 17:46:42.504396 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.504381 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 23 17:46:42.504570 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.504559 2581 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-console\"/\"console-dockercfg-66cmr\"" Apr 23 17:46:42.504648 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.504633 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 23 17:46:42.509323 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.509307 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 23 17:46:42.522438 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.522412 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8cdd5bc7-hl5n8"] Apr 23 17:46:42.568025 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.567962 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg4nj\" (UniqueName: \"kubernetes.io/projected/d6ac3c44-9ac1-4a86-89cc-81579d2d7746-kube-api-access-wg4nj\") pod \"console-8cdd5bc7-hl5n8\" (UID: \"d6ac3c44-9ac1-4a86-89cc-81579d2d7746\") " pod="openshift-console/console-8cdd5bc7-hl5n8" Apr 23 17:46:42.568172 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.568054 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d6ac3c44-9ac1-4a86-89cc-81579d2d7746-console-oauth-config\") pod \"console-8cdd5bc7-hl5n8\" (UID: \"d6ac3c44-9ac1-4a86-89cc-81579d2d7746\") " pod="openshift-console/console-8cdd5bc7-hl5n8" Apr 23 17:46:42.568172 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.568086 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d6ac3c44-9ac1-4a86-89cc-81579d2d7746-oauth-serving-cert\") pod \"console-8cdd5bc7-hl5n8\" (UID: \"d6ac3c44-9ac1-4a86-89cc-81579d2d7746\") " pod="openshift-console/console-8cdd5bc7-hl5n8" Apr 23 17:46:42.568172 
ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.568113 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d6ac3c44-9ac1-4a86-89cc-81579d2d7746-console-serving-cert\") pod \"console-8cdd5bc7-hl5n8\" (UID: \"d6ac3c44-9ac1-4a86-89cc-81579d2d7746\") " pod="openshift-console/console-8cdd5bc7-hl5n8" Apr 23 17:46:42.568172 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.568149 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6ac3c44-9ac1-4a86-89cc-81579d2d7746-trusted-ca-bundle\") pod \"console-8cdd5bc7-hl5n8\" (UID: \"d6ac3c44-9ac1-4a86-89cc-81579d2d7746\") " pod="openshift-console/console-8cdd5bc7-hl5n8" Apr 23 17:46:42.568340 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.568179 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d6ac3c44-9ac1-4a86-89cc-81579d2d7746-service-ca\") pod \"console-8cdd5bc7-hl5n8\" (UID: \"d6ac3c44-9ac1-4a86-89cc-81579d2d7746\") " pod="openshift-console/console-8cdd5bc7-hl5n8" Apr 23 17:46:42.568340 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.568243 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d6ac3c44-9ac1-4a86-89cc-81579d2d7746-console-config\") pod \"console-8cdd5bc7-hl5n8\" (UID: \"d6ac3c44-9ac1-4a86-89cc-81579d2d7746\") " pod="openshift-console/console-8cdd5bc7-hl5n8" Apr 23 17:46:42.604511 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.604479 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-cl2hj"] Apr 23 17:46:42.606600 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.606580 2581 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-cl2hj" Apr 23 17:46:42.610051 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.610029 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 23 17:46:42.610051 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.610049 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 23 17:46:42.610218 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.610180 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-j9sps\"" Apr 23 17:46:42.641648 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.641621 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-cl2hj"] Apr 23 17:46:42.669136 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.669111 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c5a8d4ca-dd18-4b64-918c-68c8b1037169-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-cl2hj\" (UID: \"c5a8d4ca-dd18-4b64-918c-68c8b1037169\") " pod="openshift-insights/insights-runtime-extractor-cl2hj" Apr 23 17:46:42.669267 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.669147 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d6ac3c44-9ac1-4a86-89cc-81579d2d7746-console-oauth-config\") pod \"console-8cdd5bc7-hl5n8\" (UID: \"d6ac3c44-9ac1-4a86-89cc-81579d2d7746\") " pod="openshift-console/console-8cdd5bc7-hl5n8" Apr 23 17:46:42.669267 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.669164 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" 
(UniqueName: \"kubernetes.io/configmap/d6ac3c44-9ac1-4a86-89cc-81579d2d7746-oauth-serving-cert\") pod \"console-8cdd5bc7-hl5n8\" (UID: \"d6ac3c44-9ac1-4a86-89cc-81579d2d7746\") " pod="openshift-console/console-8cdd5bc7-hl5n8" Apr 23 17:46:42.669267 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.669181 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c5a8d4ca-dd18-4b64-918c-68c8b1037169-data-volume\") pod \"insights-runtime-extractor-cl2hj\" (UID: \"c5a8d4ca-dd18-4b64-918c-68c8b1037169\") " pod="openshift-insights/insights-runtime-extractor-cl2hj" Apr 23 17:46:42.669267 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.669239 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d6ac3c44-9ac1-4a86-89cc-81579d2d7746-console-serving-cert\") pod \"console-8cdd5bc7-hl5n8\" (UID: \"d6ac3c44-9ac1-4a86-89cc-81579d2d7746\") " pod="openshift-console/console-8cdd5bc7-hl5n8" Apr 23 17:46:42.669406 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.669299 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c5a8d4ca-dd18-4b64-918c-68c8b1037169-crio-socket\") pod \"insights-runtime-extractor-cl2hj\" (UID: \"c5a8d4ca-dd18-4b64-918c-68c8b1037169\") " pod="openshift-insights/insights-runtime-extractor-cl2hj" Apr 23 17:46:42.669406 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.669323 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6ac3c44-9ac1-4a86-89cc-81579d2d7746-trusted-ca-bundle\") pod \"console-8cdd5bc7-hl5n8\" (UID: \"d6ac3c44-9ac1-4a86-89cc-81579d2d7746\") " pod="openshift-console/console-8cdd5bc7-hl5n8" Apr 23 17:46:42.669406 ip-10-0-143-218 kubenswrapper[2581]: I0423 
17:46:42.669344 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d6ac3c44-9ac1-4a86-89cc-81579d2d7746-service-ca\") pod \"console-8cdd5bc7-hl5n8\" (UID: \"d6ac3c44-9ac1-4a86-89cc-81579d2d7746\") " pod="openshift-console/console-8cdd5bc7-hl5n8" Apr 23 17:46:42.669528 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.669413 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d6ac3c44-9ac1-4a86-89cc-81579d2d7746-console-config\") pod \"console-8cdd5bc7-hl5n8\" (UID: \"d6ac3c44-9ac1-4a86-89cc-81579d2d7746\") " pod="openshift-console/console-8cdd5bc7-hl5n8" Apr 23 17:46:42.669528 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.669455 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctxzk\" (UniqueName: \"kubernetes.io/projected/c5a8d4ca-dd18-4b64-918c-68c8b1037169-kube-api-access-ctxzk\") pod \"insights-runtime-extractor-cl2hj\" (UID: \"c5a8d4ca-dd18-4b64-918c-68c8b1037169\") " pod="openshift-insights/insights-runtime-extractor-cl2hj" Apr 23 17:46:42.669528 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.669512 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c5a8d4ca-dd18-4b64-918c-68c8b1037169-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-cl2hj\" (UID: \"c5a8d4ca-dd18-4b64-918c-68c8b1037169\") " pod="openshift-insights/insights-runtime-extractor-cl2hj" Apr 23 17:46:42.669675 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.669551 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wg4nj\" (UniqueName: \"kubernetes.io/projected/d6ac3c44-9ac1-4a86-89cc-81579d2d7746-kube-api-access-wg4nj\") pod \"console-8cdd5bc7-hl5n8\" (UID: 
\"d6ac3c44-9ac1-4a86-89cc-81579d2d7746\") " pod="openshift-console/console-8cdd5bc7-hl5n8" Apr 23 17:46:42.670170 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.669943 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d6ac3c44-9ac1-4a86-89cc-81579d2d7746-oauth-serving-cert\") pod \"console-8cdd5bc7-hl5n8\" (UID: \"d6ac3c44-9ac1-4a86-89cc-81579d2d7746\") " pod="openshift-console/console-8cdd5bc7-hl5n8" Apr 23 17:46:42.670170 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.670009 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d6ac3c44-9ac1-4a86-89cc-81579d2d7746-service-ca\") pod \"console-8cdd5bc7-hl5n8\" (UID: \"d6ac3c44-9ac1-4a86-89cc-81579d2d7746\") " pod="openshift-console/console-8cdd5bc7-hl5n8" Apr 23 17:46:42.670170 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.670166 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d6ac3c44-9ac1-4a86-89cc-81579d2d7746-console-config\") pod \"console-8cdd5bc7-hl5n8\" (UID: \"d6ac3c44-9ac1-4a86-89cc-81579d2d7746\") " pod="openshift-console/console-8cdd5bc7-hl5n8" Apr 23 17:46:42.670439 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.670405 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6ac3c44-9ac1-4a86-89cc-81579d2d7746-trusted-ca-bundle\") pod \"console-8cdd5bc7-hl5n8\" (UID: \"d6ac3c44-9ac1-4a86-89cc-81579d2d7746\") " pod="openshift-console/console-8cdd5bc7-hl5n8" Apr 23 17:46:42.671791 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.671775 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d6ac3c44-9ac1-4a86-89cc-81579d2d7746-console-oauth-config\") pod \"console-8cdd5bc7-hl5n8\" (UID: 
\"d6ac3c44-9ac1-4a86-89cc-81579d2d7746\") " pod="openshift-console/console-8cdd5bc7-hl5n8" Apr 23 17:46:42.671956 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.671941 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d6ac3c44-9ac1-4a86-89cc-81579d2d7746-console-serving-cert\") pod \"console-8cdd5bc7-hl5n8\" (UID: \"d6ac3c44-9ac1-4a86-89cc-81579d2d7746\") " pod="openshift-console/console-8cdd5bc7-hl5n8" Apr 23 17:46:42.682842 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.682825 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg4nj\" (UniqueName: \"kubernetes.io/projected/d6ac3c44-9ac1-4a86-89cc-81579d2d7746-kube-api-access-wg4nj\") pod \"console-8cdd5bc7-hl5n8\" (UID: \"d6ac3c44-9ac1-4a86-89cc-81579d2d7746\") " pod="openshift-console/console-8cdd5bc7-hl5n8" Apr 23 17:46:42.771332 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.771258 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c5a8d4ca-dd18-4b64-918c-68c8b1037169-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-cl2hj\" (UID: \"c5a8d4ca-dd18-4b64-918c-68c8b1037169\") " pod="openshift-insights/insights-runtime-extractor-cl2hj" Apr 23 17:46:42.771332 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.771303 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c5a8d4ca-dd18-4b64-918c-68c8b1037169-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-cl2hj\" (UID: \"c5a8d4ca-dd18-4b64-918c-68c8b1037169\") " pod="openshift-insights/insights-runtime-extractor-cl2hj" Apr 23 17:46:42.771332 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.771323 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/c5a8d4ca-dd18-4b64-918c-68c8b1037169-data-volume\") pod \"insights-runtime-extractor-cl2hj\" (UID: \"c5a8d4ca-dd18-4b64-918c-68c8b1037169\") " pod="openshift-insights/insights-runtime-extractor-cl2hj" Apr 23 17:46:42.771565 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.771348 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c5a8d4ca-dd18-4b64-918c-68c8b1037169-crio-socket\") pod \"insights-runtime-extractor-cl2hj\" (UID: \"c5a8d4ca-dd18-4b64-918c-68c8b1037169\") " pod="openshift-insights/insights-runtime-extractor-cl2hj" Apr 23 17:46:42.771565 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.771390 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctxzk\" (UniqueName: \"kubernetes.io/projected/c5a8d4ca-dd18-4b64-918c-68c8b1037169-kube-api-access-ctxzk\") pod \"insights-runtime-extractor-cl2hj\" (UID: \"c5a8d4ca-dd18-4b64-918c-68c8b1037169\") " pod="openshift-insights/insights-runtime-extractor-cl2hj" Apr 23 17:46:42.771565 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.771500 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c5a8d4ca-dd18-4b64-918c-68c8b1037169-crio-socket\") pod \"insights-runtime-extractor-cl2hj\" (UID: \"c5a8d4ca-dd18-4b64-918c-68c8b1037169\") " pod="openshift-insights/insights-runtime-extractor-cl2hj" Apr 23 17:46:42.771686 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.771668 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c5a8d4ca-dd18-4b64-918c-68c8b1037169-data-volume\") pod \"insights-runtime-extractor-cl2hj\" (UID: \"c5a8d4ca-dd18-4b64-918c-68c8b1037169\") " pod="openshift-insights/insights-runtime-extractor-cl2hj" Apr 23 17:46:42.771768 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.771752 2581 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c5a8d4ca-dd18-4b64-918c-68c8b1037169-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-cl2hj\" (UID: \"c5a8d4ca-dd18-4b64-918c-68c8b1037169\") " pod="openshift-insights/insights-runtime-extractor-cl2hj" Apr 23 17:46:42.773629 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.773609 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c5a8d4ca-dd18-4b64-918c-68c8b1037169-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-cl2hj\" (UID: \"c5a8d4ca-dd18-4b64-918c-68c8b1037169\") " pod="openshift-insights/insights-runtime-extractor-cl2hj" Apr 23 17:46:42.794050 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.794022 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctxzk\" (UniqueName: \"kubernetes.io/projected/c5a8d4ca-dd18-4b64-918c-68c8b1037169-kube-api-access-ctxzk\") pod \"insights-runtime-extractor-cl2hj\" (UID: \"c5a8d4ca-dd18-4b64-918c-68c8b1037169\") " pod="openshift-insights/insights-runtime-extractor-cl2hj" Apr 23 17:46:42.810023 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.809998 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8cdd5bc7-hl5n8" Apr 23 17:46:42.915419 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.915394 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-cl2hj" Apr 23 17:46:42.929203 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.929178 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8cdd5bc7-hl5n8"] Apr 23 17:46:42.931974 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:46:42.931938 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6ac3c44_9ac1_4a86_89cc_81579d2d7746.slice/crio-7ab61c2045eb680b1e35e799dc0fb306063d822e60358d756ba9ff9b6f2bd810 WatchSource:0}: Error finding container 7ab61c2045eb680b1e35e799dc0fb306063d822e60358d756ba9ff9b6f2bd810: Status 404 returned error can't find the container with id 7ab61c2045eb680b1e35e799dc0fb306063d822e60358d756ba9ff9b6f2bd810 Apr 23 17:46:42.933965 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:42.933947 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 17:46:43.035290 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:43.035212 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-cl2hj"] Apr 23 17:46:43.038025 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:46:43.038001 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5a8d4ca_dd18_4b64_918c_68c8b1037169.slice/crio-050bb54e0ccd51b24cdf8205dd8c8a840e67971e967cc783b0e340dc3f511b96 WatchSource:0}: Error finding container 050bb54e0ccd51b24cdf8205dd8c8a840e67971e967cc783b0e340dc3f511b96: Status 404 returned error can't find the container with id 050bb54e0ccd51b24cdf8205dd8c8a840e67971e967cc783b0e340dc3f511b96 Apr 23 17:46:43.652765 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:43.652730 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cl2hj" 
event={"ID":"c5a8d4ca-dd18-4b64-918c-68c8b1037169","Type":"ContainerStarted","Data":"43fffca1a9300cd5e8728e862498f59f7395a4a5df0d3647b10fa54930a4694f"} Apr 23 17:46:43.653198 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:43.652779 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cl2hj" event={"ID":"c5a8d4ca-dd18-4b64-918c-68c8b1037169","Type":"ContainerStarted","Data":"050bb54e0ccd51b24cdf8205dd8c8a840e67971e967cc783b0e340dc3f511b96"} Apr 23 17:46:43.653841 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:43.653821 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8cdd5bc7-hl5n8" event={"ID":"d6ac3c44-9ac1-4a86-89cc-81579d2d7746","Type":"ContainerStarted","Data":"7ab61c2045eb680b1e35e799dc0fb306063d822e60358d756ba9ff9b6f2bd810"} Apr 23 17:46:43.751711 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:43.751667 2581 patch_prober.go:28] interesting pod/image-registry-747847cfd6-g5xmr container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:46:43.751880 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:43.751722 2581 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" podUID="a8560822-ef3b-47a7-b587-297593afaeeb" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:46:43.751880 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:43.751768 2581 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" Apr 23 17:46:43.752262 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:43.752240 2581 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="registry" 
containerStatusID={"Type":"cri-o","ID":"eb68734b136ab069a5fb71c9a98855afa47a9d5e2b3d9f0b5fa712b3fa32bdc2"} pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" containerMessage="Container registry failed liveness probe, will be restarted" Apr 23 17:46:43.755959 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:43.755936 2581 patch_prober.go:28] interesting pod/image-registry-747847cfd6-g5xmr container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:46:43.756097 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:43.755977 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" podUID="a8560822-ef3b-47a7-b587-297593afaeeb" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:46:44.659147 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:44.659107 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cl2hj" event={"ID":"c5a8d4ca-dd18-4b64-918c-68c8b1037169","Type":"ContainerStarted","Data":"f809a6936587e2f7c126d65a81073e9de7e81e0e97d3d27ac846622d506383b4"} Apr 23 17:46:45.047932 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:45.047839 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-9dbkc"] Apr 23 17:46:45.050157 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:45.050130 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-9dbkc" Apr 23 17:46:45.056324 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:45.056295 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 23 17:46:45.056436 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:45.056343 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 23 17:46:45.056436 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:45.056344 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 23 17:46:45.056551 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:45.056303 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 23 17:46:45.056747 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:45.056726 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 23 17:46:45.056889 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:45.056731 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-6r2sr\"" Apr 23 17:46:45.072840 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:45.072815 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-9dbkc"] Apr 23 17:46:45.191870 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:45.191735 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/32480919-8b56-4ecd-9904-1e7674fb05bd-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-9dbkc\" 
(UID: \"32480919-8b56-4ecd-9904-1e7674fb05bd\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9dbkc" Apr 23 17:46:45.191870 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:45.191827 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/32480919-8b56-4ecd-9904-1e7674fb05bd-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-9dbkc\" (UID: \"32480919-8b56-4ecd-9904-1e7674fb05bd\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9dbkc" Apr 23 17:46:45.191870 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:45.191868 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/32480919-8b56-4ecd-9904-1e7674fb05bd-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-9dbkc\" (UID: \"32480919-8b56-4ecd-9904-1e7674fb05bd\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9dbkc" Apr 23 17:46:45.192169 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:45.191891 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r46gs\" (UniqueName: \"kubernetes.io/projected/32480919-8b56-4ecd-9904-1e7674fb05bd-kube-api-access-r46gs\") pod \"prometheus-operator-5676c8c784-9dbkc\" (UID: \"32480919-8b56-4ecd-9904-1e7674fb05bd\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9dbkc" Apr 23 17:46:45.293250 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:45.293207 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/32480919-8b56-4ecd-9904-1e7674fb05bd-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-9dbkc\" (UID: \"32480919-8b56-4ecd-9904-1e7674fb05bd\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9dbkc" Apr 23 17:46:45.293445 ip-10-0-143-218 
kubenswrapper[2581]: I0423 17:46:45.293261 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/32480919-8b56-4ecd-9904-1e7674fb05bd-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-9dbkc\" (UID: \"32480919-8b56-4ecd-9904-1e7674fb05bd\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9dbkc" Apr 23 17:46:45.293445 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:45.293286 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r46gs\" (UniqueName: \"kubernetes.io/projected/32480919-8b56-4ecd-9904-1e7674fb05bd-kube-api-access-r46gs\") pod \"prometheus-operator-5676c8c784-9dbkc\" (UID: \"32480919-8b56-4ecd-9904-1e7674fb05bd\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9dbkc" Apr 23 17:46:45.293445 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:45.293332 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/32480919-8b56-4ecd-9904-1e7674fb05bd-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-9dbkc\" (UID: \"32480919-8b56-4ecd-9904-1e7674fb05bd\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9dbkc" Apr 23 17:46:45.294037 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:45.293999 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/32480919-8b56-4ecd-9904-1e7674fb05bd-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-9dbkc\" (UID: \"32480919-8b56-4ecd-9904-1e7674fb05bd\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9dbkc" Apr 23 17:46:45.296093 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:45.296061 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/32480919-8b56-4ecd-9904-1e7674fb05bd-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-9dbkc\" (UID: \"32480919-8b56-4ecd-9904-1e7674fb05bd\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9dbkc" Apr 23 17:46:45.296372 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:45.296353 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/32480919-8b56-4ecd-9904-1e7674fb05bd-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-9dbkc\" (UID: \"32480919-8b56-4ecd-9904-1e7674fb05bd\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9dbkc" Apr 23 17:46:45.309815 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:45.309753 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r46gs\" (UniqueName: \"kubernetes.io/projected/32480919-8b56-4ecd-9904-1e7674fb05bd-kube-api-access-r46gs\") pod \"prometheus-operator-5676c8c784-9dbkc\" (UID: \"32480919-8b56-4ecd-9904-1e7674fb05bd\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9dbkc" Apr 23 17:46:45.361204 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:45.361170 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-9dbkc" Apr 23 17:46:46.125557 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:46.125529 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-9dbkc"] Apr 23 17:46:46.127845 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:46:46.127813 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32480919_8b56_4ecd_9904_1e7674fb05bd.slice/crio-03ee844005c65e15b9eb4b13d582229880464c429641ca4544ed6ed7d9cd238a WatchSource:0}: Error finding container 03ee844005c65e15b9eb4b13d582229880464c429641ca4544ed6ed7d9cd238a: Status 404 returned error can't find the container with id 03ee844005c65e15b9eb4b13d582229880464c429641ca4544ed6ed7d9cd238a Apr 23 17:46:46.666057 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:46.666011 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8cdd5bc7-hl5n8" event={"ID":"d6ac3c44-9ac1-4a86-89cc-81579d2d7746","Type":"ContainerStarted","Data":"803b0b9c8e3a1852412ed590424d322d127280927345254bee180198374d7498"} Apr 23 17:46:46.667365 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:46.667330 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-9dbkc" event={"ID":"32480919-8b56-4ecd-9904-1e7674fb05bd","Type":"ContainerStarted","Data":"03ee844005c65e15b9eb4b13d582229880464c429641ca4544ed6ed7d9cd238a"} Apr 23 17:46:46.669345 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:46.669322 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cl2hj" event={"ID":"c5a8d4ca-dd18-4b64-918c-68c8b1037169","Type":"ContainerStarted","Data":"e28e11597644c89f3161ec5f7ea89d217d0294b9b38151b11e64042c05105b01"} Apr 23 17:46:46.698350 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:46.698300 2581 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-8cdd5bc7-hl5n8" podStartSLOduration=1.63912298 podStartE2EDuration="4.698282533s" podCreationTimestamp="2026-04-23 17:46:42 +0000 UTC" firstStartedPulling="2026-04-23 17:46:42.934108995 +0000 UTC m=+321.683550775" lastFinishedPulling="2026-04-23 17:46:45.993268546 +0000 UTC m=+324.742710328" observedRunningTime="2026-04-23 17:46:46.696945814 +0000 UTC m=+325.446387651" watchObservedRunningTime="2026-04-23 17:46:46.698282533 +0000 UTC m=+325.447724334" Apr 23 17:46:47.674170 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:47.674127 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-9dbkc" event={"ID":"32480919-8b56-4ecd-9904-1e7674fb05bd","Type":"ContainerStarted","Data":"ea80e5a01ad3b85a4f1b62184a3bfaa48daf18bffcc25f465d5c85381ca31aab"} Apr 23 17:46:47.674170 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:47.674176 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-9dbkc" event={"ID":"32480919-8b56-4ecd-9904-1e7674fb05bd","Type":"ContainerStarted","Data":"40e6f85ccd245fbc5917e39f4683aa5355c83c926f029eb09743a9090da47ff6"} Apr 23 17:46:47.707310 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:47.707258 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-cl2hj" podStartSLOduration=2.806119495 podStartE2EDuration="5.707243027s" podCreationTimestamp="2026-04-23 17:46:42 +0000 UTC" firstStartedPulling="2026-04-23 17:46:43.092672795 +0000 UTC m=+321.842114574" lastFinishedPulling="2026-04-23 17:46:45.993796314 +0000 UTC m=+324.743238106" observedRunningTime="2026-04-23 17:46:46.725078928 +0000 UTC m=+325.474520731" watchObservedRunningTime="2026-04-23 17:46:47.707243027 +0000 UTC m=+326.456684828" Apr 23 17:46:49.827835 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:49.827782 
2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-9dbkc" podStartSLOduration=3.7542342140000002 podStartE2EDuration="4.827766426s" podCreationTimestamp="2026-04-23 17:46:45 +0000 UTC" firstStartedPulling="2026-04-23 17:46:46.129545297 +0000 UTC m=+324.878987076" lastFinishedPulling="2026-04-23 17:46:47.20307751 +0000 UTC m=+325.952519288" observedRunningTime="2026-04-23 17:46:47.707786515 +0000 UTC m=+326.457228330" watchObservedRunningTime="2026-04-23 17:46:49.827766426 +0000 UTC m=+328.577208227" Apr 23 17:46:49.828777 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:49.828758 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-d67bp"] Apr 23 17:46:49.831144 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:49.831128 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-d67bp" Apr 23 17:46:49.835101 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:49.835077 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 23 17:46:49.835540 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:49.835513 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 23 17:46:49.835703 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:49.835687 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 23 17:46:49.835777 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:49.835747 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-hb422\"" Apr 23 17:46:49.906834 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:49.906806 2581 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-cms96"] Apr 23 17:46:49.909208 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:49.909190 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cms96" Apr 23 17:46:49.915811 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:49.915793 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 23 17:46:49.916395 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:49.916377 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 23 17:46:49.916536 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:49.916520 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-xrgf7\"" Apr 23 17:46:49.924211 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:49.924191 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-cms96"] Apr 23 17:46:49.928445 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:49.928425 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/21c80999-7322-4870-ae8a-5f6957e51999-node-exporter-textfile\") pod \"node-exporter-d67bp\" (UID: \"21c80999-7322-4870-ae8a-5f6957e51999\") " pod="openshift-monitoring/node-exporter-d67bp" Apr 23 17:46:49.928544 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:49.928458 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/21c80999-7322-4870-ae8a-5f6957e51999-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-d67bp\" (UID: 
\"21c80999-7322-4870-ae8a-5f6957e51999\") " pod="openshift-monitoring/node-exporter-d67bp" Apr 23 17:46:49.928544 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:49.928486 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2225c\" (UniqueName: \"kubernetes.io/projected/21c80999-7322-4870-ae8a-5f6957e51999-kube-api-access-2225c\") pod \"node-exporter-d67bp\" (UID: \"21c80999-7322-4870-ae8a-5f6957e51999\") " pod="openshift-monitoring/node-exporter-d67bp" Apr 23 17:46:49.928638 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:49.928560 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/21c80999-7322-4870-ae8a-5f6957e51999-node-exporter-tls\") pod \"node-exporter-d67bp\" (UID: \"21c80999-7322-4870-ae8a-5f6957e51999\") " pod="openshift-monitoring/node-exporter-d67bp" Apr 23 17:46:49.928638 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:49.928605 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/21c80999-7322-4870-ae8a-5f6957e51999-sys\") pod \"node-exporter-d67bp\" (UID: \"21c80999-7322-4870-ae8a-5f6957e51999\") " pod="openshift-monitoring/node-exporter-d67bp" Apr 23 17:46:49.928638 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:49.928636 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/21c80999-7322-4870-ae8a-5f6957e51999-node-exporter-accelerators-collector-config\") pod \"node-exporter-d67bp\" (UID: \"21c80999-7322-4870-ae8a-5f6957e51999\") " pod="openshift-monitoring/node-exporter-d67bp" Apr 23 17:46:49.928752 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:49.928658 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"root\" (UniqueName: \"kubernetes.io/host-path/21c80999-7322-4870-ae8a-5f6957e51999-root\") pod \"node-exporter-d67bp\" (UID: \"21c80999-7322-4870-ae8a-5f6957e51999\") " pod="openshift-monitoring/node-exporter-d67bp" Apr 23 17:46:49.928752 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:49.928676 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/21c80999-7322-4870-ae8a-5f6957e51999-node-exporter-wtmp\") pod \"node-exporter-d67bp\" (UID: \"21c80999-7322-4870-ae8a-5f6957e51999\") " pod="openshift-monitoring/node-exporter-d67bp" Apr 23 17:46:49.928752 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:49.928715 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/21c80999-7322-4870-ae8a-5f6957e51999-metrics-client-ca\") pod \"node-exporter-d67bp\" (UID: \"21c80999-7322-4870-ae8a-5f6957e51999\") " pod="openshift-monitoring/node-exporter-d67bp" Apr 23 17:46:50.029891 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:50.029853 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/21c80999-7322-4870-ae8a-5f6957e51999-sys\") pod \"node-exporter-d67bp\" (UID: \"21c80999-7322-4870-ae8a-5f6957e51999\") " pod="openshift-monitoring/node-exporter-d67bp" Apr 23 17:46:50.030089 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:50.029894 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2d55cc26-3696-48b3-88f6-736b38316363-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-cms96\" (UID: \"2d55cc26-3696-48b3-88f6-736b38316363\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cms96" Apr 23 17:46:50.030089 ip-10-0-143-218 kubenswrapper[2581]: I0423 
17:46:50.029946 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/21c80999-7322-4870-ae8a-5f6957e51999-node-exporter-accelerators-collector-config\") pod \"node-exporter-d67bp\" (UID: \"21c80999-7322-4870-ae8a-5f6957e51999\") " pod="openshift-monitoring/node-exporter-d67bp" Apr 23 17:46:50.030089 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:50.030003 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2d55cc26-3696-48b3-88f6-736b38316363-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-cms96\" (UID: \"2d55cc26-3696-48b3-88f6-736b38316363\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cms96" Apr 23 17:46:50.030089 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:50.029956 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/21c80999-7322-4870-ae8a-5f6957e51999-sys\") pod \"node-exporter-d67bp\" (UID: \"21c80999-7322-4870-ae8a-5f6957e51999\") " pod="openshift-monitoring/node-exporter-d67bp" Apr 23 17:46:50.030089 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:50.030026 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/21c80999-7322-4870-ae8a-5f6957e51999-root\") pod \"node-exporter-d67bp\" (UID: \"21c80999-7322-4870-ae8a-5f6957e51999\") " pod="openshift-monitoring/node-exporter-d67bp" Apr 23 17:46:50.030089 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:50.030060 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/21c80999-7322-4870-ae8a-5f6957e51999-root\") pod \"node-exporter-d67bp\" (UID: \"21c80999-7322-4870-ae8a-5f6957e51999\") " 
pod="openshift-monitoring/node-exporter-d67bp" Apr 23 17:46:50.030089 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:50.030073 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/21c80999-7322-4870-ae8a-5f6957e51999-node-exporter-wtmp\") pod \"node-exporter-d67bp\" (UID: \"21c80999-7322-4870-ae8a-5f6957e51999\") " pod="openshift-monitoring/node-exporter-d67bp" Apr 23 17:46:50.030452 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:50.030137 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/21c80999-7322-4870-ae8a-5f6957e51999-metrics-client-ca\") pod \"node-exporter-d67bp\" (UID: \"21c80999-7322-4870-ae8a-5f6957e51999\") " pod="openshift-monitoring/node-exporter-d67bp" Apr 23 17:46:50.030452 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:50.030165 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/21c80999-7322-4870-ae8a-5f6957e51999-node-exporter-textfile\") pod \"node-exporter-d67bp\" (UID: \"21c80999-7322-4870-ae8a-5f6957e51999\") " pod="openshift-monitoring/node-exporter-d67bp" Apr 23 17:46:50.030452 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:50.030181 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/21c80999-7322-4870-ae8a-5f6957e51999-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-d67bp\" (UID: \"21c80999-7322-4870-ae8a-5f6957e51999\") " pod="openshift-monitoring/node-exporter-d67bp" Apr 23 17:46:50.030452 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:50.030188 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/21c80999-7322-4870-ae8a-5f6957e51999-node-exporter-wtmp\") 
pod \"node-exporter-d67bp\" (UID: \"21c80999-7322-4870-ae8a-5f6957e51999\") " pod="openshift-monitoring/node-exporter-d67bp" Apr 23 17:46:50.030452 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:50.030204 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2225c\" (UniqueName: \"kubernetes.io/projected/21c80999-7322-4870-ae8a-5f6957e51999-kube-api-access-2225c\") pod \"node-exporter-d67bp\" (UID: \"21c80999-7322-4870-ae8a-5f6957e51999\") " pod="openshift-monitoring/node-exporter-d67bp" Apr 23 17:46:50.030452 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:50.030232 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfbql\" (UniqueName: \"kubernetes.io/projected/2d55cc26-3696-48b3-88f6-736b38316363-kube-api-access-kfbql\") pod \"openshift-state-metrics-9d44df66c-cms96\" (UID: \"2d55cc26-3696-48b3-88f6-736b38316363\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cms96" Apr 23 17:46:50.030452 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:50.030276 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2d55cc26-3696-48b3-88f6-736b38316363-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-cms96\" (UID: \"2d55cc26-3696-48b3-88f6-736b38316363\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cms96" Apr 23 17:46:50.030452 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:50.030306 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/21c80999-7322-4870-ae8a-5f6957e51999-node-exporter-tls\") pod \"node-exporter-d67bp\" (UID: \"21c80999-7322-4870-ae8a-5f6957e51999\") " pod="openshift-monitoring/node-exporter-d67bp" Apr 23 17:46:50.030759 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:50.030590 2581 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/21c80999-7322-4870-ae8a-5f6957e51999-node-exporter-accelerators-collector-config\") pod \"node-exporter-d67bp\" (UID: \"21c80999-7322-4870-ae8a-5f6957e51999\") " pod="openshift-monitoring/node-exporter-d67bp" Apr 23 17:46:50.031215 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:50.031195 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/21c80999-7322-4870-ae8a-5f6957e51999-node-exporter-textfile\") pod \"node-exporter-d67bp\" (UID: \"21c80999-7322-4870-ae8a-5f6957e51999\") " pod="openshift-monitoring/node-exporter-d67bp" Apr 23 17:46:50.031439 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:50.031410 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/21c80999-7322-4870-ae8a-5f6957e51999-metrics-client-ca\") pod \"node-exporter-d67bp\" (UID: \"21c80999-7322-4870-ae8a-5f6957e51999\") " pod="openshift-monitoring/node-exporter-d67bp" Apr 23 17:46:50.032680 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:50.032663 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/21c80999-7322-4870-ae8a-5f6957e51999-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-d67bp\" (UID: \"21c80999-7322-4870-ae8a-5f6957e51999\") " pod="openshift-monitoring/node-exporter-d67bp" Apr 23 17:46:50.033187 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:50.033172 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/21c80999-7322-4870-ae8a-5f6957e51999-node-exporter-tls\") pod \"node-exporter-d67bp\" (UID: \"21c80999-7322-4870-ae8a-5f6957e51999\") " 
pod="openshift-monitoring/node-exporter-d67bp" Apr 23 17:46:50.049069 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:50.049041 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2225c\" (UniqueName: \"kubernetes.io/projected/21c80999-7322-4870-ae8a-5f6957e51999-kube-api-access-2225c\") pod \"node-exporter-d67bp\" (UID: \"21c80999-7322-4870-ae8a-5f6957e51999\") " pod="openshift-monitoring/node-exporter-d67bp" Apr 23 17:46:50.131581 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:50.131503 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2d55cc26-3696-48b3-88f6-736b38316363-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-cms96\" (UID: \"2d55cc26-3696-48b3-88f6-736b38316363\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cms96" Apr 23 17:46:50.131581 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:50.131541 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2d55cc26-3696-48b3-88f6-736b38316363-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-cms96\" (UID: \"2d55cc26-3696-48b3-88f6-736b38316363\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cms96" Apr 23 17:46:50.131750 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:50.131586 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kfbql\" (UniqueName: \"kubernetes.io/projected/2d55cc26-3696-48b3-88f6-736b38316363-kube-api-access-kfbql\") pod \"openshift-state-metrics-9d44df66c-cms96\" (UID: \"2d55cc26-3696-48b3-88f6-736b38316363\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cms96" Apr 23 17:46:50.131750 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:50.131610 2581 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2d55cc26-3696-48b3-88f6-736b38316363-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-cms96\" (UID: \"2d55cc26-3696-48b3-88f6-736b38316363\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cms96" Apr 23 17:46:50.132322 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:50.132225 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2d55cc26-3696-48b3-88f6-736b38316363-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-cms96\" (UID: \"2d55cc26-3696-48b3-88f6-736b38316363\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cms96" Apr 23 17:46:50.134149 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:50.134127 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2d55cc26-3696-48b3-88f6-736b38316363-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-cms96\" (UID: \"2d55cc26-3696-48b3-88f6-736b38316363\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cms96" Apr 23 17:46:50.134231 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:50.134130 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2d55cc26-3696-48b3-88f6-736b38316363-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-cms96\" (UID: \"2d55cc26-3696-48b3-88f6-736b38316363\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cms96" Apr 23 17:46:50.143040 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:50.143018 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-d67bp" Apr 23 17:46:50.146492 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:50.146473 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfbql\" (UniqueName: \"kubernetes.io/projected/2d55cc26-3696-48b3-88f6-736b38316363-kube-api-access-kfbql\") pod \"openshift-state-metrics-9d44df66c-cms96\" (UID: \"2d55cc26-3696-48b3-88f6-736b38316363\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cms96" Apr 23 17:46:50.150879 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:46:50.150854 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21c80999_7322_4870_ae8a_5f6957e51999.slice/crio-25754643db44799b5992cf933c22b917fe212ab449722889ebb26b8a1b2787d4 WatchSource:0}: Error finding container 25754643db44799b5992cf933c22b917fe212ab449722889ebb26b8a1b2787d4: Status 404 returned error can't find the container with id 25754643db44799b5992cf933c22b917fe212ab449722889ebb26b8a1b2787d4 Apr 23 17:46:50.218672 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:50.218643 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cms96" Apr 23 17:46:50.342404 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:50.342366 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-cms96"] Apr 23 17:46:50.345165 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:46:50.345129 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d55cc26_3696_48b3_88f6_736b38316363.slice/crio-be982c73dd844aa339683d3608682a7cde0c5f156912f935dab7e1fa7e58e5bb WatchSource:0}: Error finding container be982c73dd844aa339683d3608682a7cde0c5f156912f935dab7e1fa7e58e5bb: Status 404 returned error can't find the container with id be982c73dd844aa339683d3608682a7cde0c5f156912f935dab7e1fa7e58e5bb Apr 23 17:46:50.684358 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:50.684295 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-d67bp" event={"ID":"21c80999-7322-4870-ae8a-5f6957e51999","Type":"ContainerStarted","Data":"25754643db44799b5992cf933c22b917fe212ab449722889ebb26b8a1b2787d4"} Apr 23 17:46:50.686442 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:50.686401 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cms96" event={"ID":"2d55cc26-3696-48b3-88f6-736b38316363","Type":"ContainerStarted","Data":"21e85f23084a545e201dbb0b2fc0d8d1fa9e1409ca314ceaaafe73cd6bb8c026"} Apr 23 17:46:50.686442 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:50.686440 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cms96" event={"ID":"2d55cc26-3696-48b3-88f6-736b38316363","Type":"ContainerStarted","Data":"345d60210a3a940b9c9b94dc48f9ceb2339586e649c2596f70506903dd130198"} Apr 23 17:46:50.686630 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:50.686453 2581 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cms96" event={"ID":"2d55cc26-3696-48b3-88f6-736b38316363","Type":"ContainerStarted","Data":"be982c73dd844aa339683d3608682a7cde0c5f156912f935dab7e1fa7e58e5bb"} Apr 23 17:46:51.692299 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:51.692263 2581 generic.go:358] "Generic (PLEG): container finished" podID="21c80999-7322-4870-ae8a-5f6957e51999" containerID="3c031b74102951ccdedfb3aee0a98dca914f4a14456ee26f177ec66821958df8" exitCode=0 Apr 23 17:46:51.692976 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:51.692360 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-d67bp" event={"ID":"21c80999-7322-4870-ae8a-5f6957e51999","Type":"ContainerDied","Data":"3c031b74102951ccdedfb3aee0a98dca914f4a14456ee26f177ec66821958df8"} Apr 23 17:46:51.694131 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:51.694108 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cms96" event={"ID":"2d55cc26-3696-48b3-88f6-736b38316363","Type":"ContainerStarted","Data":"0019a79bd9dc248261f92feb5c17c48d67ce7ead46b71173aa8d68385c5a3cdf"} Apr 23 17:46:51.752076 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:51.751974 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cms96" podStartSLOduration=1.926435255 podStartE2EDuration="2.751961106s" podCreationTimestamp="2026-04-23 17:46:49 +0000 UTC" firstStartedPulling="2026-04-23 17:46:50.457022846 +0000 UTC m=+329.206464625" lastFinishedPulling="2026-04-23 17:46:51.282548682 +0000 UTC m=+330.031990476" observedRunningTime="2026-04-23 17:46:51.751183585 +0000 UTC m=+330.500625397" watchObservedRunningTime="2026-04-23 17:46:51.751961106 +0000 UTC m=+330.501402959" Apr 23 17:46:52.698618 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:52.698571 2581 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-d67bp" event={"ID":"21c80999-7322-4870-ae8a-5f6957e51999","Type":"ContainerStarted","Data":"09d3280fdff68cd87e8ff0fb50d4eb3b78f9f37bbaf9c23e6f02de4888996caa"} Apr 23 17:46:52.698618 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:52.698617 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-d67bp" event={"ID":"21c80999-7322-4870-ae8a-5f6957e51999","Type":"ContainerStarted","Data":"b4e23510319d47d4cc6b80f49424be7fb884136cb3f5c77352f812859a6afa28"} Apr 23 17:46:52.728534 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:52.728491 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-d67bp" podStartSLOduration=3.077210375 podStartE2EDuration="3.728475291s" podCreationTimestamp="2026-04-23 17:46:49 +0000 UTC" firstStartedPulling="2026-04-23 17:46:50.152579294 +0000 UTC m=+328.902021073" lastFinishedPulling="2026-04-23 17:46:50.803844208 +0000 UTC m=+329.553285989" observedRunningTime="2026-04-23 17:46:52.727350082 +0000 UTC m=+331.476791883" watchObservedRunningTime="2026-04-23 17:46:52.728475291 +0000 UTC m=+331.477917095" Apr 23 17:46:52.810651 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:52.810622 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-8cdd5bc7-hl5n8" Apr 23 17:46:52.810839 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:52.810775 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-8cdd5bc7-hl5n8" Apr 23 17:46:52.815634 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:52.815612 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-8cdd5bc7-hl5n8" Apr 23 17:46:53.705232 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:53.705205 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-console/console-8cdd5bc7-hl5n8" Apr 23 17:46:53.756614 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:53.756589 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" Apr 23 17:46:54.370169 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:54.370132 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-j7zhr"] Apr 23 17:46:54.372349 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:54.372333 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-j7zhr" Apr 23 17:46:54.375995 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:54.375956 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 23 17:46:54.375995 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:54.375968 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-clc87\"" Apr 23 17:46:54.403428 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:54.403399 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-j7zhr"] Apr 23 17:46:54.464690 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:54.464656 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/09b5f2cb-9870-4567-9f10-2c51e6e0ae4a-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-j7zhr\" (UID: \"09b5f2cb-9870-4567-9f10-2c51e6e0ae4a\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-j7zhr" Apr 23 17:46:54.565918 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:54.565884 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/09b5f2cb-9870-4567-9f10-2c51e6e0ae4a-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-j7zhr\" (UID: \"09b5f2cb-9870-4567-9f10-2c51e6e0ae4a\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-j7zhr" Apr 23 17:46:54.566105 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:46:54.566048 2581 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 23 17:46:54.566150 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:46:54.566123 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09b5f2cb-9870-4567-9f10-2c51e6e0ae4a-monitoring-plugin-cert podName:09b5f2cb-9870-4567-9f10-2c51e6e0ae4a nodeName:}" failed. No retries permitted until 2026-04-23 17:46:55.066104429 +0000 UTC m=+333.815546207 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/09b5f2cb-9870-4567-9f10-2c51e6e0ae4a-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-j7zhr" (UID: "09b5f2cb-9870-4567-9f10-2c51e6e0ae4a") : secret "monitoring-plugin-cert" not found Apr 23 17:46:55.069830 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:55.069770 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/09b5f2cb-9870-4567-9f10-2c51e6e0ae4a-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-j7zhr\" (UID: \"09b5f2cb-9870-4567-9f10-2c51e6e0ae4a\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-j7zhr" Apr 23 17:46:55.072736 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:55.072706 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/09b5f2cb-9870-4567-9f10-2c51e6e0ae4a-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-j7zhr\" (UID: \"09b5f2cb-9870-4567-9f10-2c51e6e0ae4a\") " 
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-j7zhr" Apr 23 17:46:55.194942 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:55.194893 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-7df7f7d9c5-97595"] Apr 23 17:46:55.197364 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:55.197348 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-7df7f7d9c5-97595" Apr 23 17:46:55.204207 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:55.204182 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 23 17:46:55.204312 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:55.204194 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 23 17:46:55.204506 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:55.204490 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 23 17:46:55.204601 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:55.204552 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 23 17:46:55.204724 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:55.204711 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 23 17:46:55.210018 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:55.209998 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-dddk9\"" Apr 23 17:46:55.220420 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:55.220396 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 23 17:46:55.221346 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:55.221326 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-7df7f7d9c5-97595"] Apr 23 17:46:55.271330 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:55.271301 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/35919eee-76f1-461d-b532-97c823ac5955-secret-telemeter-client\") pod \"telemeter-client-7df7f7d9c5-97595\" (UID: \"35919eee-76f1-461d-b532-97c823ac5955\") " pod="openshift-monitoring/telemeter-client-7df7f7d9c5-97595" Apr 23 17:46:55.271429 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:55.271341 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35919eee-76f1-461d-b532-97c823ac5955-serving-certs-ca-bundle\") pod \"telemeter-client-7df7f7d9c5-97595\" (UID: \"35919eee-76f1-461d-b532-97c823ac5955\") " pod="openshift-monitoring/telemeter-client-7df7f7d9c5-97595" Apr 23 17:46:55.271429 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:55.271391 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/35919eee-76f1-461d-b532-97c823ac5955-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7df7f7d9c5-97595\" (UID: \"35919eee-76f1-461d-b532-97c823ac5955\") " pod="openshift-monitoring/telemeter-client-7df7f7d9c5-97595" Apr 23 17:46:55.271429 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:55.271409 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/35919eee-76f1-461d-b532-97c823ac5955-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7df7f7d9c5-97595\" (UID: \"35919eee-76f1-461d-b532-97c823ac5955\") " pod="openshift-monitoring/telemeter-client-7df7f7d9c5-97595" Apr 23 17:46:55.271535 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:55.271437 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4x6d\" (UniqueName: \"kubernetes.io/projected/35919eee-76f1-461d-b532-97c823ac5955-kube-api-access-c4x6d\") pod \"telemeter-client-7df7f7d9c5-97595\" (UID: \"35919eee-76f1-461d-b532-97c823ac5955\") " pod="openshift-monitoring/telemeter-client-7df7f7d9c5-97595" Apr 23 17:46:55.271535 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:55.271502 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/35919eee-76f1-461d-b532-97c823ac5955-metrics-client-ca\") pod \"telemeter-client-7df7f7d9c5-97595\" (UID: \"35919eee-76f1-461d-b532-97c823ac5955\") " pod="openshift-monitoring/telemeter-client-7df7f7d9c5-97595" Apr 23 17:46:55.271535 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:55.271532 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/35919eee-76f1-461d-b532-97c823ac5955-telemeter-client-tls\") pod \"telemeter-client-7df7f7d9c5-97595\" (UID: \"35919eee-76f1-461d-b532-97c823ac5955\") " pod="openshift-monitoring/telemeter-client-7df7f7d9c5-97595" Apr 23 17:46:55.271637 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:55.271561 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/35919eee-76f1-461d-b532-97c823ac5955-federate-client-tls\") pod \"telemeter-client-7df7f7d9c5-97595\" (UID: 
\"35919eee-76f1-461d-b532-97c823ac5955\") " pod="openshift-monitoring/telemeter-client-7df7f7d9c5-97595" Apr 23 17:46:55.281816 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:55.281800 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-j7zhr" Apr 23 17:46:55.372780 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:55.372746 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c4x6d\" (UniqueName: \"kubernetes.io/projected/35919eee-76f1-461d-b532-97c823ac5955-kube-api-access-c4x6d\") pod \"telemeter-client-7df7f7d9c5-97595\" (UID: \"35919eee-76f1-461d-b532-97c823ac5955\") " pod="openshift-monitoring/telemeter-client-7df7f7d9c5-97595" Apr 23 17:46:55.372780 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:55.372786 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/35919eee-76f1-461d-b532-97c823ac5955-metrics-client-ca\") pod \"telemeter-client-7df7f7d9c5-97595\" (UID: \"35919eee-76f1-461d-b532-97c823ac5955\") " pod="openshift-monitoring/telemeter-client-7df7f7d9c5-97595" Apr 23 17:46:55.373076 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:55.372805 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/35919eee-76f1-461d-b532-97c823ac5955-telemeter-client-tls\") pod \"telemeter-client-7df7f7d9c5-97595\" (UID: \"35919eee-76f1-461d-b532-97c823ac5955\") " pod="openshift-monitoring/telemeter-client-7df7f7d9c5-97595" Apr 23 17:46:55.373076 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:55.372837 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/35919eee-76f1-461d-b532-97c823ac5955-federate-client-tls\") pod \"telemeter-client-7df7f7d9c5-97595\" (UID: 
\"35919eee-76f1-461d-b532-97c823ac5955\") " pod="openshift-monitoring/telemeter-client-7df7f7d9c5-97595" Apr 23 17:46:55.373076 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:55.372869 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/35919eee-76f1-461d-b532-97c823ac5955-secret-telemeter-client\") pod \"telemeter-client-7df7f7d9c5-97595\" (UID: \"35919eee-76f1-461d-b532-97c823ac5955\") " pod="openshift-monitoring/telemeter-client-7df7f7d9c5-97595" Apr 23 17:46:55.373076 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:55.372908 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35919eee-76f1-461d-b532-97c823ac5955-serving-certs-ca-bundle\") pod \"telemeter-client-7df7f7d9c5-97595\" (UID: \"35919eee-76f1-461d-b532-97c823ac5955\") " pod="openshift-monitoring/telemeter-client-7df7f7d9c5-97595" Apr 23 17:46:55.373076 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:55.372954 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/35919eee-76f1-461d-b532-97c823ac5955-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7df7f7d9c5-97595\" (UID: \"35919eee-76f1-461d-b532-97c823ac5955\") " pod="openshift-monitoring/telemeter-client-7df7f7d9c5-97595" Apr 23 17:46:55.373076 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:55.373000 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35919eee-76f1-461d-b532-97c823ac5955-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7df7f7d9c5-97595\" (UID: \"35919eee-76f1-461d-b532-97c823ac5955\") " pod="openshift-monitoring/telemeter-client-7df7f7d9c5-97595" Apr 23 17:46:55.373577 ip-10-0-143-218 
kubenswrapper[2581]: I0423 17:46:55.373544 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/35919eee-76f1-461d-b532-97c823ac5955-metrics-client-ca\") pod \"telemeter-client-7df7f7d9c5-97595\" (UID: \"35919eee-76f1-461d-b532-97c823ac5955\") " pod="openshift-monitoring/telemeter-client-7df7f7d9c5-97595" Apr 23 17:46:55.373889 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:55.373842 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35919eee-76f1-461d-b532-97c823ac5955-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7df7f7d9c5-97595\" (UID: \"35919eee-76f1-461d-b532-97c823ac5955\") " pod="openshift-monitoring/telemeter-client-7df7f7d9c5-97595" Apr 23 17:46:55.374015 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:55.373882 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35919eee-76f1-461d-b532-97c823ac5955-serving-certs-ca-bundle\") pod \"telemeter-client-7df7f7d9c5-97595\" (UID: \"35919eee-76f1-461d-b532-97c823ac5955\") " pod="openshift-monitoring/telemeter-client-7df7f7d9c5-97595" Apr 23 17:46:55.376113 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:55.376086 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/35919eee-76f1-461d-b532-97c823ac5955-federate-client-tls\") pod \"telemeter-client-7df7f7d9c5-97595\" (UID: \"35919eee-76f1-461d-b532-97c823ac5955\") " pod="openshift-monitoring/telemeter-client-7df7f7d9c5-97595" Apr 23 17:46:55.376314 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:55.376295 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/35919eee-76f1-461d-b532-97c823ac5955-telemeter-client-tls\") pod 
\"telemeter-client-7df7f7d9c5-97595\" (UID: \"35919eee-76f1-461d-b532-97c823ac5955\") " pod="openshift-monitoring/telemeter-client-7df7f7d9c5-97595" Apr 23 17:46:55.376535 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:55.376518 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/35919eee-76f1-461d-b532-97c823ac5955-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7df7f7d9c5-97595\" (UID: \"35919eee-76f1-461d-b532-97c823ac5955\") " pod="openshift-monitoring/telemeter-client-7df7f7d9c5-97595" Apr 23 17:46:55.376703 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:55.376679 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/35919eee-76f1-461d-b532-97c823ac5955-secret-telemeter-client\") pod \"telemeter-client-7df7f7d9c5-97595\" (UID: \"35919eee-76f1-461d-b532-97c823ac5955\") " pod="openshift-monitoring/telemeter-client-7df7f7d9c5-97595" Apr 23 17:46:55.393845 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:55.393817 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4x6d\" (UniqueName: \"kubernetes.io/projected/35919eee-76f1-461d-b532-97c823ac5955-kube-api-access-c4x6d\") pod \"telemeter-client-7df7f7d9c5-97595\" (UID: \"35919eee-76f1-461d-b532-97c823ac5955\") " pod="openshift-monitoring/telemeter-client-7df7f7d9c5-97595" Apr 23 17:46:55.424517 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:55.424496 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-j7zhr"] Apr 23 17:46:55.427729 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:46:55.427709 2581 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09b5f2cb_9870_4567_9f10_2c51e6e0ae4a.slice/crio-23153b2627496e07873fdde268884d377bdfe5ba0110552e762155b3ced0fe26 WatchSource:0}: Error finding container 23153b2627496e07873fdde268884d377bdfe5ba0110552e762155b3ced0fe26: Status 404 returned error can't find the container with id 23153b2627496e07873fdde268884d377bdfe5ba0110552e762155b3ced0fe26 Apr 23 17:46:55.507026 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:55.506998 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-7df7f7d9c5-97595" Apr 23 17:46:55.639042 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:55.639009 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-7df7f7d9c5-97595"] Apr 23 17:46:55.642405 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:46:55.642376 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35919eee_76f1_461d_b532_97c823ac5955.slice/crio-09d6c196788671dd2291248ea41d203133f82fd2ec0c84f07647827072cad910 WatchSource:0}: Error finding container 09d6c196788671dd2291248ea41d203133f82fd2ec0c84f07647827072cad910: Status 404 returned error can't find the container with id 09d6c196788671dd2291248ea41d203133f82fd2ec0c84f07647827072cad910 Apr 23 17:46:55.707537 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:55.707501 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-j7zhr" event={"ID":"09b5f2cb-9870-4567-9f10-2c51e6e0ae4a","Type":"ContainerStarted","Data":"23153b2627496e07873fdde268884d377bdfe5ba0110552e762155b3ced0fe26"} Apr 23 17:46:55.708509 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:55.708484 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7df7f7d9c5-97595" 
event={"ID":"35919eee-76f1-461d-b532-97c823ac5955","Type":"ContainerStarted","Data":"09d6c196788671dd2291248ea41d203133f82fd2ec0c84f07647827072cad910"} Apr 23 17:46:56.089927 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.089301 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 17:46:56.092895 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.092862 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:46:56.100290 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.099925 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 23 17:46:56.100290 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.100207 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 23 17:46:56.100845 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.100822 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 23 17:46:56.101052 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.100862 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 23 17:46:56.101052 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.100965 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 23 17:46:56.101631 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.101458 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 23 17:46:56.101742 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.101653 2581 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 23 17:46:56.101831 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.101794 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 23 17:46:56.101905 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.101854 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 23 17:46:56.102047 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.102028 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 23 17:46:56.102699 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.102629 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-n5blf\"" Apr 23 17:46:56.102789 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.102709 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-fs23asiu67t61\"" Apr 23 17:46:56.104509 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.104486 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 23 17:46:56.106382 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.106360 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 23 17:46:56.109023 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.109004 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 23 17:46:56.122331 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.122304 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 17:46:56.179333 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.179247 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/397ba0f6-e575-4e5b-926f-be25f031cbb8-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:46:56.179511 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.179354 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/397ba0f6-e575-4e5b-926f-be25f031cbb8-config-out\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:46:56.179511 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.179398 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-web-config\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:46:56.179511 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.179424 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:46:56.179511 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.179466 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/397ba0f6-e575-4e5b-926f-be25f031cbb8-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:46:56.179721 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.179564 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:46:56.179721 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.179599 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/397ba0f6-e575-4e5b-926f-be25f031cbb8-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:46:56.179721 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.179631 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:46:56.179721 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.179658 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-config\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:46:56.179721 ip-10-0-143-218 kubenswrapper[2581]: I0423 
17:46:56.179682 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/397ba0f6-e575-4e5b-926f-be25f031cbb8-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.179721 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.179712 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8wtl\" (UniqueName: \"kubernetes.io/projected/397ba0f6-e575-4e5b-926f-be25f031cbb8-kube-api-access-k8wtl\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.180231 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.179738 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.180231 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.179778 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/397ba0f6-e575-4e5b-926f-be25f031cbb8-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.180231 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.179807 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.180231 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.179836 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/397ba0f6-e575-4e5b-926f-be25f031cbb8-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.180231 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.179869 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.180231 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.179926 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.180231 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.180128 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/397ba0f6-e575-4e5b-926f-be25f031cbb8-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.281543 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.281505 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.281748 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.281603 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/397ba0f6-e575-4e5b-926f-be25f031cbb8-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.281748 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.281642 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.281748 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.281670 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-config\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.281748 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.281696 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/397ba0f6-e575-4e5b-926f-be25f031cbb8-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.281748 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.281731 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k8wtl\" (UniqueName: \"kubernetes.io/projected/397ba0f6-e575-4e5b-926f-be25f031cbb8-kube-api-access-k8wtl\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.282052 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.281758 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.282052 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.281801 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/397ba0f6-e575-4e5b-926f-be25f031cbb8-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.282052 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.281833 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.282052 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.281867 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/397ba0f6-e575-4e5b-926f-be25f031cbb8-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.282052 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.281922 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.282052 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.281964 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.282052 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.282009 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/397ba0f6-e575-4e5b-926f-be25f031cbb8-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.282338 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.282077 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/397ba0f6-e575-4e5b-926f-be25f031cbb8-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.282338 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.282117 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/397ba0f6-e575-4e5b-926f-be25f031cbb8-config-out\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.282338 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.282139 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-web-config\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.282338 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.282169 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.282338 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.282194 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/397ba0f6-e575-4e5b-926f-be25f031cbb8-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.282576 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.282448 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/397ba0f6-e575-4e5b-926f-be25f031cbb8-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.282576 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.282528 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/397ba0f6-e575-4e5b-926f-be25f031cbb8-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.282902 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.282702 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/397ba0f6-e575-4e5b-926f-be25f031cbb8-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.287341 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.284592 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/397ba0f6-e575-4e5b-926f-be25f031cbb8-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.287341 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.284645 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.287341 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.285815 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/397ba0f6-e575-4e5b-926f-be25f031cbb8-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.287341 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.286258 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-config\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.287341 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.287298 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/397ba0f6-e575-4e5b-926f-be25f031cbb8-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.288560 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.288125 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.288560 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.288520 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/397ba0f6-e575-4e5b-926f-be25f031cbb8-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.288844 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.288820 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-web-config\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.289412 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.289280 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.289412 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.289360 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.289819 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.289795 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.289916 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.289890 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/397ba0f6-e575-4e5b-926f-be25f031cbb8-config-out\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.290668 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.290623 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.292131 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.291635 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.294754 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.294713 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8wtl\" (UniqueName: \"kubernetes.io/projected/397ba0f6-e575-4e5b-926f-be25f031cbb8-kube-api-access-k8wtl\") pod \"prometheus-k8s-0\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:56.405815 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:56.405763 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:46:57.383135 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:57.383107 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 17:46:57.384266 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:46:57.384240 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod397ba0f6_e575_4e5b_926f_be25f031cbb8.slice/crio-a62f2dcf3f2bdcc3cfde494352fc69013b340ec4a9fb43108957045fe4d146ba WatchSource:0}: Error finding container a62f2dcf3f2bdcc3cfde494352fc69013b340ec4a9fb43108957045fe4d146ba: Status 404 returned error can't find the container with id a62f2dcf3f2bdcc3cfde494352fc69013b340ec4a9fb43108957045fe4d146ba
Apr 23 17:46:57.721312 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:57.721272 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7df7f7d9c5-97595" event={"ID":"35919eee-76f1-461d-b532-97c823ac5955","Type":"ContainerStarted","Data":"c8f60a5ab07accf1a272c64091d6bfcf8b5dfa52b5ad5f4e284b3c3dade7d489"}
Apr 23 17:46:57.722810 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:57.722785 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-j7zhr" event={"ID":"09b5f2cb-9870-4567-9f10-2c51e6e0ae4a","Type":"ContainerStarted","Data":"05a5e6136907008178d726dd02eb9b6a65b7e85d8a8c348654c83414eb902c01"}
Apr 23 17:46:57.723218 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:57.723175 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-j7zhr"
Apr 23 17:46:57.724150 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:57.724126 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"397ba0f6-e575-4e5b-926f-be25f031cbb8","Type":"ContainerStarted","Data":"a62f2dcf3f2bdcc3cfde494352fc69013b340ec4a9fb43108957045fe4d146ba"}
Apr 23 17:46:57.729102 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:57.729080 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-j7zhr"
Apr 23 17:46:57.745407 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:57.745347 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-j7zhr" podStartSLOduration=1.948955878 podStartE2EDuration="3.745328339s" podCreationTimestamp="2026-04-23 17:46:54 +0000 UTC" firstStartedPulling="2026-04-23 17:46:55.4295424 +0000 UTC m=+334.178984180" lastFinishedPulling="2026-04-23 17:46:57.225914854 +0000 UTC m=+335.975356641" observedRunningTime="2026-04-23 17:46:57.742830629 +0000 UTC m=+336.492272431" watchObservedRunningTime="2026-04-23 17:46:57.745328339 +0000 UTC m=+336.494770140"
Apr 23 17:46:58.728217 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:58.728172 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7df7f7d9c5-97595" event={"ID":"35919eee-76f1-461d-b532-97c823ac5955","Type":"ContainerStarted","Data":"68408a78ac332f61a55d9e4b817b3fda965a2e160f65da9987cf88be86813c84"}
Apr 23 17:46:58.728217 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:58.728219 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7df7f7d9c5-97595" event={"ID":"35919eee-76f1-461d-b532-97c823ac5955","Type":"ContainerStarted","Data":"a3f4982cc71460ae36ed22900e12e1dea268005eafc61905e199122bbe5db90d"}
Apr 23 17:46:58.729476 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:58.729453 2581 generic.go:358] "Generic (PLEG): container finished" podID="397ba0f6-e575-4e5b-926f-be25f031cbb8" containerID="8f70633f933223abfc827532770a572a9cf8e23f48287e90744e557d3d7e7e53" exitCode=0
Apr 23 17:46:58.729576 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:58.729532 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"397ba0f6-e575-4e5b-926f-be25f031cbb8","Type":"ContainerDied","Data":"8f70633f933223abfc827532770a572a9cf8e23f48287e90744e557d3d7e7e53"}
Apr 23 17:46:58.771099 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:58.771037 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-7df7f7d9c5-97595" podStartSLOduration=1.15725543 podStartE2EDuration="3.771016309s" podCreationTimestamp="2026-04-23 17:46:55 +0000 UTC" firstStartedPulling="2026-04-23 17:46:55.646265271 +0000 UTC m=+334.395707054" lastFinishedPulling="2026-04-23 17:46:58.26002614 +0000 UTC m=+337.009467933" observedRunningTime="2026-04-23 17:46:58.768420075 +0000 UTC m=+337.517861888" watchObservedRunningTime="2026-04-23 17:46:58.771016309 +0000 UTC m=+337.520458110"
Apr 23 17:46:59.658490 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:59.658452 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-59db946bf4-rbpjd"]
Apr 23 17:46:59.661090 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:59.661052 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-59db946bf4-rbpjd"
Apr 23 17:46:59.689031 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:59.688993 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59db946bf4-rbpjd"]
Apr 23 17:46:59.816466 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:59.816421 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8dafd0a5-3293-442f-b120-e2bffad48143-console-config\") pod \"console-59db946bf4-rbpjd\" (UID: \"8dafd0a5-3293-442f-b120-e2bffad48143\") " pod="openshift-console/console-59db946bf4-rbpjd"
Apr 23 17:46:59.816909 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:59.816473 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8dafd0a5-3293-442f-b120-e2bffad48143-oauth-serving-cert\") pod \"console-59db946bf4-rbpjd\" (UID: \"8dafd0a5-3293-442f-b120-e2bffad48143\") " pod="openshift-console/console-59db946bf4-rbpjd"
Apr 23 17:46:59.816909 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:59.816521 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8dafd0a5-3293-442f-b120-e2bffad48143-console-serving-cert\") pod \"console-59db946bf4-rbpjd\" (UID: \"8dafd0a5-3293-442f-b120-e2bffad48143\") " pod="openshift-console/console-59db946bf4-rbpjd"
Apr 23 17:46:59.816909 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:59.816566 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8dafd0a5-3293-442f-b120-e2bffad48143-service-ca\") pod \"console-59db946bf4-rbpjd\" (UID: \"8dafd0a5-3293-442f-b120-e2bffad48143\") " pod="openshift-console/console-59db946bf4-rbpjd"
Apr 23 17:46:59.816909 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:59.816593 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8dafd0a5-3293-442f-b120-e2bffad48143-console-oauth-config\") pod \"console-59db946bf4-rbpjd\" (UID: \"8dafd0a5-3293-442f-b120-e2bffad48143\") " pod="openshift-console/console-59db946bf4-rbpjd"
Apr 23 17:46:59.816909 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:59.816623 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8dafd0a5-3293-442f-b120-e2bffad48143-trusted-ca-bundle\") pod \"console-59db946bf4-rbpjd\" (UID: \"8dafd0a5-3293-442f-b120-e2bffad48143\") " pod="openshift-console/console-59db946bf4-rbpjd"
Apr 23 17:46:59.816909 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:59.816672 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxs2l\" (UniqueName: \"kubernetes.io/projected/8dafd0a5-3293-442f-b120-e2bffad48143-kube-api-access-jxs2l\") pod \"console-59db946bf4-rbpjd\" (UID: \"8dafd0a5-3293-442f-b120-e2bffad48143\") " pod="openshift-console/console-59db946bf4-rbpjd"
Apr 23 17:46:59.917320 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:59.917229 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8dafd0a5-3293-442f-b120-e2bffad48143-console-config\") pod \"console-59db946bf4-rbpjd\" (UID: \"8dafd0a5-3293-442f-b120-e2bffad48143\") " pod="openshift-console/console-59db946bf4-rbpjd"
Apr 23 17:46:59.917320 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:59.917287 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8dafd0a5-3293-442f-b120-e2bffad48143-oauth-serving-cert\") pod \"console-59db946bf4-rbpjd\" (UID: \"8dafd0a5-3293-442f-b120-e2bffad48143\") " pod="openshift-console/console-59db946bf4-rbpjd"
Apr 23 17:46:59.917320 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:59.917318 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8dafd0a5-3293-442f-b120-e2bffad48143-console-serving-cert\") pod \"console-59db946bf4-rbpjd\" (UID: \"8dafd0a5-3293-442f-b120-e2bffad48143\") " pod="openshift-console/console-59db946bf4-rbpjd"
Apr 23 17:46:59.917582 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:59.917346 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8dafd0a5-3293-442f-b120-e2bffad48143-service-ca\") pod \"console-59db946bf4-rbpjd\" (UID: \"8dafd0a5-3293-442f-b120-e2bffad48143\") " pod="openshift-console/console-59db946bf4-rbpjd"
Apr 23 17:46:59.917582 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:59.917372 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8dafd0a5-3293-442f-b120-e2bffad48143-console-oauth-config\") pod \"console-59db946bf4-rbpjd\" (UID: \"8dafd0a5-3293-442f-b120-e2bffad48143\") " pod="openshift-console/console-59db946bf4-rbpjd"
Apr 23 17:46:59.917582 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:59.917400 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8dafd0a5-3293-442f-b120-e2bffad48143-trusted-ca-bundle\") pod \"console-59db946bf4-rbpjd\" (UID: \"8dafd0a5-3293-442f-b120-e2bffad48143\") " pod="openshift-console/console-59db946bf4-rbpjd"
Apr 23 17:46:59.917582 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:59.917438 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jxs2l\" (UniqueName: \"kubernetes.io/projected/8dafd0a5-3293-442f-b120-e2bffad48143-kube-api-access-jxs2l\") pod \"console-59db946bf4-rbpjd\" (UID: \"8dafd0a5-3293-442f-b120-e2bffad48143\") " pod="openshift-console/console-59db946bf4-rbpjd"
Apr 23 17:46:59.918231 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:59.918176 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8dafd0a5-3293-442f-b120-e2bffad48143-console-config\") pod \"console-59db946bf4-rbpjd\" (UID: \"8dafd0a5-3293-442f-b120-e2bffad48143\") " pod="openshift-console/console-59db946bf4-rbpjd"
Apr 23 17:46:59.918231 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:59.918220 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8dafd0a5-3293-442f-b120-e2bffad48143-service-ca\") pod \"console-59db946bf4-rbpjd\" (UID: \"8dafd0a5-3293-442f-b120-e2bffad48143\") " pod="openshift-console/console-59db946bf4-rbpjd"
Apr 23 17:46:59.918394 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:59.918231 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8dafd0a5-3293-442f-b120-e2bffad48143-oauth-serving-cert\") pod \"console-59db946bf4-rbpjd\" (UID: \"8dafd0a5-3293-442f-b120-e2bffad48143\") " pod="openshift-console/console-59db946bf4-rbpjd"
Apr 23 17:46:59.918485 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:59.918463 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8dafd0a5-3293-442f-b120-e2bffad48143-trusted-ca-bundle\") pod \"console-59db946bf4-rbpjd\" (UID: \"8dafd0a5-3293-442f-b120-e2bffad48143\") " pod="openshift-console/console-59db946bf4-rbpjd"
Apr 23 17:46:59.920501 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:59.920471 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8dafd0a5-3293-442f-b120-e2bffad48143-console-oauth-config\") pod \"console-59db946bf4-rbpjd\" (UID: \"8dafd0a5-3293-442f-b120-e2bffad48143\") " pod="openshift-console/console-59db946bf4-rbpjd"
Apr 23 17:46:59.920649 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:59.920625 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8dafd0a5-3293-442f-b120-e2bffad48143-console-serving-cert\") pod \"console-59db946bf4-rbpjd\" (UID: \"8dafd0a5-3293-442f-b120-e2bffad48143\") " pod="openshift-console/console-59db946bf4-rbpjd"
Apr 23 17:46:59.933384 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:59.933356 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxs2l\" (UniqueName: \"kubernetes.io/projected/8dafd0a5-3293-442f-b120-e2bffad48143-kube-api-access-jxs2l\") pod \"console-59db946bf4-rbpjd\" (UID: \"8dafd0a5-3293-442f-b120-e2bffad48143\") " pod="openshift-console/console-59db946bf4-rbpjd"
Apr 23 17:46:59.972631 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:46:59.972596 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-59db946bf4-rbpjd"
Apr 23 17:47:00.141934 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:00.141900 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59db946bf4-rbpjd"]
Apr 23 17:47:00.145179 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:47:00.145147 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8dafd0a5_3293_442f_b120_e2bffad48143.slice/crio-486f149ce8dbc8068655185a29f7df0585c499f7eb6d22c8798bb2458ced99b0 WatchSource:0}: Error finding container 486f149ce8dbc8068655185a29f7df0585c499f7eb6d22c8798bb2458ced99b0: Status 404 returned error can't find the container with id 486f149ce8dbc8068655185a29f7df0585c499f7eb6d22c8798bb2458ced99b0
Apr 23 17:47:00.739802 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:00.739757 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59db946bf4-rbpjd" event={"ID":"8dafd0a5-3293-442f-b120-e2bffad48143","Type":"ContainerStarted","Data":"2ec3020916fed38da36390ee47a9cbcfe458d4e27bf8dc48f7300a2d0f416a35"}
Apr 23 17:47:00.739802 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:00.739806 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59db946bf4-rbpjd" event={"ID":"8dafd0a5-3293-442f-b120-e2bffad48143","Type":"ContainerStarted","Data":"486f149ce8dbc8068655185a29f7df0585c499f7eb6d22c8798bb2458ced99b0"}
Apr 23 17:47:00.765718 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:00.765650 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-59db946bf4-rbpjd" podStartSLOduration=1.765630807 podStartE2EDuration="1.765630807s" podCreationTimestamp="2026-04-23 17:46:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:47:00.763501989 +0000 UTC m=+339.512943795" watchObservedRunningTime="2026-04-23 17:47:00.765630807 +0000 UTC m=+339.515072608"
Apr 23 17:47:01.744804 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:01.744763 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"397ba0f6-e575-4e5b-926f-be25f031cbb8","Type":"ContainerStarted","Data":"23b5b7da6c94db7b5dc3d5a157e1d9c0701c30ae8c351950164a0dfb2a544445"}
Apr 23 17:47:01.744804 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:01.744803 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"397ba0f6-e575-4e5b-926f-be25f031cbb8","Type":"ContainerStarted","Data":"90ac80226e39cd08592a77a86da4e47a47f2101072d765678ed20f2673beee4c"}
Apr 23 17:47:03.753414 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:03.753381 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"397ba0f6-e575-4e5b-926f-be25f031cbb8","Type":"ContainerStarted","Data":"118cc4dea66d0ac4c9a44179663156873ab6f00b4f0f527d537bef5601ce6a96"}
Apr 23 17:47:03.753414 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:03.753416 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"397ba0f6-e575-4e5b-926f-be25f031cbb8","Type":"ContainerStarted","Data":"813e4316af6c8dbf936513cff2a17e3d97c82cba20571618e91389a073dd1fc9"}
Apr 23 17:47:03.753920 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:03.753425 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"397ba0f6-e575-4e5b-926f-be25f031cbb8","Type":"ContainerStarted","Data":"c370500088f873126f196762028bba6d50825fe9cc267c5a07b84a13cf786860"}
Apr 23 17:47:03.753920 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:03.753433 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"397ba0f6-e575-4e5b-926f-be25f031cbb8","Type":"ContainerStarted","Data":"b50ecc7dc25deb929bdef9c72f3962ca8f7edf73542eac4241cb8a05ea99e8ed"}
Apr 23 17:47:03.801148 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:03.801100 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.145990837 podStartE2EDuration="7.80108777s" podCreationTimestamp="2026-04-23 17:46:56 +0000 UTC" firstStartedPulling="2026-04-23 17:46:57.386226245 +0000 UTC m=+336.135668023" lastFinishedPulling="2026-04-23 17:47:03.041323174 +0000 UTC m=+341.790764956" observedRunningTime="2026-04-23 17:47:03.800254267 +0000 UTC m=+342.549696069" watchObservedRunningTime="2026-04-23 17:47:03.80108777 +0000 UTC m=+342.550529571"
Apr 23 17:47:05.559749 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:05.559717 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-747847cfd6-g5xmr"]
Apr 23 17:47:06.406264 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:06.406226 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:47:08.773747 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:08.773659 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" podUID="a8560822-ef3b-47a7-b587-297593afaeeb" containerName="registry" containerID="cri-o://eb68734b136ab069a5fb71c9a98855afa47a9d5e2b3d9f0b5fa712b3fa32bdc2" gracePeriod=30
Apr 23 17:47:09.771544 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:09.771512 2581 generic.go:358] "Generic (PLEG): container finished" podID="a8560822-ef3b-47a7-b587-297593afaeeb" containerID="eb68734b136ab069a5fb71c9a98855afa47a9d5e2b3d9f0b5fa712b3fa32bdc2" exitCode=0
Apr 23 17:47:09.771725 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:09.771555 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod"
pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" event={"ID":"a8560822-ef3b-47a7-b587-297593afaeeb","Type":"ContainerDied","Data":"eb68734b136ab069a5fb71c9a98855afa47a9d5e2b3d9f0b5fa712b3fa32bdc2"} Apr 23 17:47:09.771725 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:09.771578 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" event={"ID":"a8560822-ef3b-47a7-b587-297593afaeeb","Type":"ContainerStarted","Data":"7a1beac2c7c111fb62f3f029e16d99aa0c97d3fcd9c3b1870c645a229a416639"} Apr 23 17:47:09.771725 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:09.771592 2581 scope.go:117] "RemoveContainer" containerID="8a0502c66652199189f884ac2bd19b0d46ea53ef223d4da7ca4814be59529d04" Apr 23 17:47:09.771725 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:09.771680 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" Apr 23 17:47:09.973387 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:09.973350 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-59db946bf4-rbpjd" Apr 23 17:47:09.973387 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:09.973396 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-59db946bf4-rbpjd" Apr 23 17:47:09.978456 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:09.978432 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-59db946bf4-rbpjd" Apr 23 17:47:10.780901 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:10.780872 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-59db946bf4-rbpjd" Apr 23 17:47:10.846910 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:10.846883 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-8cdd5bc7-hl5n8"] Apr 
23 17:47:29.776181 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:29.776148 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" Apr 23 17:47:34.790946 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:34.790903 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" podUID="a8560822-ef3b-47a7-b587-297593afaeeb" containerName="registry" containerID="cri-o://7a1beac2c7c111fb62f3f029e16d99aa0c97d3fcd9c3b1870c645a229a416639" gracePeriod=30 Apr 23 17:47:35.045190 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:35.045135 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" Apr 23 17:47:35.130040 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:35.130009 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a8560822-ef3b-47a7-b587-297593afaeeb-registry-tls\") pod \"a8560822-ef3b-47a7-b587-297593afaeeb\" (UID: \"a8560822-ef3b-47a7-b587-297593afaeeb\") " Apr 23 17:47:35.130184 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:35.130049 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5f8z\" (UniqueName: \"kubernetes.io/projected/a8560822-ef3b-47a7-b587-297593afaeeb-kube-api-access-l5f8z\") pod \"a8560822-ef3b-47a7-b587-297593afaeeb\" (UID: \"a8560822-ef3b-47a7-b587-297593afaeeb\") " Apr 23 17:47:35.130184 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:35.130070 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a8560822-ef3b-47a7-b587-297593afaeeb-installation-pull-secrets\") pod \"a8560822-ef3b-47a7-b587-297593afaeeb\" (UID: \"a8560822-ef3b-47a7-b587-297593afaeeb\") " Apr 23 
17:47:35.130184 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:35.130131 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a8560822-ef3b-47a7-b587-297593afaeeb-image-registry-private-configuration\") pod \"a8560822-ef3b-47a7-b587-297593afaeeb\" (UID: \"a8560822-ef3b-47a7-b587-297593afaeeb\") " Apr 23 17:47:35.130184 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:35.130157 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a8560822-ef3b-47a7-b587-297593afaeeb-trusted-ca\") pod \"a8560822-ef3b-47a7-b587-297593afaeeb\" (UID: \"a8560822-ef3b-47a7-b587-297593afaeeb\") " Apr 23 17:47:35.130360 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:35.130201 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a8560822-ef3b-47a7-b587-297593afaeeb-registry-certificates\") pod \"a8560822-ef3b-47a7-b587-297593afaeeb\" (UID: \"a8560822-ef3b-47a7-b587-297593afaeeb\") " Apr 23 17:47:35.130412 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:35.130361 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a8560822-ef3b-47a7-b587-297593afaeeb-ca-trust-extracted\") pod \"a8560822-ef3b-47a7-b587-297593afaeeb\" (UID: \"a8560822-ef3b-47a7-b587-297593afaeeb\") " Apr 23 17:47:35.130463 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:35.130440 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a8560822-ef3b-47a7-b587-297593afaeeb-bound-sa-token\") pod \"a8560822-ef3b-47a7-b587-297593afaeeb\" (UID: \"a8560822-ef3b-47a7-b587-297593afaeeb\") " Apr 23 17:47:35.130678 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:35.130650 2581 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8560822-ef3b-47a7-b587-297593afaeeb-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a8560822-ef3b-47a7-b587-297593afaeeb" (UID: "a8560822-ef3b-47a7-b587-297593afaeeb"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:47:35.130804 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:35.130752 2581 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a8560822-ef3b-47a7-b587-297593afaeeb-trusted-ca\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 17:47:35.131289 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:35.131247 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8560822-ef3b-47a7-b587-297593afaeeb-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "a8560822-ef3b-47a7-b587-297593afaeeb" (UID: "a8560822-ef3b-47a7-b587-297593afaeeb"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:47:35.132817 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:35.132774 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8560822-ef3b-47a7-b587-297593afaeeb-kube-api-access-l5f8z" (OuterVolumeSpecName: "kube-api-access-l5f8z") pod "a8560822-ef3b-47a7-b587-297593afaeeb" (UID: "a8560822-ef3b-47a7-b587-297593afaeeb"). InnerVolumeSpecName "kube-api-access-l5f8z". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:47:35.132817 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:35.132795 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8560822-ef3b-47a7-b587-297593afaeeb-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "a8560822-ef3b-47a7-b587-297593afaeeb" (UID: "a8560822-ef3b-47a7-b587-297593afaeeb"). 
InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:47:35.133071 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:35.132909 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8560822-ef3b-47a7-b587-297593afaeeb-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "a8560822-ef3b-47a7-b587-297593afaeeb" (UID: "a8560822-ef3b-47a7-b587-297593afaeeb"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:47:35.133264 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:35.133242 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8560822-ef3b-47a7-b587-297593afaeeb-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a8560822-ef3b-47a7-b587-297593afaeeb" (UID: "a8560822-ef3b-47a7-b587-297593afaeeb"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:47:35.133331 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:35.133284 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8560822-ef3b-47a7-b587-297593afaeeb-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "a8560822-ef3b-47a7-b587-297593afaeeb" (UID: "a8560822-ef3b-47a7-b587-297593afaeeb"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:47:35.139717 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:35.139692 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8560822-ef3b-47a7-b587-297593afaeeb-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "a8560822-ef3b-47a7-b587-297593afaeeb" (UID: "a8560822-ef3b-47a7-b587-297593afaeeb"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:47:35.232144 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:35.232116 2581 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a8560822-ef3b-47a7-b587-297593afaeeb-registry-certificates\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 17:47:35.232144 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:35.232143 2581 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a8560822-ef3b-47a7-b587-297593afaeeb-ca-trust-extracted\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 17:47:35.232298 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:35.232153 2581 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a8560822-ef3b-47a7-b587-297593afaeeb-bound-sa-token\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 17:47:35.232298 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:35.232162 2581 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a8560822-ef3b-47a7-b587-297593afaeeb-registry-tls\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 17:47:35.232298 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:35.232171 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l5f8z\" (UniqueName: \"kubernetes.io/projected/a8560822-ef3b-47a7-b587-297593afaeeb-kube-api-access-l5f8z\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 17:47:35.232298 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:35.232179 2581 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a8560822-ef3b-47a7-b587-297593afaeeb-installation-pull-secrets\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 
17:47:35.232298 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:35.232190 2581 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a8560822-ef3b-47a7-b587-297593afaeeb-image-registry-private-configuration\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 17:47:35.847587 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:35.847557 2581 generic.go:358] "Generic (PLEG): container finished" podID="a8560822-ef3b-47a7-b587-297593afaeeb" containerID="7a1beac2c7c111fb62f3f029e16d99aa0c97d3fcd9c3b1870c645a229a416639" exitCode=0 Apr 23 17:47:35.847913 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:35.847614 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" Apr 23 17:47:35.847913 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:35.847644 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" event={"ID":"a8560822-ef3b-47a7-b587-297593afaeeb","Type":"ContainerDied","Data":"7a1beac2c7c111fb62f3f029e16d99aa0c97d3fcd9c3b1870c645a229a416639"} Apr 23 17:47:35.847913 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:35.847682 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-747847cfd6-g5xmr" event={"ID":"a8560822-ef3b-47a7-b587-297593afaeeb","Type":"ContainerDied","Data":"49538decf0c47c508dc21301230c42459f488465c8cf81649a5346c250c80262"} Apr 23 17:47:35.847913 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:35.847701 2581 scope.go:117] "RemoveContainer" containerID="7a1beac2c7c111fb62f3f029e16d99aa0c97d3fcd9c3b1870c645a229a416639" Apr 23 17:47:35.855697 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:35.855679 2581 scope.go:117] "RemoveContainer" containerID="eb68734b136ab069a5fb71c9a98855afa47a9d5e2b3d9f0b5fa712b3fa32bdc2" Apr 23 17:47:35.862922 ip-10-0-143-218 
kubenswrapper[2581]: I0423 17:47:35.862904 2581 scope.go:117] "RemoveContainer" containerID="7a1beac2c7c111fb62f3f029e16d99aa0c97d3fcd9c3b1870c645a229a416639" Apr 23 17:47:35.863185 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:47:35.863165 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a1beac2c7c111fb62f3f029e16d99aa0c97d3fcd9c3b1870c645a229a416639\": container with ID starting with 7a1beac2c7c111fb62f3f029e16d99aa0c97d3fcd9c3b1870c645a229a416639 not found: ID does not exist" containerID="7a1beac2c7c111fb62f3f029e16d99aa0c97d3fcd9c3b1870c645a229a416639" Apr 23 17:47:35.863250 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:35.863194 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a1beac2c7c111fb62f3f029e16d99aa0c97d3fcd9c3b1870c645a229a416639"} err="failed to get container status \"7a1beac2c7c111fb62f3f029e16d99aa0c97d3fcd9c3b1870c645a229a416639\": rpc error: code = NotFound desc = could not find container \"7a1beac2c7c111fb62f3f029e16d99aa0c97d3fcd9c3b1870c645a229a416639\": container with ID starting with 7a1beac2c7c111fb62f3f029e16d99aa0c97d3fcd9c3b1870c645a229a416639 not found: ID does not exist" Apr 23 17:47:35.863250 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:35.863213 2581 scope.go:117] "RemoveContainer" containerID="eb68734b136ab069a5fb71c9a98855afa47a9d5e2b3d9f0b5fa712b3fa32bdc2" Apr 23 17:47:35.863446 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:47:35.863430 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb68734b136ab069a5fb71c9a98855afa47a9d5e2b3d9f0b5fa712b3fa32bdc2\": container with ID starting with eb68734b136ab069a5fb71c9a98855afa47a9d5e2b3d9f0b5fa712b3fa32bdc2 not found: ID does not exist" containerID="eb68734b136ab069a5fb71c9a98855afa47a9d5e2b3d9f0b5fa712b3fa32bdc2" Apr 23 17:47:35.863497 ip-10-0-143-218 kubenswrapper[2581]: 
I0423 17:47:35.863452 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb68734b136ab069a5fb71c9a98855afa47a9d5e2b3d9f0b5fa712b3fa32bdc2"} err="failed to get container status \"eb68734b136ab069a5fb71c9a98855afa47a9d5e2b3d9f0b5fa712b3fa32bdc2\": rpc error: code = NotFound desc = could not find container \"eb68734b136ab069a5fb71c9a98855afa47a9d5e2b3d9f0b5fa712b3fa32bdc2\": container with ID starting with eb68734b136ab069a5fb71c9a98855afa47a9d5e2b3d9f0b5fa712b3fa32bdc2 not found: ID does not exist" Apr 23 17:47:35.869393 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:35.869371 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-747847cfd6-g5xmr"] Apr 23 17:47:35.869782 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:35.869736 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-8cdd5bc7-hl5n8" podUID="d6ac3c44-9ac1-4a86-89cc-81579d2d7746" containerName="console" containerID="cri-o://803b0b9c8e3a1852412ed590424d322d127280927345254bee180198374d7498" gracePeriod=15 Apr 23 17:47:35.873154 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:35.873130 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-747847cfd6-g5xmr"] Apr 23 17:47:36.125236 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:36.125211 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8cdd5bc7-hl5n8_d6ac3c44-9ac1-4a86-89cc-81579d2d7746/console/0.log" Apr 23 17:47:36.125363 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:36.125280 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8cdd5bc7-hl5n8" Apr 23 17:47:36.241392 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:36.241351 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wg4nj\" (UniqueName: \"kubernetes.io/projected/d6ac3c44-9ac1-4a86-89cc-81579d2d7746-kube-api-access-wg4nj\") pod \"d6ac3c44-9ac1-4a86-89cc-81579d2d7746\" (UID: \"d6ac3c44-9ac1-4a86-89cc-81579d2d7746\") " Apr 23 17:47:36.241392 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:36.241400 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6ac3c44-9ac1-4a86-89cc-81579d2d7746-trusted-ca-bundle\") pod \"d6ac3c44-9ac1-4a86-89cc-81579d2d7746\" (UID: \"d6ac3c44-9ac1-4a86-89cc-81579d2d7746\") " Apr 23 17:47:36.241642 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:36.241425 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d6ac3c44-9ac1-4a86-89cc-81579d2d7746-console-serving-cert\") pod \"d6ac3c44-9ac1-4a86-89cc-81579d2d7746\" (UID: \"d6ac3c44-9ac1-4a86-89cc-81579d2d7746\") " Apr 23 17:47:36.241642 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:36.241446 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d6ac3c44-9ac1-4a86-89cc-81579d2d7746-console-oauth-config\") pod \"d6ac3c44-9ac1-4a86-89cc-81579d2d7746\" (UID: \"d6ac3c44-9ac1-4a86-89cc-81579d2d7746\") " Apr 23 17:47:36.241642 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:36.241464 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d6ac3c44-9ac1-4a86-89cc-81579d2d7746-console-config\") pod \"d6ac3c44-9ac1-4a86-89cc-81579d2d7746\" (UID: \"d6ac3c44-9ac1-4a86-89cc-81579d2d7746\") " Apr 23 17:47:36.241642 
ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:36.241502 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d6ac3c44-9ac1-4a86-89cc-81579d2d7746-oauth-serving-cert\") pod \"d6ac3c44-9ac1-4a86-89cc-81579d2d7746\" (UID: \"d6ac3c44-9ac1-4a86-89cc-81579d2d7746\") " Apr 23 17:47:36.241642 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:36.241598 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d6ac3c44-9ac1-4a86-89cc-81579d2d7746-service-ca\") pod \"d6ac3c44-9ac1-4a86-89cc-81579d2d7746\" (UID: \"d6ac3c44-9ac1-4a86-89cc-81579d2d7746\") " Apr 23 17:47:36.242285 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:36.242253 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6ac3c44-9ac1-4a86-89cc-81579d2d7746-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d6ac3c44-9ac1-4a86-89cc-81579d2d7746" (UID: "d6ac3c44-9ac1-4a86-89cc-81579d2d7746"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:47:36.242380 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:36.242299 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6ac3c44-9ac1-4a86-89cc-81579d2d7746-service-ca" (OuterVolumeSpecName: "service-ca") pod "d6ac3c44-9ac1-4a86-89cc-81579d2d7746" (UID: "d6ac3c44-9ac1-4a86-89cc-81579d2d7746"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:47:36.242380 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:36.242348 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6ac3c44-9ac1-4a86-89cc-81579d2d7746-console-config" (OuterVolumeSpecName: "console-config") pod "d6ac3c44-9ac1-4a86-89cc-81579d2d7746" (UID: "d6ac3c44-9ac1-4a86-89cc-81579d2d7746"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:47:36.242457 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:36.242348 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6ac3c44-9ac1-4a86-89cc-81579d2d7746-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d6ac3c44-9ac1-4a86-89cc-81579d2d7746" (UID: "d6ac3c44-9ac1-4a86-89cc-81579d2d7746"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:47:36.243843 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:36.243817 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6ac3c44-9ac1-4a86-89cc-81579d2d7746-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d6ac3c44-9ac1-4a86-89cc-81579d2d7746" (UID: "d6ac3c44-9ac1-4a86-89cc-81579d2d7746"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:47:36.243950 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:36.243847 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6ac3c44-9ac1-4a86-89cc-81579d2d7746-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d6ac3c44-9ac1-4a86-89cc-81579d2d7746" (UID: "d6ac3c44-9ac1-4a86-89cc-81579d2d7746"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:47:36.243950 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:36.243922 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6ac3c44-9ac1-4a86-89cc-81579d2d7746-kube-api-access-wg4nj" (OuterVolumeSpecName: "kube-api-access-wg4nj") pod "d6ac3c44-9ac1-4a86-89cc-81579d2d7746" (UID: "d6ac3c44-9ac1-4a86-89cc-81579d2d7746"). InnerVolumeSpecName "kube-api-access-wg4nj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:47:36.342481 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:36.342443 2581 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d6ac3c44-9ac1-4a86-89cc-81579d2d7746-console-oauth-config\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 17:47:36.342481 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:36.342476 2581 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d6ac3c44-9ac1-4a86-89cc-81579d2d7746-console-config\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 17:47:36.342670 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:36.342491 2581 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d6ac3c44-9ac1-4a86-89cc-81579d2d7746-oauth-serving-cert\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 17:47:36.342670 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:36.342504 2581 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d6ac3c44-9ac1-4a86-89cc-81579d2d7746-service-ca\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 17:47:36.342670 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:36.342516 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wg4nj\" (UniqueName: 
\"kubernetes.io/projected/d6ac3c44-9ac1-4a86-89cc-81579d2d7746-kube-api-access-wg4nj\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 17:47:36.342670 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:36.342527 2581 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6ac3c44-9ac1-4a86-89cc-81579d2d7746-trusted-ca-bundle\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 17:47:36.342670 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:36.342539 2581 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d6ac3c44-9ac1-4a86-89cc-81579d2d7746-console-serving-cert\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 17:47:36.853978 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:36.853210 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8cdd5bc7-hl5n8_d6ac3c44-9ac1-4a86-89cc-81579d2d7746/console/0.log" Apr 23 17:47:36.853978 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:36.853253 2581 generic.go:358] "Generic (PLEG): container finished" podID="d6ac3c44-9ac1-4a86-89cc-81579d2d7746" containerID="803b0b9c8e3a1852412ed590424d322d127280927345254bee180198374d7498" exitCode=2 Apr 23 17:47:36.853978 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:36.853318 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8cdd5bc7-hl5n8" event={"ID":"d6ac3c44-9ac1-4a86-89cc-81579d2d7746","Type":"ContainerDied","Data":"803b0b9c8e3a1852412ed590424d322d127280927345254bee180198374d7498"} Apr 23 17:47:36.853978 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:36.853344 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8cdd5bc7-hl5n8" event={"ID":"d6ac3c44-9ac1-4a86-89cc-81579d2d7746","Type":"ContainerDied","Data":"7ab61c2045eb680b1e35e799dc0fb306063d822e60358d756ba9ff9b6f2bd810"} Apr 23 17:47:36.853978 
ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:36.853363 2581 scope.go:117] "RemoveContainer" containerID="803b0b9c8e3a1852412ed590424d322d127280927345254bee180198374d7498" Apr 23 17:47:36.853978 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:36.853530 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8cdd5bc7-hl5n8" Apr 23 17:47:36.862217 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:36.862197 2581 scope.go:117] "RemoveContainer" containerID="803b0b9c8e3a1852412ed590424d322d127280927345254bee180198374d7498" Apr 23 17:47:36.862470 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:47:36.862451 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"803b0b9c8e3a1852412ed590424d322d127280927345254bee180198374d7498\": container with ID starting with 803b0b9c8e3a1852412ed590424d322d127280927345254bee180198374d7498 not found: ID does not exist" containerID="803b0b9c8e3a1852412ed590424d322d127280927345254bee180198374d7498" Apr 23 17:47:36.862515 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:36.862480 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"803b0b9c8e3a1852412ed590424d322d127280927345254bee180198374d7498"} err="failed to get container status \"803b0b9c8e3a1852412ed590424d322d127280927345254bee180198374d7498\": rpc error: code = NotFound desc = could not find container \"803b0b9c8e3a1852412ed590424d322d127280927345254bee180198374d7498\": container with ID starting with 803b0b9c8e3a1852412ed590424d322d127280927345254bee180198374d7498 not found: ID does not exist" Apr 23 17:47:36.878793 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:36.878752 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-8cdd5bc7-hl5n8"] Apr 23 17:47:36.883689 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:36.883670 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-console/console-8cdd5bc7-hl5n8"] Apr 23 17:47:37.790812 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:37.790767 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8560822-ef3b-47a7-b587-297593afaeeb" path="/var/lib/kubelet/pods/a8560822-ef3b-47a7-b587-297593afaeeb/volumes" Apr 23 17:47:37.791258 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:37.791238 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6ac3c44-9ac1-4a86-89cc-81579d2d7746" path="/var/lib/kubelet/pods/d6ac3c44-9ac1-4a86-89cc-81579d2d7746/volumes" Apr 23 17:47:56.406331 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:56.406277 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:47:56.426192 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:56.426166 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:47:56.925830 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:47:56.925804 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:14.595829 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.595792 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 17:48:14.596591 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.596292 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="397ba0f6-e575-4e5b-926f-be25f031cbb8" containerName="prometheus" containerID="cri-o://90ac80226e39cd08592a77a86da4e47a47f2101072d765678ed20f2673beee4c" gracePeriod=600 Apr 23 17:48:14.596591 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.596330 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="397ba0f6-e575-4e5b-926f-be25f031cbb8" 
containerName="kube-rbac-proxy" containerID="cri-o://813e4316af6c8dbf936513cff2a17e3d97c82cba20571618e91389a073dd1fc9" gracePeriod=600 Apr 23 17:48:14.596591 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.596345 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="397ba0f6-e575-4e5b-926f-be25f031cbb8" containerName="thanos-sidecar" containerID="cri-o://b50ecc7dc25deb929bdef9c72f3962ca8f7edf73542eac4241cb8a05ea99e8ed" gracePeriod=600 Apr 23 17:48:14.596591 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.596372 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="397ba0f6-e575-4e5b-926f-be25f031cbb8" containerName="config-reloader" containerID="cri-o://23b5b7da6c94db7b5dc3d5a157e1d9c0701c30ae8c351950164a0dfb2a544445" gracePeriod=600 Apr 23 17:48:14.596591 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.596434 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="397ba0f6-e575-4e5b-926f-be25f031cbb8" containerName="kube-rbac-proxy-web" containerID="cri-o://c370500088f873126f196762028bba6d50825fe9cc267c5a07b84a13cf786860" gracePeriod=600 Apr 23 17:48:14.596591 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.596441 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="397ba0f6-e575-4e5b-926f-be25f031cbb8" containerName="kube-rbac-proxy-thanos" containerID="cri-o://118cc4dea66d0ac4c9a44179663156873ab6f00b4f0f527d537bef5601ce6a96" gracePeriod=600 Apr 23 17:48:14.864155 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.864126 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:14.944555 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.944521 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/397ba0f6-e575-4e5b-926f-be25f031cbb8-configmap-serving-certs-ca-bundle\") pod \"397ba0f6-e575-4e5b-926f-be25f031cbb8\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " Apr 23 17:48:14.944555 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.944566 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-config\") pod \"397ba0f6-e575-4e5b-926f-be25f031cbb8\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " Apr 23 17:48:14.944800 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.944605 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-secret-grpc-tls\") pod \"397ba0f6-e575-4e5b-926f-be25f031cbb8\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " Apr 23 17:48:14.944800 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.944637 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/397ba0f6-e575-4e5b-926f-be25f031cbb8-prometheus-k8s-rulefiles-0\") pod \"397ba0f6-e575-4e5b-926f-be25f031cbb8\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " Apr 23 17:48:14.944800 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.944665 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/397ba0f6-e575-4e5b-926f-be25f031cbb8-prometheus-k8s-db\") pod \"397ba0f6-e575-4e5b-926f-be25f031cbb8\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " Apr 23 
17:48:14.944800 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.944690 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"397ba0f6-e575-4e5b-926f-be25f031cbb8\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " Apr 23 17:48:14.944800 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.944726 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/397ba0f6-e575-4e5b-926f-be25f031cbb8-configmap-kubelet-serving-ca-bundle\") pod \"397ba0f6-e575-4e5b-926f-be25f031cbb8\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " Apr 23 17:48:14.944800 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.944753 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8wtl\" (UniqueName: \"kubernetes.io/projected/397ba0f6-e575-4e5b-926f-be25f031cbb8-kube-api-access-k8wtl\") pod \"397ba0f6-e575-4e5b-926f-be25f031cbb8\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " Apr 23 17:48:14.944800 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.944785 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"397ba0f6-e575-4e5b-926f-be25f031cbb8\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " Apr 23 17:48:14.945183 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.944811 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-secret-prometheus-k8s-tls\") pod \"397ba0f6-e575-4e5b-926f-be25f031cbb8\" (UID: 
\"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " Apr 23 17:48:14.945183 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.944850 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-web-config\") pod \"397ba0f6-e575-4e5b-926f-be25f031cbb8\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " Apr 23 17:48:14.945183 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.944887 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/397ba0f6-e575-4e5b-926f-be25f031cbb8-configmap-metrics-client-ca\") pod \"397ba0f6-e575-4e5b-926f-be25f031cbb8\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " Apr 23 17:48:14.945183 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.944915 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/397ba0f6-e575-4e5b-926f-be25f031cbb8-tls-assets\") pod \"397ba0f6-e575-4e5b-926f-be25f031cbb8\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " Apr 23 17:48:14.945183 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.944947 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/397ba0f6-e575-4e5b-926f-be25f031cbb8-config-out\") pod \"397ba0f6-e575-4e5b-926f-be25f031cbb8\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " Apr 23 17:48:14.945183 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.944974 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-secret-metrics-client-certs\") pod \"397ba0f6-e575-4e5b-926f-be25f031cbb8\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " Apr 23 17:48:14.945183 ip-10-0-143-218 kubenswrapper[2581]: I0423 
17:48:14.945026 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-thanos-prometheus-http-client-file\") pod \"397ba0f6-e575-4e5b-926f-be25f031cbb8\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " Apr 23 17:48:14.945183 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.945083 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/397ba0f6-e575-4e5b-926f-be25f031cbb8-prometheus-trusted-ca-bundle\") pod \"397ba0f6-e575-4e5b-926f-be25f031cbb8\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " Apr 23 17:48:14.945183 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.945116 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-secret-kube-rbac-proxy\") pod \"397ba0f6-e575-4e5b-926f-be25f031cbb8\" (UID: \"397ba0f6-e575-4e5b-926f-be25f031cbb8\") " Apr 23 17:48:14.947226 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.944906 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/397ba0f6-e575-4e5b-926f-be25f031cbb8-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "397ba0f6-e575-4e5b-926f-be25f031cbb8" (UID: "397ba0f6-e575-4e5b-926f-be25f031cbb8"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:48:14.947226 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.945204 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/397ba0f6-e575-4e5b-926f-be25f031cbb8-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "397ba0f6-e575-4e5b-926f-be25f031cbb8" (UID: "397ba0f6-e575-4e5b-926f-be25f031cbb8"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:48:14.947226 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.946365 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/397ba0f6-e575-4e5b-926f-be25f031cbb8-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "397ba0f6-e575-4e5b-926f-be25f031cbb8" (UID: "397ba0f6-e575-4e5b-926f-be25f031cbb8"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:48:14.947226 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.946708 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/397ba0f6-e575-4e5b-926f-be25f031cbb8-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "397ba0f6-e575-4e5b-926f-be25f031cbb8" (UID: "397ba0f6-e575-4e5b-926f-be25f031cbb8"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:48:14.947582 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.947366 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/397ba0f6-e575-4e5b-926f-be25f031cbb8-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "397ba0f6-e575-4e5b-926f-be25f031cbb8" (UID: "397ba0f6-e575-4e5b-926f-be25f031cbb8"). InnerVolumeSpecName "configmap-metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:48:14.948333 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.948034 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "397ba0f6-e575-4e5b-926f-be25f031cbb8" (UID: "397ba0f6-e575-4e5b-926f-be25f031cbb8"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:48:14.948333 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.948100 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-config" (OuterVolumeSpecName: "config") pod "397ba0f6-e575-4e5b-926f-be25f031cbb8" (UID: "397ba0f6-e575-4e5b-926f-be25f031cbb8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:48:14.948657 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.948459 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/397ba0f6-e575-4e5b-926f-be25f031cbb8-kube-api-access-k8wtl" (OuterVolumeSpecName: "kube-api-access-k8wtl") pod "397ba0f6-e575-4e5b-926f-be25f031cbb8" (UID: "397ba0f6-e575-4e5b-926f-be25f031cbb8"). InnerVolumeSpecName "kube-api-access-k8wtl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:48:14.948657 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.948678 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/397ba0f6-e575-4e5b-926f-be25f031cbb8-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "397ba0f6-e575-4e5b-926f-be25f031cbb8" (UID: "397ba0f6-e575-4e5b-926f-be25f031cbb8"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:48:14.948860 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.948728 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "397ba0f6-e575-4e5b-926f-be25f031cbb8" (UID: "397ba0f6-e575-4e5b-926f-be25f031cbb8"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:48:14.949262 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.949233 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "397ba0f6-e575-4e5b-926f-be25f031cbb8" (UID: "397ba0f6-e575-4e5b-926f-be25f031cbb8"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:48:14.949759 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.949732 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "397ba0f6-e575-4e5b-926f-be25f031cbb8" (UID: "397ba0f6-e575-4e5b-926f-be25f031cbb8"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:48:14.950187 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.950169 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/397ba0f6-e575-4e5b-926f-be25f031cbb8-config-out" (OuterVolumeSpecName: "config-out") pod "397ba0f6-e575-4e5b-926f-be25f031cbb8" (UID: "397ba0f6-e575-4e5b-926f-be25f031cbb8"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:48:14.950532 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.950510 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "397ba0f6-e575-4e5b-926f-be25f031cbb8" (UID: "397ba0f6-e575-4e5b-926f-be25f031cbb8"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:48:14.950717 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.950698 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/397ba0f6-e575-4e5b-926f-be25f031cbb8-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "397ba0f6-e575-4e5b-926f-be25f031cbb8" (UID: "397ba0f6-e575-4e5b-926f-be25f031cbb8"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:48:14.950826 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.950811 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "397ba0f6-e575-4e5b-926f-be25f031cbb8" (UID: "397ba0f6-e575-4e5b-926f-be25f031cbb8"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:48:14.950887 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.950875 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "397ba0f6-e575-4e5b-926f-be25f031cbb8" (UID: "397ba0f6-e575-4e5b-926f-be25f031cbb8"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:48:14.960669 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.960644 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-web-config" (OuterVolumeSpecName: "web-config") pod "397ba0f6-e575-4e5b-926f-be25f031cbb8" (UID: "397ba0f6-e575-4e5b-926f-be25f031cbb8"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:48:14.964093 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.964061 2581 generic.go:358] "Generic (PLEG): container finished" podID="397ba0f6-e575-4e5b-926f-be25f031cbb8" containerID="118cc4dea66d0ac4c9a44179663156873ab6f00b4f0f527d537bef5601ce6a96" exitCode=0 Apr 23 17:48:14.964093 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.964091 2581 generic.go:358] "Generic (PLEG): container finished" podID="397ba0f6-e575-4e5b-926f-be25f031cbb8" containerID="813e4316af6c8dbf936513cff2a17e3d97c82cba20571618e91389a073dd1fc9" exitCode=0 Apr 23 17:48:14.964239 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.964101 2581 generic.go:358] "Generic (PLEG): container finished" podID="397ba0f6-e575-4e5b-926f-be25f031cbb8" containerID="c370500088f873126f196762028bba6d50825fe9cc267c5a07b84a13cf786860" exitCode=0 Apr 23 17:48:14.964239 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.964110 2581 generic.go:358] "Generic (PLEG): container finished" podID="397ba0f6-e575-4e5b-926f-be25f031cbb8" containerID="b50ecc7dc25deb929bdef9c72f3962ca8f7edf73542eac4241cb8a05ea99e8ed" exitCode=0 Apr 23 17:48:14.964239 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.964118 2581 generic.go:358] "Generic (PLEG): container finished" podID="397ba0f6-e575-4e5b-926f-be25f031cbb8" containerID="23b5b7da6c94db7b5dc3d5a157e1d9c0701c30ae8c351950164a0dfb2a544445" exitCode=0 Apr 23 17:48:14.964239 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.964126 2581 generic.go:358] "Generic (PLEG): 
container finished" podID="397ba0f6-e575-4e5b-926f-be25f031cbb8" containerID="90ac80226e39cd08592a77a86da4e47a47f2101072d765678ed20f2673beee4c" exitCode=0 Apr 23 17:48:14.964239 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.964142 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"397ba0f6-e575-4e5b-926f-be25f031cbb8","Type":"ContainerDied","Data":"118cc4dea66d0ac4c9a44179663156873ab6f00b4f0f527d537bef5601ce6a96"} Apr 23 17:48:14.964239 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.964189 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"397ba0f6-e575-4e5b-926f-be25f031cbb8","Type":"ContainerDied","Data":"813e4316af6c8dbf936513cff2a17e3d97c82cba20571618e91389a073dd1fc9"} Apr 23 17:48:14.964239 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.964192 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:14.964239 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.964207 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"397ba0f6-e575-4e5b-926f-be25f031cbb8","Type":"ContainerDied","Data":"c370500088f873126f196762028bba6d50825fe9cc267c5a07b84a13cf786860"} Apr 23 17:48:14.964239 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.964223 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"397ba0f6-e575-4e5b-926f-be25f031cbb8","Type":"ContainerDied","Data":"b50ecc7dc25deb929bdef9c72f3962ca8f7edf73542eac4241cb8a05ea99e8ed"} Apr 23 17:48:14.964239 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.964237 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"397ba0f6-e575-4e5b-926f-be25f031cbb8","Type":"ContainerDied","Data":"23b5b7da6c94db7b5dc3d5a157e1d9c0701c30ae8c351950164a0dfb2a544445"} Apr 23 17:48:14.964547 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.964248 2581 scope.go:117] "RemoveContainer" containerID="118cc4dea66d0ac4c9a44179663156873ab6f00b4f0f527d537bef5601ce6a96" Apr 23 17:48:14.964547 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.964250 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"397ba0f6-e575-4e5b-926f-be25f031cbb8","Type":"ContainerDied","Data":"90ac80226e39cd08592a77a86da4e47a47f2101072d765678ed20f2673beee4c"} Apr 23 17:48:14.964547 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.964264 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"397ba0f6-e575-4e5b-926f-be25f031cbb8","Type":"ContainerDied","Data":"a62f2dcf3f2bdcc3cfde494352fc69013b340ec4a9fb43108957045fe4d146ba"} Apr 23 17:48:14.971631 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.971608 2581 scope.go:117] "RemoveContainer" containerID="813e4316af6c8dbf936513cff2a17e3d97c82cba20571618e91389a073dd1fc9" Apr 23 17:48:14.978943 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.978924 2581 scope.go:117] "RemoveContainer" containerID="c370500088f873126f196762028bba6d50825fe9cc267c5a07b84a13cf786860" Apr 23 17:48:14.985363 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.985344 2581 scope.go:117] "RemoveContainer" containerID="b50ecc7dc25deb929bdef9c72f3962ca8f7edf73542eac4241cb8a05ea99e8ed" Apr 23 17:48:14.990837 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.990808 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 17:48:14.992810 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.992793 2581 scope.go:117] "RemoveContainer" containerID="23b5b7da6c94db7b5dc3d5a157e1d9c0701c30ae8c351950164a0dfb2a544445" Apr 23 
17:48:14.994877 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:14.994854 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 17:48:15.002037 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.002021 2581 scope.go:117] "RemoveContainer" containerID="90ac80226e39cd08592a77a86da4e47a47f2101072d765678ed20f2673beee4c" Apr 23 17:48:15.009180 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.009161 2581 scope.go:117] "RemoveContainer" containerID="8f70633f933223abfc827532770a572a9cf8e23f48287e90744e557d3d7e7e53" Apr 23 17:48:15.015390 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.015368 2581 scope.go:117] "RemoveContainer" containerID="118cc4dea66d0ac4c9a44179663156873ab6f00b4f0f527d537bef5601ce6a96" Apr 23 17:48:15.015620 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:48:15.015601 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"118cc4dea66d0ac4c9a44179663156873ab6f00b4f0f527d537bef5601ce6a96\": container with ID starting with 118cc4dea66d0ac4c9a44179663156873ab6f00b4f0f527d537bef5601ce6a96 not found: ID does not exist" containerID="118cc4dea66d0ac4c9a44179663156873ab6f00b4f0f527d537bef5601ce6a96" Apr 23 17:48:15.015697 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.015626 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"118cc4dea66d0ac4c9a44179663156873ab6f00b4f0f527d537bef5601ce6a96"} err="failed to get container status \"118cc4dea66d0ac4c9a44179663156873ab6f00b4f0f527d537bef5601ce6a96\": rpc error: code = NotFound desc = could not find container \"118cc4dea66d0ac4c9a44179663156873ab6f00b4f0f527d537bef5601ce6a96\": container with ID starting with 118cc4dea66d0ac4c9a44179663156873ab6f00b4f0f527d537bef5601ce6a96 not found: ID does not exist" Apr 23 17:48:15.015697 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.015644 2581 scope.go:117] "RemoveContainer" 
containerID="813e4316af6c8dbf936513cff2a17e3d97c82cba20571618e91389a073dd1fc9" Apr 23 17:48:15.015882 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:48:15.015867 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"813e4316af6c8dbf936513cff2a17e3d97c82cba20571618e91389a073dd1fc9\": container with ID starting with 813e4316af6c8dbf936513cff2a17e3d97c82cba20571618e91389a073dd1fc9 not found: ID does not exist" containerID="813e4316af6c8dbf936513cff2a17e3d97c82cba20571618e91389a073dd1fc9" Apr 23 17:48:15.015925 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.015886 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"813e4316af6c8dbf936513cff2a17e3d97c82cba20571618e91389a073dd1fc9"} err="failed to get container status \"813e4316af6c8dbf936513cff2a17e3d97c82cba20571618e91389a073dd1fc9\": rpc error: code = NotFound desc = could not find container \"813e4316af6c8dbf936513cff2a17e3d97c82cba20571618e91389a073dd1fc9\": container with ID starting with 813e4316af6c8dbf936513cff2a17e3d97c82cba20571618e91389a073dd1fc9 not found: ID does not exist" Apr 23 17:48:15.015925 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.015900 2581 scope.go:117] "RemoveContainer" containerID="c370500088f873126f196762028bba6d50825fe9cc267c5a07b84a13cf786860" Apr 23 17:48:15.016227 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:48:15.016206 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c370500088f873126f196762028bba6d50825fe9cc267c5a07b84a13cf786860\": container with ID starting with c370500088f873126f196762028bba6d50825fe9cc267c5a07b84a13cf786860 not found: ID does not exist" containerID="c370500088f873126f196762028bba6d50825fe9cc267c5a07b84a13cf786860" Apr 23 17:48:15.016305 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.016231 2581 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"c370500088f873126f196762028bba6d50825fe9cc267c5a07b84a13cf786860"} err="failed to get container status \"c370500088f873126f196762028bba6d50825fe9cc267c5a07b84a13cf786860\": rpc error: code = NotFound desc = could not find container \"c370500088f873126f196762028bba6d50825fe9cc267c5a07b84a13cf786860\": container with ID starting with c370500088f873126f196762028bba6d50825fe9cc267c5a07b84a13cf786860 not found: ID does not exist" Apr 23 17:48:15.016305 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.016247 2581 scope.go:117] "RemoveContainer" containerID="b50ecc7dc25deb929bdef9c72f3962ca8f7edf73542eac4241cb8a05ea99e8ed" Apr 23 17:48:15.016456 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:48:15.016441 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b50ecc7dc25deb929bdef9c72f3962ca8f7edf73542eac4241cb8a05ea99e8ed\": container with ID starting with b50ecc7dc25deb929bdef9c72f3962ca8f7edf73542eac4241cb8a05ea99e8ed not found: ID does not exist" containerID="b50ecc7dc25deb929bdef9c72f3962ca8f7edf73542eac4241cb8a05ea99e8ed" Apr 23 17:48:15.016494 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.016463 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b50ecc7dc25deb929bdef9c72f3962ca8f7edf73542eac4241cb8a05ea99e8ed"} err="failed to get container status \"b50ecc7dc25deb929bdef9c72f3962ca8f7edf73542eac4241cb8a05ea99e8ed\": rpc error: code = NotFound desc = could not find container \"b50ecc7dc25deb929bdef9c72f3962ca8f7edf73542eac4241cb8a05ea99e8ed\": container with ID starting with b50ecc7dc25deb929bdef9c72f3962ca8f7edf73542eac4241cb8a05ea99e8ed not found: ID does not exist" Apr 23 17:48:15.016494 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.016476 2581 scope.go:117] "RemoveContainer" containerID="23b5b7da6c94db7b5dc3d5a157e1d9c0701c30ae8c351950164a0dfb2a544445" Apr 23 17:48:15.016703 
ip-10-0-143-218 kubenswrapper[2581]: E0423 17:48:15.016688 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23b5b7da6c94db7b5dc3d5a157e1d9c0701c30ae8c351950164a0dfb2a544445\": container with ID starting with 23b5b7da6c94db7b5dc3d5a157e1d9c0701c30ae8c351950164a0dfb2a544445 not found: ID does not exist" containerID="23b5b7da6c94db7b5dc3d5a157e1d9c0701c30ae8c351950164a0dfb2a544445"
Apr 23 17:48:15.016754 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.016709 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23b5b7da6c94db7b5dc3d5a157e1d9c0701c30ae8c351950164a0dfb2a544445"} err="failed to get container status \"23b5b7da6c94db7b5dc3d5a157e1d9c0701c30ae8c351950164a0dfb2a544445\": rpc error: code = NotFound desc = could not find container \"23b5b7da6c94db7b5dc3d5a157e1d9c0701c30ae8c351950164a0dfb2a544445\": container with ID starting with 23b5b7da6c94db7b5dc3d5a157e1d9c0701c30ae8c351950164a0dfb2a544445 not found: ID does not exist"
Apr 23 17:48:15.016754 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.016724 2581 scope.go:117] "RemoveContainer" containerID="90ac80226e39cd08592a77a86da4e47a47f2101072d765678ed20f2673beee4c"
Apr 23 17:48:15.016941 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:48:15.016927 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90ac80226e39cd08592a77a86da4e47a47f2101072d765678ed20f2673beee4c\": container with ID starting with 90ac80226e39cd08592a77a86da4e47a47f2101072d765678ed20f2673beee4c not found: ID does not exist" containerID="90ac80226e39cd08592a77a86da4e47a47f2101072d765678ed20f2673beee4c"
Apr 23 17:48:15.017010 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.016943 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90ac80226e39cd08592a77a86da4e47a47f2101072d765678ed20f2673beee4c"} err="failed to get container status \"90ac80226e39cd08592a77a86da4e47a47f2101072d765678ed20f2673beee4c\": rpc error: code = NotFound desc = could not find container \"90ac80226e39cd08592a77a86da4e47a47f2101072d765678ed20f2673beee4c\": container with ID starting with 90ac80226e39cd08592a77a86da4e47a47f2101072d765678ed20f2673beee4c not found: ID does not exist"
Apr 23 17:48:15.017010 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.016955 2581 scope.go:117] "RemoveContainer" containerID="8f70633f933223abfc827532770a572a9cf8e23f48287e90744e557d3d7e7e53"
Apr 23 17:48:15.017135 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:48:15.017122 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f70633f933223abfc827532770a572a9cf8e23f48287e90744e557d3d7e7e53\": container with ID starting with 8f70633f933223abfc827532770a572a9cf8e23f48287e90744e557d3d7e7e53 not found: ID does not exist" containerID="8f70633f933223abfc827532770a572a9cf8e23f48287e90744e557d3d7e7e53"
Apr 23 17:48:15.017178 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.017138 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f70633f933223abfc827532770a572a9cf8e23f48287e90744e557d3d7e7e53"} err="failed to get container status \"8f70633f933223abfc827532770a572a9cf8e23f48287e90744e557d3d7e7e53\": rpc error: code = NotFound desc = could not find container \"8f70633f933223abfc827532770a572a9cf8e23f48287e90744e557d3d7e7e53\": container with ID starting with 8f70633f933223abfc827532770a572a9cf8e23f48287e90744e557d3d7e7e53 not found: ID does not exist"
Apr 23 17:48:15.017178 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.017148 2581 scope.go:117] "RemoveContainer" containerID="118cc4dea66d0ac4c9a44179663156873ab6f00b4f0f527d537bef5601ce6a96"
Apr 23 17:48:15.017326 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.017309 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"118cc4dea66d0ac4c9a44179663156873ab6f00b4f0f527d537bef5601ce6a96"} err="failed to get container status \"118cc4dea66d0ac4c9a44179663156873ab6f00b4f0f527d537bef5601ce6a96\": rpc error: code = NotFound desc = could not find container \"118cc4dea66d0ac4c9a44179663156873ab6f00b4f0f527d537bef5601ce6a96\": container with ID starting with 118cc4dea66d0ac4c9a44179663156873ab6f00b4f0f527d537bef5601ce6a96 not found: ID does not exist"
Apr 23 17:48:15.017368 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.017326 2581 scope.go:117] "RemoveContainer" containerID="813e4316af6c8dbf936513cff2a17e3d97c82cba20571618e91389a073dd1fc9"
Apr 23 17:48:15.017516 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.017501 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"813e4316af6c8dbf936513cff2a17e3d97c82cba20571618e91389a073dd1fc9"} err="failed to get container status \"813e4316af6c8dbf936513cff2a17e3d97c82cba20571618e91389a073dd1fc9\": rpc error: code = NotFound desc = could not find container \"813e4316af6c8dbf936513cff2a17e3d97c82cba20571618e91389a073dd1fc9\": container with ID starting with 813e4316af6c8dbf936513cff2a17e3d97c82cba20571618e91389a073dd1fc9 not found: ID does not exist"
Apr 23 17:48:15.017565 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.017516 2581 scope.go:117] "RemoveContainer" containerID="c370500088f873126f196762028bba6d50825fe9cc267c5a07b84a13cf786860"
Apr 23 17:48:15.017730 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.017716 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c370500088f873126f196762028bba6d50825fe9cc267c5a07b84a13cf786860"} err="failed to get container status \"c370500088f873126f196762028bba6d50825fe9cc267c5a07b84a13cf786860\": rpc error: code = NotFound desc = could not find container \"c370500088f873126f196762028bba6d50825fe9cc267c5a07b84a13cf786860\": container with ID starting with c370500088f873126f196762028bba6d50825fe9cc267c5a07b84a13cf786860 not found: ID does not exist"
Apr 23 17:48:15.017775 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.017730 2581 scope.go:117] "RemoveContainer" containerID="b50ecc7dc25deb929bdef9c72f3962ca8f7edf73542eac4241cb8a05ea99e8ed"
Apr 23 17:48:15.017917 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.017903 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b50ecc7dc25deb929bdef9c72f3962ca8f7edf73542eac4241cb8a05ea99e8ed"} err="failed to get container status \"b50ecc7dc25deb929bdef9c72f3962ca8f7edf73542eac4241cb8a05ea99e8ed\": rpc error: code = NotFound desc = could not find container \"b50ecc7dc25deb929bdef9c72f3962ca8f7edf73542eac4241cb8a05ea99e8ed\": container with ID starting with b50ecc7dc25deb929bdef9c72f3962ca8f7edf73542eac4241cb8a05ea99e8ed not found: ID does not exist"
Apr 23 17:48:15.017963 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.017917 2581 scope.go:117] "RemoveContainer" containerID="23b5b7da6c94db7b5dc3d5a157e1d9c0701c30ae8c351950164a0dfb2a544445"
Apr 23 17:48:15.018337 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.018322 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23b5b7da6c94db7b5dc3d5a157e1d9c0701c30ae8c351950164a0dfb2a544445"} err="failed to get container status \"23b5b7da6c94db7b5dc3d5a157e1d9c0701c30ae8c351950164a0dfb2a544445\": rpc error: code = NotFound desc = could not find container \"23b5b7da6c94db7b5dc3d5a157e1d9c0701c30ae8c351950164a0dfb2a544445\": container with ID starting with 23b5b7da6c94db7b5dc3d5a157e1d9c0701c30ae8c351950164a0dfb2a544445 not found: ID does not exist"
Apr 23 17:48:15.018394 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.018337 2581 scope.go:117] "RemoveContainer" containerID="90ac80226e39cd08592a77a86da4e47a47f2101072d765678ed20f2673beee4c"
Apr 23 17:48:15.018497 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.018483 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90ac80226e39cd08592a77a86da4e47a47f2101072d765678ed20f2673beee4c"} err="failed to get container status \"90ac80226e39cd08592a77a86da4e47a47f2101072d765678ed20f2673beee4c\": rpc error: code = NotFound desc = could not find container \"90ac80226e39cd08592a77a86da4e47a47f2101072d765678ed20f2673beee4c\": container with ID starting with 90ac80226e39cd08592a77a86da4e47a47f2101072d765678ed20f2673beee4c not found: ID does not exist"
Apr 23 17:48:15.018537 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.018497 2581 scope.go:117] "RemoveContainer" containerID="8f70633f933223abfc827532770a572a9cf8e23f48287e90744e557d3d7e7e53"
Apr 23 17:48:15.018718 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.018700 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f70633f933223abfc827532770a572a9cf8e23f48287e90744e557d3d7e7e53"} err="failed to get container status \"8f70633f933223abfc827532770a572a9cf8e23f48287e90744e557d3d7e7e53\": rpc error: code = NotFound desc = could not find container \"8f70633f933223abfc827532770a572a9cf8e23f48287e90744e557d3d7e7e53\": container with ID starting with 8f70633f933223abfc827532770a572a9cf8e23f48287e90744e557d3d7e7e53 not found: ID does not exist"
Apr 23 17:48:15.018775 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.018719 2581 scope.go:117] "RemoveContainer" containerID="118cc4dea66d0ac4c9a44179663156873ab6f00b4f0f527d537bef5601ce6a96"
Apr 23 17:48:15.018922 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.018905 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"118cc4dea66d0ac4c9a44179663156873ab6f00b4f0f527d537bef5601ce6a96"} err="failed to get container status \"118cc4dea66d0ac4c9a44179663156873ab6f00b4f0f527d537bef5601ce6a96\": rpc error: code = NotFound desc = could not find container \"118cc4dea66d0ac4c9a44179663156873ab6f00b4f0f527d537bef5601ce6a96\": container with ID starting with 118cc4dea66d0ac4c9a44179663156873ab6f00b4f0f527d537bef5601ce6a96 not found: ID does not exist"
Apr 23 17:48:15.018922 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.018922 2581 scope.go:117] "RemoveContainer" containerID="813e4316af6c8dbf936513cff2a17e3d97c82cba20571618e91389a073dd1fc9"
Apr 23 17:48:15.019143 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.019127 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"813e4316af6c8dbf936513cff2a17e3d97c82cba20571618e91389a073dd1fc9"} err="failed to get container status \"813e4316af6c8dbf936513cff2a17e3d97c82cba20571618e91389a073dd1fc9\": rpc error: code = NotFound desc = could not find container \"813e4316af6c8dbf936513cff2a17e3d97c82cba20571618e91389a073dd1fc9\": container with ID starting with 813e4316af6c8dbf936513cff2a17e3d97c82cba20571618e91389a073dd1fc9 not found: ID does not exist"
Apr 23 17:48:15.019189 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.019143 2581 scope.go:117] "RemoveContainer" containerID="c370500088f873126f196762028bba6d50825fe9cc267c5a07b84a13cf786860"
Apr 23 17:48:15.019338 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.019322 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c370500088f873126f196762028bba6d50825fe9cc267c5a07b84a13cf786860"} err="failed to get container status \"c370500088f873126f196762028bba6d50825fe9cc267c5a07b84a13cf786860\": rpc error: code = NotFound desc = could not find container \"c370500088f873126f196762028bba6d50825fe9cc267c5a07b84a13cf786860\": container with ID starting with c370500088f873126f196762028bba6d50825fe9cc267c5a07b84a13cf786860 not found: ID does not exist"
Apr 23 17:48:15.019378 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.019338 2581 scope.go:117] "RemoveContainer" containerID="b50ecc7dc25deb929bdef9c72f3962ca8f7edf73542eac4241cb8a05ea99e8ed"
Apr 23 17:48:15.019509 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.019495 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b50ecc7dc25deb929bdef9c72f3962ca8f7edf73542eac4241cb8a05ea99e8ed"} err="failed to get container status \"b50ecc7dc25deb929bdef9c72f3962ca8f7edf73542eac4241cb8a05ea99e8ed\": rpc error: code = NotFound desc = could not find container \"b50ecc7dc25deb929bdef9c72f3962ca8f7edf73542eac4241cb8a05ea99e8ed\": container with ID starting with b50ecc7dc25deb929bdef9c72f3962ca8f7edf73542eac4241cb8a05ea99e8ed not found: ID does not exist"
Apr 23 17:48:15.019550 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.019511 2581 scope.go:117] "RemoveContainer" containerID="23b5b7da6c94db7b5dc3d5a157e1d9c0701c30ae8c351950164a0dfb2a544445"
Apr 23 17:48:15.019719 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.019698 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23b5b7da6c94db7b5dc3d5a157e1d9c0701c30ae8c351950164a0dfb2a544445"} err="failed to get container status \"23b5b7da6c94db7b5dc3d5a157e1d9c0701c30ae8c351950164a0dfb2a544445\": rpc error: code = NotFound desc = could not find container \"23b5b7da6c94db7b5dc3d5a157e1d9c0701c30ae8c351950164a0dfb2a544445\": container with ID starting with 23b5b7da6c94db7b5dc3d5a157e1d9c0701c30ae8c351950164a0dfb2a544445 not found: ID does not exist"
Apr 23 17:48:15.019768 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.019719 2581 scope.go:117] "RemoveContainer" containerID="90ac80226e39cd08592a77a86da4e47a47f2101072d765678ed20f2673beee4c"
Apr 23 17:48:15.019910 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.019884 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90ac80226e39cd08592a77a86da4e47a47f2101072d765678ed20f2673beee4c"} err="failed to get container status \"90ac80226e39cd08592a77a86da4e47a47f2101072d765678ed20f2673beee4c\": rpc error: code = NotFound desc = could not find container \"90ac80226e39cd08592a77a86da4e47a47f2101072d765678ed20f2673beee4c\": container with ID starting with 90ac80226e39cd08592a77a86da4e47a47f2101072d765678ed20f2673beee4c not found: ID does not exist"
Apr 23 17:48:15.019955 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.019910 2581 scope.go:117] "RemoveContainer" containerID="8f70633f933223abfc827532770a572a9cf8e23f48287e90744e557d3d7e7e53"
Apr 23 17:48:15.020165 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.020148 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f70633f933223abfc827532770a572a9cf8e23f48287e90744e557d3d7e7e53"} err="failed to get container status \"8f70633f933223abfc827532770a572a9cf8e23f48287e90744e557d3d7e7e53\": rpc error: code = NotFound desc = could not find container \"8f70633f933223abfc827532770a572a9cf8e23f48287e90744e557d3d7e7e53\": container with ID starting with 8f70633f933223abfc827532770a572a9cf8e23f48287e90744e557d3d7e7e53 not found: ID does not exist"
Apr 23 17:48:15.020215 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.020166 2581 scope.go:117] "RemoveContainer" containerID="118cc4dea66d0ac4c9a44179663156873ab6f00b4f0f527d537bef5601ce6a96"
Apr 23 17:48:15.020366 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.020350 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"118cc4dea66d0ac4c9a44179663156873ab6f00b4f0f527d537bef5601ce6a96"} err="failed to get container status \"118cc4dea66d0ac4c9a44179663156873ab6f00b4f0f527d537bef5601ce6a96\": rpc error: code = NotFound desc = could not find container \"118cc4dea66d0ac4c9a44179663156873ab6f00b4f0f527d537bef5601ce6a96\": container with ID starting with 118cc4dea66d0ac4c9a44179663156873ab6f00b4f0f527d537bef5601ce6a96 not found: ID does not exist"
Apr 23 17:48:15.020410 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.020367 2581 scope.go:117] "RemoveContainer" containerID="813e4316af6c8dbf936513cff2a17e3d97c82cba20571618e91389a073dd1fc9"
Apr 23 17:48:15.020559 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.020521 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"813e4316af6c8dbf936513cff2a17e3d97c82cba20571618e91389a073dd1fc9"} err="failed to get container status \"813e4316af6c8dbf936513cff2a17e3d97c82cba20571618e91389a073dd1fc9\": rpc error: code = NotFound desc = could not find container \"813e4316af6c8dbf936513cff2a17e3d97c82cba20571618e91389a073dd1fc9\": container with ID starting with 813e4316af6c8dbf936513cff2a17e3d97c82cba20571618e91389a073dd1fc9 not found: ID does not exist"
Apr 23 17:48:15.020559 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.020559 2581 scope.go:117] "RemoveContainer" containerID="c370500088f873126f196762028bba6d50825fe9cc267c5a07b84a13cf786860"
Apr 23 17:48:15.020783 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.020765 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c370500088f873126f196762028bba6d50825fe9cc267c5a07b84a13cf786860"} err="failed to get container status \"c370500088f873126f196762028bba6d50825fe9cc267c5a07b84a13cf786860\": rpc error: code = NotFound desc = could not find container \"c370500088f873126f196762028bba6d50825fe9cc267c5a07b84a13cf786860\": container with ID starting with c370500088f873126f196762028bba6d50825fe9cc267c5a07b84a13cf786860 not found: ID does not exist"
Apr 23 17:48:15.020831 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.020783 2581 scope.go:117] "RemoveContainer" containerID="b50ecc7dc25deb929bdef9c72f3962ca8f7edf73542eac4241cb8a05ea99e8ed"
Apr 23 17:48:15.021016 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.020975 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b50ecc7dc25deb929bdef9c72f3962ca8f7edf73542eac4241cb8a05ea99e8ed"} err="failed to get container status \"b50ecc7dc25deb929bdef9c72f3962ca8f7edf73542eac4241cb8a05ea99e8ed\": rpc error: code = NotFound desc = could not find container \"b50ecc7dc25deb929bdef9c72f3962ca8f7edf73542eac4241cb8a05ea99e8ed\": container with ID starting with b50ecc7dc25deb929bdef9c72f3962ca8f7edf73542eac4241cb8a05ea99e8ed not found: ID does not exist"
Apr 23 17:48:15.021016 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.021004 2581 scope.go:117] "RemoveContainer" containerID="23b5b7da6c94db7b5dc3d5a157e1d9c0701c30ae8c351950164a0dfb2a544445"
Apr 23 17:48:15.021209 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.021195 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23b5b7da6c94db7b5dc3d5a157e1d9c0701c30ae8c351950164a0dfb2a544445"} err="failed to get container status \"23b5b7da6c94db7b5dc3d5a157e1d9c0701c30ae8c351950164a0dfb2a544445\": rpc error: code = NotFound desc = could not find container \"23b5b7da6c94db7b5dc3d5a157e1d9c0701c30ae8c351950164a0dfb2a544445\": container with ID starting with 23b5b7da6c94db7b5dc3d5a157e1d9c0701c30ae8c351950164a0dfb2a544445 not found: ID does not exist"
Apr 23 17:48:15.021209 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.021209 2581 scope.go:117] "RemoveContainer" containerID="90ac80226e39cd08592a77a86da4e47a47f2101072d765678ed20f2673beee4c"
Apr 23 17:48:15.021423 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.021404 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90ac80226e39cd08592a77a86da4e47a47f2101072d765678ed20f2673beee4c"} err="failed to get container status \"90ac80226e39cd08592a77a86da4e47a47f2101072d765678ed20f2673beee4c\": rpc error: code = NotFound desc = could not find container \"90ac80226e39cd08592a77a86da4e47a47f2101072d765678ed20f2673beee4c\": container with ID starting with 90ac80226e39cd08592a77a86da4e47a47f2101072d765678ed20f2673beee4c not found: ID does not exist"
Apr 23 17:48:15.021470 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.021425 2581 scope.go:117] "RemoveContainer" containerID="8f70633f933223abfc827532770a572a9cf8e23f48287e90744e557d3d7e7e53"
Apr 23 17:48:15.021624 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.021609 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f70633f933223abfc827532770a572a9cf8e23f48287e90744e557d3d7e7e53"} err="failed to get container status \"8f70633f933223abfc827532770a572a9cf8e23f48287e90744e557d3d7e7e53\": rpc error: code = NotFound desc = could not find container \"8f70633f933223abfc827532770a572a9cf8e23f48287e90744e557d3d7e7e53\": container with ID starting with 8f70633f933223abfc827532770a572a9cf8e23f48287e90744e557d3d7e7e53 not found: ID does not exist"
Apr 23 17:48:15.021669 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.021624 2581 scope.go:117] "RemoveContainer" containerID="118cc4dea66d0ac4c9a44179663156873ab6f00b4f0f527d537bef5601ce6a96"
Apr 23 17:48:15.021818 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.021804 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"118cc4dea66d0ac4c9a44179663156873ab6f00b4f0f527d537bef5601ce6a96"} err="failed to get container status \"118cc4dea66d0ac4c9a44179663156873ab6f00b4f0f527d537bef5601ce6a96\": rpc error: code = NotFound desc = could not find container \"118cc4dea66d0ac4c9a44179663156873ab6f00b4f0f527d537bef5601ce6a96\": container with ID starting with 118cc4dea66d0ac4c9a44179663156873ab6f00b4f0f527d537bef5601ce6a96 not found: ID does not exist"
Apr 23 17:48:15.021818 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.021817 2581 scope.go:117] "RemoveContainer" containerID="813e4316af6c8dbf936513cff2a17e3d97c82cba20571618e91389a073dd1fc9"
Apr 23 17:48:15.022051 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.022032 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"813e4316af6c8dbf936513cff2a17e3d97c82cba20571618e91389a073dd1fc9"} err="failed to get container status \"813e4316af6c8dbf936513cff2a17e3d97c82cba20571618e91389a073dd1fc9\": rpc error: code = NotFound desc = could not find container \"813e4316af6c8dbf936513cff2a17e3d97c82cba20571618e91389a073dd1fc9\": container with ID starting with 813e4316af6c8dbf936513cff2a17e3d97c82cba20571618e91389a073dd1fc9 not found: ID does not exist"
Apr 23 17:48:15.022115 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.022052 2581 scope.go:117] "RemoveContainer" containerID="c370500088f873126f196762028bba6d50825fe9cc267c5a07b84a13cf786860"
Apr 23 17:48:15.022427 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.022303 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c370500088f873126f196762028bba6d50825fe9cc267c5a07b84a13cf786860"} err="failed to get container status \"c370500088f873126f196762028bba6d50825fe9cc267c5a07b84a13cf786860\": rpc error: code = NotFound desc = could not find container \"c370500088f873126f196762028bba6d50825fe9cc267c5a07b84a13cf786860\": container with ID starting with c370500088f873126f196762028bba6d50825fe9cc267c5a07b84a13cf786860 not found: ID does not exist"
Apr 23 17:48:15.022427 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.022329 2581 scope.go:117] "RemoveContainer" containerID="b50ecc7dc25deb929bdef9c72f3962ca8f7edf73542eac4241cb8a05ea99e8ed"
Apr 23 17:48:15.022554 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.022534 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b50ecc7dc25deb929bdef9c72f3962ca8f7edf73542eac4241cb8a05ea99e8ed"} err="failed to get container status \"b50ecc7dc25deb929bdef9c72f3962ca8f7edf73542eac4241cb8a05ea99e8ed\": rpc error: code = NotFound desc = could not find container \"b50ecc7dc25deb929bdef9c72f3962ca8f7edf73542eac4241cb8a05ea99e8ed\": container with ID starting with b50ecc7dc25deb929bdef9c72f3962ca8f7edf73542eac4241cb8a05ea99e8ed not found: ID does not exist"
Apr 23 17:48:15.022554 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.022551 2581 scope.go:117] "RemoveContainer" containerID="23b5b7da6c94db7b5dc3d5a157e1d9c0701c30ae8c351950164a0dfb2a544445"
Apr 23 17:48:15.022772 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.022752 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23b5b7da6c94db7b5dc3d5a157e1d9c0701c30ae8c351950164a0dfb2a544445"} err="failed to get container status \"23b5b7da6c94db7b5dc3d5a157e1d9c0701c30ae8c351950164a0dfb2a544445\": rpc error: code = NotFound desc = could not find container \"23b5b7da6c94db7b5dc3d5a157e1d9c0701c30ae8c351950164a0dfb2a544445\": container with ID starting with 23b5b7da6c94db7b5dc3d5a157e1d9c0701c30ae8c351950164a0dfb2a544445 not found: ID does not exist"
Apr 23 17:48:15.022819 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.022774 2581 scope.go:117] "RemoveContainer" containerID="90ac80226e39cd08592a77a86da4e47a47f2101072d765678ed20f2673beee4c"
Apr 23 17:48:15.022997 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.022965 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90ac80226e39cd08592a77a86da4e47a47f2101072d765678ed20f2673beee4c"} err="failed to get container status \"90ac80226e39cd08592a77a86da4e47a47f2101072d765678ed20f2673beee4c\": rpc error: code = NotFound desc = could not find container \"90ac80226e39cd08592a77a86da4e47a47f2101072d765678ed20f2673beee4c\": container with ID starting with 90ac80226e39cd08592a77a86da4e47a47f2101072d765678ed20f2673beee4c not found: ID does not exist"
Apr 23 17:48:15.023044 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.023004 2581 scope.go:117] "RemoveContainer" containerID="8f70633f933223abfc827532770a572a9cf8e23f48287e90744e557d3d7e7e53"
Apr 23 17:48:15.023200 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.023184 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f70633f933223abfc827532770a572a9cf8e23f48287e90744e557d3d7e7e53"} err="failed to get container status \"8f70633f933223abfc827532770a572a9cf8e23f48287e90744e557d3d7e7e53\": rpc error: code = NotFound desc = could not find container \"8f70633f933223abfc827532770a572a9cf8e23f48287e90744e557d3d7e7e53\": container with ID starting with 8f70633f933223abfc827532770a572a9cf8e23f48287e90744e557d3d7e7e53 not found: ID does not exist"
Apr 23 17:48:15.023243 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.023202 2581 scope.go:117] "RemoveContainer" containerID="118cc4dea66d0ac4c9a44179663156873ab6f00b4f0f527d537bef5601ce6a96"
Apr 23 17:48:15.023378 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.023363 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"118cc4dea66d0ac4c9a44179663156873ab6f00b4f0f527d537bef5601ce6a96"} err="failed to get container status \"118cc4dea66d0ac4c9a44179663156873ab6f00b4f0f527d537bef5601ce6a96\": rpc error: code = NotFound desc = could not find container \"118cc4dea66d0ac4c9a44179663156873ab6f00b4f0f527d537bef5601ce6a96\": container with ID starting with 118cc4dea66d0ac4c9a44179663156873ab6f00b4f0f527d537bef5601ce6a96 not found: ID does not exist"
Apr 23 17:48:15.023427 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.023377 2581 scope.go:117] "RemoveContainer" containerID="813e4316af6c8dbf936513cff2a17e3d97c82cba20571618e91389a073dd1fc9"
Apr 23 17:48:15.023572 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.023553 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"813e4316af6c8dbf936513cff2a17e3d97c82cba20571618e91389a073dd1fc9"} err="failed to get container status \"813e4316af6c8dbf936513cff2a17e3d97c82cba20571618e91389a073dd1fc9\": rpc error: code = NotFound desc = could not find container \"813e4316af6c8dbf936513cff2a17e3d97c82cba20571618e91389a073dd1fc9\": container with ID starting with 813e4316af6c8dbf936513cff2a17e3d97c82cba20571618e91389a073dd1fc9 not found: ID does not exist"
Apr 23 17:48:15.023617 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.023572 2581 scope.go:117] "RemoveContainer" containerID="c370500088f873126f196762028bba6d50825fe9cc267c5a07b84a13cf786860"
Apr 23 17:48:15.023748 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.023732 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c370500088f873126f196762028bba6d50825fe9cc267c5a07b84a13cf786860"} err="failed to get container status \"c370500088f873126f196762028bba6d50825fe9cc267c5a07b84a13cf786860\": rpc error: code = NotFound desc = could not find container \"c370500088f873126f196762028bba6d50825fe9cc267c5a07b84a13cf786860\": container with ID starting with c370500088f873126f196762028bba6d50825fe9cc267c5a07b84a13cf786860 not found: ID does not exist"
Apr 23 17:48:15.023788 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.023749 2581 scope.go:117] "RemoveContainer" containerID="b50ecc7dc25deb929bdef9c72f3962ca8f7edf73542eac4241cb8a05ea99e8ed"
Apr 23 17:48:15.023995 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.023921 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b50ecc7dc25deb929bdef9c72f3962ca8f7edf73542eac4241cb8a05ea99e8ed"} err="failed to get container status \"b50ecc7dc25deb929bdef9c72f3962ca8f7edf73542eac4241cb8a05ea99e8ed\": rpc error: code = NotFound desc = could not find container \"b50ecc7dc25deb929bdef9c72f3962ca8f7edf73542eac4241cb8a05ea99e8ed\": container with ID starting with b50ecc7dc25deb929bdef9c72f3962ca8f7edf73542eac4241cb8a05ea99e8ed not found: ID does not exist"
Apr 23 17:48:15.023995 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.023937 2581 scope.go:117] "RemoveContainer" containerID="23b5b7da6c94db7b5dc3d5a157e1d9c0701c30ae8c351950164a0dfb2a544445"
Apr 23 17:48:15.024221 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.024141 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23b5b7da6c94db7b5dc3d5a157e1d9c0701c30ae8c351950164a0dfb2a544445"} err="failed to get container status \"23b5b7da6c94db7b5dc3d5a157e1d9c0701c30ae8c351950164a0dfb2a544445\": rpc error: code = NotFound desc = could not find container \"23b5b7da6c94db7b5dc3d5a157e1d9c0701c30ae8c351950164a0dfb2a544445\": container with ID starting with 23b5b7da6c94db7b5dc3d5a157e1d9c0701c30ae8c351950164a0dfb2a544445 not found: ID does not exist"
Apr 23 17:48:15.024221 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.024165 2581 scope.go:117] "RemoveContainer" containerID="90ac80226e39cd08592a77a86da4e47a47f2101072d765678ed20f2673beee4c"
Apr 23 17:48:15.024330 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.024315 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90ac80226e39cd08592a77a86da4e47a47f2101072d765678ed20f2673beee4c"} err="failed to get container status \"90ac80226e39cd08592a77a86da4e47a47f2101072d765678ed20f2673beee4c\": rpc error: code = NotFound desc = could not find container \"90ac80226e39cd08592a77a86da4e47a47f2101072d765678ed20f2673beee4c\": container with ID starting with 90ac80226e39cd08592a77a86da4e47a47f2101072d765678ed20f2673beee4c not found: ID does not exist"
Apr 23 17:48:15.024380 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.024330 2581 scope.go:117] "RemoveContainer" containerID="8f70633f933223abfc827532770a572a9cf8e23f48287e90744e557d3d7e7e53"
Apr 23 17:48:15.024503 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.024486 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f70633f933223abfc827532770a572a9cf8e23f48287e90744e557d3d7e7e53"} err="failed to get container status \"8f70633f933223abfc827532770a572a9cf8e23f48287e90744e557d3d7e7e53\": rpc error: code = NotFound desc = could not find container \"8f70633f933223abfc827532770a572a9cf8e23f48287e90744e557d3d7e7e53\": container with ID starting with 8f70633f933223abfc827532770a572a9cf8e23f48287e90744e557d3d7e7e53 not found: ID does not exist"
Apr 23 17:48:15.031950 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.031930 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 17:48:15.032239 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.032227 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="397ba0f6-e575-4e5b-926f-be25f031cbb8" containerName="kube-rbac-proxy-thanos"
Apr 23 17:48:15.032283 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.032242 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="397ba0f6-e575-4e5b-926f-be25f031cbb8" containerName="kube-rbac-proxy-thanos"
Apr 23 17:48:15.032283 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.032254 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="397ba0f6-e575-4e5b-926f-be25f031cbb8" containerName="prometheus"
Apr 23 17:48:15.032283 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.032259 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="397ba0f6-e575-4e5b-926f-be25f031cbb8" containerName="prometheus"
Apr 23 17:48:15.032283 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.032265 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="397ba0f6-e575-4e5b-926f-be25f031cbb8" containerName="kube-rbac-proxy-web"
Apr 23 17:48:15.032283 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.032271 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="397ba0f6-e575-4e5b-926f-be25f031cbb8" containerName="kube-rbac-proxy-web"
Apr 23 17:48:15.032283 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.032278 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="397ba0f6-e575-4e5b-926f-be25f031cbb8" containerName="config-reloader"
Apr 23 17:48:15.032283 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.032283 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="397ba0f6-e575-4e5b-926f-be25f031cbb8" containerName="config-reloader"
Apr 23 17:48:15.032470 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.032291 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8560822-ef3b-47a7-b587-297593afaeeb" containerName="registry"
Apr 23 17:48:15.032470 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.032296 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8560822-ef3b-47a7-b587-297593afaeeb" containerName="registry"
Apr 23 17:48:15.032470 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.032303 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="397ba0f6-e575-4e5b-926f-be25f031cbb8" containerName="init-config-reloader"
Apr 23 17:48:15.032470 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.032308 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="397ba0f6-e575-4e5b-926f-be25f031cbb8" containerName="init-config-reloader"
Apr 23 17:48:15.032470 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.032314 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8560822-ef3b-47a7-b587-297593afaeeb" containerName="registry"
Apr 23 17:48:15.032470 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.032319 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8560822-ef3b-47a7-b587-297593afaeeb" containerName="registry"
Apr 23 17:48:15.032470 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.032328 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d6ac3c44-9ac1-4a86-89cc-81579d2d7746" containerName="console"
Apr 23 17:48:15.032470 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.032333 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6ac3c44-9ac1-4a86-89cc-81579d2d7746" containerName="console"
Apr 23 17:48:15.032470 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.032340 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="397ba0f6-e575-4e5b-926f-be25f031cbb8" containerName="kube-rbac-proxy"
Apr 23 17:48:15.032470 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.032344 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="397ba0f6-e575-4e5b-926f-be25f031cbb8" containerName="kube-rbac-proxy"
Apr 23 17:48:15.032470 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.032349 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8560822-ef3b-47a7-b587-297593afaeeb" containerName="registry"
Apr 23 17:48:15.032470 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.032354 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8560822-ef3b-47a7-b587-297593afaeeb" containerName="registry"
Apr 23 17:48:15.032470 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.032359 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="397ba0f6-e575-4e5b-926f-be25f031cbb8" containerName="thanos-sidecar"
Apr 23 17:48:15.032470 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.032363 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="397ba0f6-e575-4e5b-926f-be25f031cbb8" containerName="thanos-sidecar"
Apr 23 17:48:15.032470 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.032369 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8560822-ef3b-47a7-b587-297593afaeeb" containerName="registry"
Apr 23 17:48:15.032470 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.032373 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8560822-ef3b-47a7-b587-297593afaeeb" containerName="registry"
Apr 23 17:48:15.032470 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.032412 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="397ba0f6-e575-4e5b-926f-be25f031cbb8" containerName="kube-rbac-proxy"
Apr 23 17:48:15.032470 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.032418 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="a8560822-ef3b-47a7-b587-297593afaeeb" containerName="registry"
Apr 23 17:48:15.032470 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.032425 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="397ba0f6-e575-4e5b-926f-be25f031cbb8" containerName="config-reloader"
Apr 23 17:48:15.032470 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.032432 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="397ba0f6-e575-4e5b-926f-be25f031cbb8" containerName="prometheus"
Apr 23 17:48:15.032470 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.032437 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="397ba0f6-e575-4e5b-926f-be25f031cbb8" containerName="thanos-sidecar"
Apr 23 17:48:15.032470 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.032442 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="397ba0f6-e575-4e5b-926f-be25f031cbb8" containerName="kube-rbac-proxy-web"
Apr 23 17:48:15.032470 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.032448 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="397ba0f6-e575-4e5b-926f-be25f031cbb8" containerName="kube-rbac-proxy-thanos"
Apr 23 17:48:15.032470 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.032453 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="d6ac3c44-9ac1-4a86-89cc-81579d2d7746" containerName="console"
Apr 23 17:48:15.032470 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.032459 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="a8560822-ef3b-47a7-b587-297593afaeeb" containerName="registry"
Apr 23 17:48:15.032470 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.032464 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="a8560822-ef3b-47a7-b587-297593afaeeb" containerName="registry"
Apr 23 17:48:15.033177 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.032539 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="a8560822-ef3b-47a7-b587-297593afaeeb" containerName="registry"
Apr 23 17:48:15.037367 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.037352 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:48:15.040516 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.040499 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 23 17:48:15.040609 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.040500 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 23 17:48:15.043664 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.043644 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 23 17:48:15.044077 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.044059 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 23 17:48:15.044923 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.044903 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-n5blf\""
Apr 23 17:48:15.045022 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.044912 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 23
17:48:15.045389 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.045372 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 23 17:48:15.046134 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.045591 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 23 17:48:15.046134 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.045615 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/753ac946-1a00-4695-be26-362b5112d541-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.046134 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.045651 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/753ac946-1a00-4695-be26-362b5112d541-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.046134 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.045675 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 23 17:48:15.046134 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.045684 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/753ac946-1a00-4695-be26-362b5112d541-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.046134 ip-10-0-143-218 
kubenswrapper[2581]: I0423 17:48:15.045713 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/753ac946-1a00-4695-be26-362b5112d541-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.046134 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.045738 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/753ac946-1a00-4695-be26-362b5112d541-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.046134 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.045784 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 23 17:48:15.046134 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.045823 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/753ac946-1a00-4695-be26-362b5112d541-config-out\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.046134 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.045849 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/753ac946-1a00-4695-be26-362b5112d541-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.046134 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.045878 2581 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/753ac946-1a00-4695-be26-362b5112d541-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.046134 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.045903 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/753ac946-1a00-4695-be26-362b5112d541-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.046134 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.045879 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 23 17:48:15.046134 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.045935 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/753ac946-1a00-4695-be26-362b5112d541-config\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.046134 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.045974 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/753ac946-1a00-4695-be26-362b5112d541-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.046134 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.046029 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/753ac946-1a00-4695-be26-362b5112d541-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.046134 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.046051 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-fs23asiu67t61\"" Apr 23 17:48:15.046134 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.046054 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/753ac946-1a00-4695-be26-362b5112d541-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.046134 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.046093 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp8qg\" (UniqueName: \"kubernetes.io/projected/753ac946-1a00-4695-be26-362b5112d541-kube-api-access-cp8qg\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.046890 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.046122 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/753ac946-1a00-4695-be26-362b5112d541-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.046890 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.046163 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/753ac946-1a00-4695-be26-362b5112d541-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.046890 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.046187 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/753ac946-1a00-4695-be26-362b5112d541-web-config\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.046890 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.046223 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/753ac946-1a00-4695-be26-362b5112d541-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.046890 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.046375 2581 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/397ba0f6-e575-4e5b-926f-be25f031cbb8-prometheus-trusted-ca-bundle\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 17:48:15.046890 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.046403 2581 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-secret-kube-rbac-proxy\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 17:48:15.046890 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.046420 2581 reconciler_common.go:299] "Volume detached for volume 
\"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/397ba0f6-e575-4e5b-926f-be25f031cbb8-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 17:48:15.046890 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.046435 2581 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-config\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 17:48:15.046890 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.046450 2581 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-secret-grpc-tls\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 17:48:15.046890 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.046468 2581 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/397ba0f6-e575-4e5b-926f-be25f031cbb8-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 17:48:15.046890 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.046488 2581 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/397ba0f6-e575-4e5b-926f-be25f031cbb8-prometheus-k8s-db\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 17:48:15.046890 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.046503 2581 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 17:48:15.046890 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.046508 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 23 17:48:15.046890 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.046517 2581 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/397ba0f6-e575-4e5b-926f-be25f031cbb8-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 17:48:15.046890 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.046533 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k8wtl\" (UniqueName: \"kubernetes.io/projected/397ba0f6-e575-4e5b-926f-be25f031cbb8-kube-api-access-k8wtl\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 17:48:15.046890 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.046555 2581 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 17:48:15.046890 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.046572 2581 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-secret-prometheus-k8s-tls\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 17:48:15.046890 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.046587 2581 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-web-config\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 17:48:15.046890 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.046608 2581 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/397ba0f6-e575-4e5b-926f-be25f031cbb8-configmap-metrics-client-ca\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 17:48:15.046890 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.046622 2581 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/397ba0f6-e575-4e5b-926f-be25f031cbb8-tls-assets\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 17:48:15.046890 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.046636 2581 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/397ba0f6-e575-4e5b-926f-be25f031cbb8-config-out\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 17:48:15.047495 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.046650 2581 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-secret-metrics-client-certs\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 17:48:15.047495 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.046664 2581 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/397ba0f6-e575-4e5b-926f-be25f031cbb8-thanos-prometheus-http-client-file\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 17:48:15.047495 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.047358 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 23 17:48:15.051295 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.051273 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 23 17:48:15.055555 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.055537 2581 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 17:48:15.147614 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.147590 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/753ac946-1a00-4695-be26-362b5112d541-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.147770 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.147633 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/753ac946-1a00-4695-be26-362b5112d541-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.147770 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.147651 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/753ac946-1a00-4695-be26-362b5112d541-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.147855 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.147784 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/753ac946-1a00-4695-be26-362b5112d541-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.147855 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.147815 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/753ac946-1a00-4695-be26-362b5112d541-configmap-metrics-client-ca\") 
pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.147855 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.147833 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/753ac946-1a00-4695-be26-362b5112d541-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.148036 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.147859 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/753ac946-1a00-4695-be26-362b5112d541-config-out\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.148036 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.147901 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/753ac946-1a00-4695-be26-362b5112d541-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.148240 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.148212 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/753ac946-1a00-4695-be26-362b5112d541-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.148355 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.148252 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/753ac946-1a00-4695-be26-362b5112d541-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.148355 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.148285 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/753ac946-1a00-4695-be26-362b5112d541-config\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.148355 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.148284 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/753ac946-1a00-4695-be26-362b5112d541-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.148355 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.148324 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/753ac946-1a00-4695-be26-362b5112d541-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.148355 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.148352 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/753ac946-1a00-4695-be26-362b5112d541-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.148606 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.148380 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/753ac946-1a00-4695-be26-362b5112d541-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.148606 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.148405 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cp8qg\" (UniqueName: \"kubernetes.io/projected/753ac946-1a00-4695-be26-362b5112d541-kube-api-access-cp8qg\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.148606 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.148435 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/753ac946-1a00-4695-be26-362b5112d541-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.148606 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.148472 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/753ac946-1a00-4695-be26-362b5112d541-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.148606 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.148500 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/753ac946-1a00-4695-be26-362b5112d541-web-config\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.148858 ip-10-0-143-218 kubenswrapper[2581]: 
I0423 17:48:15.148671 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/753ac946-1a00-4695-be26-362b5112d541-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.148858 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.148704 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/753ac946-1a00-4695-be26-362b5112d541-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.149553 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.149276 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/753ac946-1a00-4695-be26-362b5112d541-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.151327 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.150314 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/753ac946-1a00-4695-be26-362b5112d541-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.151327 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.151190 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/753ac946-1a00-4695-be26-362b5112d541-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.151599 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.151454 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/753ac946-1a00-4695-be26-362b5112d541-web-config\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.151935 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.151894 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/753ac946-1a00-4695-be26-362b5112d541-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.152614 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.152223 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/753ac946-1a00-4695-be26-362b5112d541-config-out\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.152614 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.152397 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/753ac946-1a00-4695-be26-362b5112d541-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.152614 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.152453 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/753ac946-1a00-4695-be26-362b5112d541-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.152811 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.152618 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/753ac946-1a00-4695-be26-362b5112d541-config\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.153289 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.153269 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/753ac946-1a00-4695-be26-362b5112d541-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.153289 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.153284 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/753ac946-1a00-4695-be26-362b5112d541-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.153577 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.153560 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/753ac946-1a00-4695-be26-362b5112d541-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.153745 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.153728 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/753ac946-1a00-4695-be26-362b5112d541-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: 
\"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.154034 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.154017 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/753ac946-1a00-4695-be26-362b5112d541-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.159184 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.159166 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp8qg\" (UniqueName: \"kubernetes.io/projected/753ac946-1a00-4695-be26-362b5112d541-kube-api-access-cp8qg\") pod \"prometheus-k8s-0\" (UID: \"753ac946-1a00-4695-be26-362b5112d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.347705 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.347666 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:15.479949 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.479922 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 17:48:15.482298 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:48:15.482258 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod753ac946_1a00_4695_be26_362b5112d541.slice/crio-d73e00a25d5287d333617573e2b55d2aaa5d3c9a5bc9f3bc84c681d7e23ad3aa WatchSource:0}: Error finding container d73e00a25d5287d333617573e2b55d2aaa5d3c9a5bc9f3bc84c681d7e23ad3aa: Status 404 returned error can't find the container with id d73e00a25d5287d333617573e2b55d2aaa5d3c9a5bc9f3bc84c681d7e23ad3aa Apr 23 17:48:15.790667 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.790594 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="397ba0f6-e575-4e5b-926f-be25f031cbb8" path="/var/lib/kubelet/pods/397ba0f6-e575-4e5b-926f-be25f031cbb8/volumes" Apr 23 17:48:15.967658 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.967625 2581 generic.go:358] "Generic (PLEG): container finished" podID="753ac946-1a00-4695-be26-362b5112d541" containerID="9f76d52430b287a831fcbbd2fed26e3f778d34df1122dfd0911bc81fd380dc64" exitCode=0 Apr 23 17:48:15.967825 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.967706 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"753ac946-1a00-4695-be26-362b5112d541","Type":"ContainerDied","Data":"9f76d52430b287a831fcbbd2fed26e3f778d34df1122dfd0911bc81fd380dc64"} Apr 23 17:48:15.967825 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:15.967744 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"753ac946-1a00-4695-be26-362b5112d541","Type":"ContainerStarted","Data":"d73e00a25d5287d333617573e2b55d2aaa5d3c9a5bc9f3bc84c681d7e23ad3aa"} Apr 23 17:48:16.973968 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:16.973927 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"753ac946-1a00-4695-be26-362b5112d541","Type":"ContainerStarted","Data":"e3412b9a673f751257215eb5dbea3df6328e5b794a3d07c73fb947c8f7c07969"} Apr 23 17:48:16.973968 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:16.973968 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"753ac946-1a00-4695-be26-362b5112d541","Type":"ContainerStarted","Data":"b1584d1b845d305211cf998b0fede8e7802a2a69b58d3d34424686783ff75189"} Apr 23 17:48:16.974462 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:16.974005 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"753ac946-1a00-4695-be26-362b5112d541","Type":"ContainerStarted","Data":"bcfd8b823e467c4bdbd23fe1c8280f263f7317f074dcbe6c5e6ee0614bf5d9c0"} Apr 23 17:48:16.974462 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:16.974017 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"753ac946-1a00-4695-be26-362b5112d541","Type":"ContainerStarted","Data":"17934a33587fd50da5c5623da677ce367abb63d046436939f17afac6055fdc81"} Apr 23 17:48:16.974462 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:16.974027 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"753ac946-1a00-4695-be26-362b5112d541","Type":"ContainerStarted","Data":"c4e7ec87e8f3a628606e96f4d1af4200a46fe0c4f46829db6f564b15629b6e3e"} Apr 23 17:48:16.974462 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:16.974038 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"753ac946-1a00-4695-be26-362b5112d541","Type":"ContainerStarted","Data":"e3199461ed939e596a67870613e2931d18c6e62bfc8d5a7e554dbde19cd697a8"} Apr 23 17:48:17.763961 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:17.763909 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.763892752 podStartE2EDuration="2.763892752s" podCreationTimestamp="2026-04-23 17:48:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:48:17.029787681 +0000 UTC m=+415.779229482" watchObservedRunningTime="2026-04-23 17:48:17.763892752 +0000 UTC m=+416.513334561" Apr 23 17:48:17.764809 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:17.764784 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-59db946bf4-rbpjd"] Apr 23 17:48:20.348201 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:20.348147 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:48:42.784123 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:42.784015 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-59db946bf4-rbpjd" podUID="8dafd0a5-3293-442f-b120-e2bffad48143" containerName="console" containerID="cri-o://2ec3020916fed38da36390ee47a9cbcfe458d4e27bf8dc48f7300a2d0f416a35" gracePeriod=15 Apr 23 17:48:43.028581 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:43.028556 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-59db946bf4-rbpjd_8dafd0a5-3293-442f-b120-e2bffad48143/console/0.log" Apr 23 17:48:43.028677 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:43.028615 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-59db946bf4-rbpjd" Apr 23 17:48:43.046857 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:43.046805 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-59db946bf4-rbpjd_8dafd0a5-3293-442f-b120-e2bffad48143/console/0.log" Apr 23 17:48:43.046857 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:43.046841 2581 generic.go:358] "Generic (PLEG): container finished" podID="8dafd0a5-3293-442f-b120-e2bffad48143" containerID="2ec3020916fed38da36390ee47a9cbcfe458d4e27bf8dc48f7300a2d0f416a35" exitCode=2 Apr 23 17:48:43.047030 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:43.046895 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59db946bf4-rbpjd" event={"ID":"8dafd0a5-3293-442f-b120-e2bffad48143","Type":"ContainerDied","Data":"2ec3020916fed38da36390ee47a9cbcfe458d4e27bf8dc48f7300a2d0f416a35"} Apr 23 17:48:43.047030 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:43.046919 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59db946bf4-rbpjd" event={"ID":"8dafd0a5-3293-442f-b120-e2bffad48143","Type":"ContainerDied","Data":"486f149ce8dbc8068655185a29f7df0585c499f7eb6d22c8798bb2458ced99b0"} Apr 23 17:48:43.047030 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:43.046919 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-59db946bf4-rbpjd" Apr 23 17:48:43.047030 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:43.046934 2581 scope.go:117] "RemoveContainer" containerID="2ec3020916fed38da36390ee47a9cbcfe458d4e27bf8dc48f7300a2d0f416a35" Apr 23 17:48:43.056797 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:43.056780 2581 scope.go:117] "RemoveContainer" containerID="2ec3020916fed38da36390ee47a9cbcfe458d4e27bf8dc48f7300a2d0f416a35" Apr 23 17:48:43.057100 ip-10-0-143-218 kubenswrapper[2581]: E0423 17:48:43.057072 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ec3020916fed38da36390ee47a9cbcfe458d4e27bf8dc48f7300a2d0f416a35\": container with ID starting with 2ec3020916fed38da36390ee47a9cbcfe458d4e27bf8dc48f7300a2d0f416a35 not found: ID does not exist" containerID="2ec3020916fed38da36390ee47a9cbcfe458d4e27bf8dc48f7300a2d0f416a35" Apr 23 17:48:43.057155 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:43.057101 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ec3020916fed38da36390ee47a9cbcfe458d4e27bf8dc48f7300a2d0f416a35"} err="failed to get container status \"2ec3020916fed38da36390ee47a9cbcfe458d4e27bf8dc48f7300a2d0f416a35\": rpc error: code = NotFound desc = could not find container \"2ec3020916fed38da36390ee47a9cbcfe458d4e27bf8dc48f7300a2d0f416a35\": container with ID starting with 2ec3020916fed38da36390ee47a9cbcfe458d4e27bf8dc48f7300a2d0f416a35 not found: ID does not exist" Apr 23 17:48:43.169678 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:43.169647 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8dafd0a5-3293-442f-b120-e2bffad48143-service-ca\") pod \"8dafd0a5-3293-442f-b120-e2bffad48143\" (UID: \"8dafd0a5-3293-442f-b120-e2bffad48143\") " Apr 23 17:48:43.169678 ip-10-0-143-218 kubenswrapper[2581]: I0423 
17:48:43.169683 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8dafd0a5-3293-442f-b120-e2bffad48143-trusted-ca-bundle\") pod \"8dafd0a5-3293-442f-b120-e2bffad48143\" (UID: \"8dafd0a5-3293-442f-b120-e2bffad48143\") " Apr 23 17:48:43.169893 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:43.169714 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8dafd0a5-3293-442f-b120-e2bffad48143-oauth-serving-cert\") pod \"8dafd0a5-3293-442f-b120-e2bffad48143\" (UID: \"8dafd0a5-3293-442f-b120-e2bffad48143\") " Apr 23 17:48:43.169949 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:43.169904 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8dafd0a5-3293-442f-b120-e2bffad48143-console-config\") pod \"8dafd0a5-3293-442f-b120-e2bffad48143\" (UID: \"8dafd0a5-3293-442f-b120-e2bffad48143\") " Apr 23 17:48:43.170024 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:43.169945 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8dafd0a5-3293-442f-b120-e2bffad48143-console-serving-cert\") pod \"8dafd0a5-3293-442f-b120-e2bffad48143\" (UID: \"8dafd0a5-3293-442f-b120-e2bffad48143\") " Apr 23 17:48:43.170024 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:43.170002 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8dafd0a5-3293-442f-b120-e2bffad48143-console-oauth-config\") pod \"8dafd0a5-3293-442f-b120-e2bffad48143\" (UID: \"8dafd0a5-3293-442f-b120-e2bffad48143\") " Apr 23 17:48:43.170122 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:43.170056 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/8dafd0a5-3293-442f-b120-e2bffad48143-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "8dafd0a5-3293-442f-b120-e2bffad48143" (UID: "8dafd0a5-3293-442f-b120-e2bffad48143"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:48:43.170122 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:43.170067 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxs2l\" (UniqueName: \"kubernetes.io/projected/8dafd0a5-3293-442f-b120-e2bffad48143-kube-api-access-jxs2l\") pod \"8dafd0a5-3293-442f-b120-e2bffad48143\" (UID: \"8dafd0a5-3293-442f-b120-e2bffad48143\") " Apr 23 17:48:43.170233 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:43.170106 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dafd0a5-3293-442f-b120-e2bffad48143-service-ca" (OuterVolumeSpecName: "service-ca") pod "8dafd0a5-3293-442f-b120-e2bffad48143" (UID: "8dafd0a5-3293-442f-b120-e2bffad48143"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:48:43.170233 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:43.170193 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dafd0a5-3293-442f-b120-e2bffad48143-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "8dafd0a5-3293-442f-b120-e2bffad48143" (UID: "8dafd0a5-3293-442f-b120-e2bffad48143"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:48:43.170328 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:43.170304 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dafd0a5-3293-442f-b120-e2bffad48143-console-config" (OuterVolumeSpecName: "console-config") pod "8dafd0a5-3293-442f-b120-e2bffad48143" (UID: "8dafd0a5-3293-442f-b120-e2bffad48143"). 
InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:48:43.170378 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:43.170359 2581 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8dafd0a5-3293-442f-b120-e2bffad48143-service-ca\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 17:48:43.170410 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:43.170389 2581 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8dafd0a5-3293-442f-b120-e2bffad48143-trusted-ca-bundle\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 17:48:43.170410 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:43.170404 2581 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8dafd0a5-3293-442f-b120-e2bffad48143-oauth-serving-cert\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 17:48:43.172219 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:43.172201 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dafd0a5-3293-442f-b120-e2bffad48143-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "8dafd0a5-3293-442f-b120-e2bffad48143" (UID: "8dafd0a5-3293-442f-b120-e2bffad48143"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:48:43.172299 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:43.172279 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dafd0a5-3293-442f-b120-e2bffad48143-kube-api-access-jxs2l" (OuterVolumeSpecName: "kube-api-access-jxs2l") pod "8dafd0a5-3293-442f-b120-e2bffad48143" (UID: "8dafd0a5-3293-442f-b120-e2bffad48143"). InnerVolumeSpecName "kube-api-access-jxs2l". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:48:43.172360 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:43.172323 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dafd0a5-3293-442f-b120-e2bffad48143-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "8dafd0a5-3293-442f-b120-e2bffad48143" (UID: "8dafd0a5-3293-442f-b120-e2bffad48143"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:48:43.271201 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:43.271165 2581 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8dafd0a5-3293-442f-b120-e2bffad48143-console-config\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 17:48:43.271201 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:43.271192 2581 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8dafd0a5-3293-442f-b120-e2bffad48143-console-serving-cert\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 17:48:43.271201 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:43.271203 2581 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8dafd0a5-3293-442f-b120-e2bffad48143-console-oauth-config\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 17:48:43.271201 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:43.271212 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jxs2l\" (UniqueName: \"kubernetes.io/projected/8dafd0a5-3293-442f-b120-e2bffad48143-kube-api-access-jxs2l\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 17:48:43.370434 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:43.370399 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-59db946bf4-rbpjd"] Apr 
23 17:48:43.374261 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:43.374233 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-59db946bf4-rbpjd"] Apr 23 17:48:43.790346 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:48:43.790312 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dafd0a5-3293-442f-b120-e2bffad48143" path="/var/lib/kubelet/pods/8dafd0a5-3293-442f-b120-e2bffad48143/volumes" Apr 23 17:49:15.347958 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:49:15.347919 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:49:15.364117 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:49:15.364093 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:49:16.154428 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:49:16.154402 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:49:47.149452 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:49:47.149417 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fcf4959-mq66w"] Apr 23 17:49:47.150090 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:49:47.149683 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8dafd0a5-3293-442f-b120-e2bffad48143" containerName="console" Apr 23 17:49:47.150090 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:49:47.149693 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dafd0a5-3293-442f-b120-e2bffad48143" containerName="console" Apr 23 17:49:47.150090 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:49:47.149748 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="8dafd0a5-3293-442f-b120-e2bffad48143" containerName="console" Apr 23 17:49:47.152906 ip-10-0-143-218 kubenswrapper[2581]: I0423 
17:49:47.152886 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fcf4959-mq66w" Apr 23 17:49:47.160580 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:49:47.160559 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 23 17:49:47.161075 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:49:47.161055 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 23 17:49:47.161180 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:49:47.161057 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 23 17:49:47.161747 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:49:47.161734 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 23 17:49:47.161930 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:49:47.161911 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 23 17:49:47.162014 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:49:47.161913 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 23 17:49:47.162014 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:49:47.161936 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 23 17:49:47.174348 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:49:47.174324 2581 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fcf4959-mq66w"] Apr 23 17:49:47.276288 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:49:47.276257 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/8352970c-aaa0-4124-a0f9-443e3843c99d-ca\") pod \"cluster-proxy-proxy-agent-f6fcf4959-mq66w\" (UID: \"8352970c-aaa0-4124-a0f9-443e3843c99d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fcf4959-mq66w" Apr 23 17:49:47.276288 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:49:47.276291 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpztg\" (UniqueName: \"kubernetes.io/projected/8352970c-aaa0-4124-a0f9-443e3843c99d-kube-api-access-rpztg\") pod \"cluster-proxy-proxy-agent-f6fcf4959-mq66w\" (UID: \"8352970c-aaa0-4124-a0f9-443e3843c99d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fcf4959-mq66w" Apr 23 17:49:47.276479 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:49:47.276318 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/8352970c-aaa0-4124-a0f9-443e3843c99d-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-f6fcf4959-mq66w\" (UID: \"8352970c-aaa0-4124-a0f9-443e3843c99d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fcf4959-mq66w" Apr 23 17:49:47.276479 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:49:47.276367 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/8352970c-aaa0-4124-a0f9-443e3843c99d-hub\") pod \"cluster-proxy-proxy-agent-f6fcf4959-mq66w\" (UID: \"8352970c-aaa0-4124-a0f9-443e3843c99d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fcf4959-mq66w" 
Apr 23 17:49:47.276479 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:49:47.276413 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/8352970c-aaa0-4124-a0f9-443e3843c99d-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-f6fcf4959-mq66w\" (UID: \"8352970c-aaa0-4124-a0f9-443e3843c99d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fcf4959-mq66w" Apr 23 17:49:47.276479 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:49:47.276458 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8352970c-aaa0-4124-a0f9-443e3843c99d-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-f6fcf4959-mq66w\" (UID: \"8352970c-aaa0-4124-a0f9-443e3843c99d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fcf4959-mq66w" Apr 23 17:49:47.377457 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:49:47.377418 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/8352970c-aaa0-4124-a0f9-443e3843c99d-ca\") pod \"cluster-proxy-proxy-agent-f6fcf4959-mq66w\" (UID: \"8352970c-aaa0-4124-a0f9-443e3843c99d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fcf4959-mq66w" Apr 23 17:49:47.377714 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:49:47.377465 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rpztg\" (UniqueName: \"kubernetes.io/projected/8352970c-aaa0-4124-a0f9-443e3843c99d-kube-api-access-rpztg\") pod \"cluster-proxy-proxy-agent-f6fcf4959-mq66w\" (UID: \"8352970c-aaa0-4124-a0f9-443e3843c99d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fcf4959-mq66w" Apr 23 17:49:47.377714 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:49:47.377510 2581 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/8352970c-aaa0-4124-a0f9-443e3843c99d-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-f6fcf4959-mq66w\" (UID: \"8352970c-aaa0-4124-a0f9-443e3843c99d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fcf4959-mq66w" Apr 23 17:49:47.377714 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:49:47.377531 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/8352970c-aaa0-4124-a0f9-443e3843c99d-hub\") pod \"cluster-proxy-proxy-agent-f6fcf4959-mq66w\" (UID: \"8352970c-aaa0-4124-a0f9-443e3843c99d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fcf4959-mq66w" Apr 23 17:49:47.377714 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:49:47.377558 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/8352970c-aaa0-4124-a0f9-443e3843c99d-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-f6fcf4959-mq66w\" (UID: \"8352970c-aaa0-4124-a0f9-443e3843c99d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fcf4959-mq66w" Apr 23 17:49:47.377714 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:49:47.377615 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8352970c-aaa0-4124-a0f9-443e3843c99d-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-f6fcf4959-mq66w\" (UID: \"8352970c-aaa0-4124-a0f9-443e3843c99d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fcf4959-mq66w" Apr 23 17:49:47.378576 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:49:47.378546 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/8352970c-aaa0-4124-a0f9-443e3843c99d-ocpservice-ca\") pod 
\"cluster-proxy-proxy-agent-f6fcf4959-mq66w\" (UID: \"8352970c-aaa0-4124-a0f9-443e3843c99d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fcf4959-mq66w" Apr 23 17:49:47.380321 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:49:47.380272 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/8352970c-aaa0-4124-a0f9-443e3843c99d-ca\") pod \"cluster-proxy-proxy-agent-f6fcf4959-mq66w\" (UID: \"8352970c-aaa0-4124-a0f9-443e3843c99d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fcf4959-mq66w" Apr 23 17:49:47.380428 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:49:47.380397 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8352970c-aaa0-4124-a0f9-443e3843c99d-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-f6fcf4959-mq66w\" (UID: \"8352970c-aaa0-4124-a0f9-443e3843c99d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fcf4959-mq66w" Apr 23 17:49:47.380478 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:49:47.380457 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/8352970c-aaa0-4124-a0f9-443e3843c99d-hub\") pod \"cluster-proxy-proxy-agent-f6fcf4959-mq66w\" (UID: \"8352970c-aaa0-4124-a0f9-443e3843c99d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fcf4959-mq66w" Apr 23 17:49:47.380573 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:49:47.380552 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/8352970c-aaa0-4124-a0f9-443e3843c99d-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-f6fcf4959-mq66w\" (UID: \"8352970c-aaa0-4124-a0f9-443e3843c99d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fcf4959-mq66w" Apr 23 17:49:47.386350 ip-10-0-143-218 
kubenswrapper[2581]: I0423 17:49:47.386327 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpztg\" (UniqueName: \"kubernetes.io/projected/8352970c-aaa0-4124-a0f9-443e3843c99d-kube-api-access-rpztg\") pod \"cluster-proxy-proxy-agent-f6fcf4959-mq66w\" (UID: \"8352970c-aaa0-4124-a0f9-443e3843c99d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fcf4959-mq66w" Apr 23 17:49:47.476239 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:49:47.476136 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fcf4959-mq66w" Apr 23 17:49:47.597535 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:49:47.597502 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fcf4959-mq66w"] Apr 23 17:49:47.600556 ip-10-0-143-218 kubenswrapper[2581]: W0423 17:49:47.600528 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8352970c_aaa0_4124_a0f9_443e3843c99d.slice/crio-01036a4a1df86167d16cc879bce2d4cb3bc2f721c5c7b64a21fcf3c84279eb96 WatchSource:0}: Error finding container 01036a4a1df86167d16cc879bce2d4cb3bc2f721c5c7b64a21fcf3c84279eb96: Status 404 returned error can't find the container with id 01036a4a1df86167d16cc879bce2d4cb3bc2f721c5c7b64a21fcf3c84279eb96 Apr 23 17:49:48.225014 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:49:48.224959 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fcf4959-mq66w" event={"ID":"8352970c-aaa0-4124-a0f9-443e3843c99d","Type":"ContainerStarted","Data":"01036a4a1df86167d16cc879bce2d4cb3bc2f721c5c7b64a21fcf3c84279eb96"} Apr 23 17:49:51.237074 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:49:51.237035 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fcf4959-mq66w" event={"ID":"8352970c-aaa0-4124-a0f9-443e3843c99d","Type":"ContainerStarted","Data":"845ea48dfec24c060d2b26f50cebb4d55de10ffbbf3a244a900a70a0a6ccb06c"}
Apr 23 17:49:52.241013 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:49:52.240967 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fcf4959-mq66w" event={"ID":"8352970c-aaa0-4124-a0f9-443e3843c99d","Type":"ContainerStarted","Data":"e29824e232a04472ffeaadd2b6f8bee6516722bd8e2823f943563ccb97418c62"}
Apr 23 17:49:53.245482 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:49:53.245445 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fcf4959-mq66w" event={"ID":"8352970c-aaa0-4124-a0f9-443e3843c99d","Type":"ContainerStarted","Data":"fea3e537a2b0af85d3b74e88e7570735d95d5be0cb07c35a2fa20fc458978f6b"}
Apr 23 17:49:53.273518 ip-10-0-143-218 kubenswrapper[2581]: I0423 17:49:53.273467 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fcf4959-mq66w" podStartSLOduration=1.7267237560000002 podStartE2EDuration="6.273452427s" podCreationTimestamp="2026-04-23 17:49:47 +0000 UTC" firstStartedPulling="2026-04-23 17:49:47.602278042 +0000 UTC m=+506.351719823" lastFinishedPulling="2026-04-23 17:49:52.149006706 +0000 UTC m=+510.898448494" observedRunningTime="2026-04-23 17:49:53.270340034 +0000 UTC m=+512.019781835" watchObservedRunningTime="2026-04-23 17:49:53.273452427 +0000 UTC m=+512.022894227"
Apr 23 18:17:23.257370 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:17:23.257291 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-7bc7774745-77s29"]
Apr 23 18:17:23.260394 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:17:23.260378 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7bc7774745-77s29"
Apr 23 18:17:23.261202 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:17:23.261176 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddw8c\" (UniqueName: \"kubernetes.io/projected/942a95ab-2e00-4c6a-b0df-4c42d578f689-kube-api-access-ddw8c\") pod \"lws-controller-manager-7bc7774745-77s29\" (UID: \"942a95ab-2e00-4c6a-b0df-4c42d578f689\") " pod="openshift-lws-operator/lws-controller-manager-7bc7774745-77s29"
Apr 23 18:17:23.261351 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:17:23.261216 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/942a95ab-2e00-4c6a-b0df-4c42d578f689-metrics-cert\") pod \"lws-controller-manager-7bc7774745-77s29\" (UID: \"942a95ab-2e00-4c6a-b0df-4c42d578f689\") " pod="openshift-lws-operator/lws-controller-manager-7bc7774745-77s29"
Apr 23 18:17:23.261351 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:17:23.261236 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/942a95ab-2e00-4c6a-b0df-4c42d578f689-manager-config\") pod \"lws-controller-manager-7bc7774745-77s29\" (UID: \"942a95ab-2e00-4c6a-b0df-4c42d578f689\") " pod="openshift-lws-operator/lws-controller-manager-7bc7774745-77s29"
Apr 23 18:17:23.261351 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:17:23.261320 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/942a95ab-2e00-4c6a-b0df-4c42d578f689-cert\") pod \"lws-controller-manager-7bc7774745-77s29\" (UID: \"942a95ab-2e00-4c6a-b0df-4c42d578f689\") " pod="openshift-lws-operator/lws-controller-manager-7bc7774745-77s29"
Apr 23 18:17:23.263545 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:17:23.263529 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 23 18:17:23.263697 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:17:23.263677 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-xlk9l\""
Apr 23 18:17:23.264995 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:17:23.264968 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 23 18:17:23.265121 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:17:23.265061 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 23 18:17:23.265121 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:17:23.265077 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 23 18:17:23.265314 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:17:23.265295 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 23 18:17:23.273959 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:17:23.273936 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7bc7774745-77s29"]
Apr 23 18:17:23.362824 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:17:23.362786 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ddw8c\" (UniqueName: \"kubernetes.io/projected/942a95ab-2e00-4c6a-b0df-4c42d578f689-kube-api-access-ddw8c\") pod \"lws-controller-manager-7bc7774745-77s29\" (UID: \"942a95ab-2e00-4c6a-b0df-4c42d578f689\") " pod="openshift-lws-operator/lws-controller-manager-7bc7774745-77s29"
Apr 23 18:17:23.363031 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:17:23.362852 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/942a95ab-2e00-4c6a-b0df-4c42d578f689-metrics-cert\") pod \"lws-controller-manager-7bc7774745-77s29\" (UID: \"942a95ab-2e00-4c6a-b0df-4c42d578f689\") " pod="openshift-lws-operator/lws-controller-manager-7bc7774745-77s29"
Apr 23 18:17:23.363031 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:17:23.362891 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/942a95ab-2e00-4c6a-b0df-4c42d578f689-manager-config\") pod \"lws-controller-manager-7bc7774745-77s29\" (UID: \"942a95ab-2e00-4c6a-b0df-4c42d578f689\") " pod="openshift-lws-operator/lws-controller-manager-7bc7774745-77s29"
Apr 23 18:17:23.363031 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:17:23.362933 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/942a95ab-2e00-4c6a-b0df-4c42d578f689-cert\") pod \"lws-controller-manager-7bc7774745-77s29\" (UID: \"942a95ab-2e00-4c6a-b0df-4c42d578f689\") " pod="openshift-lws-operator/lws-controller-manager-7bc7774745-77s29"
Apr 23 18:17:23.365774 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:17:23.365736 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/942a95ab-2e00-4c6a-b0df-4c42d578f689-manager-config\") pod \"lws-controller-manager-7bc7774745-77s29\" (UID: \"942a95ab-2e00-4c6a-b0df-4c42d578f689\") " pod="openshift-lws-operator/lws-controller-manager-7bc7774745-77s29"
Apr 23 18:17:23.367214 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:17:23.367189 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/942a95ab-2e00-4c6a-b0df-4c42d578f689-cert\") pod \"lws-controller-manager-7bc7774745-77s29\" (UID: \"942a95ab-2e00-4c6a-b0df-4c42d578f689\") " pod="openshift-lws-operator/lws-controller-manager-7bc7774745-77s29"
Apr 23 18:17:23.367350 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:17:23.367249 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/942a95ab-2e00-4c6a-b0df-4c42d578f689-metrics-cert\") pod \"lws-controller-manager-7bc7774745-77s29\" (UID: \"942a95ab-2e00-4c6a-b0df-4c42d578f689\") " pod="openshift-lws-operator/lws-controller-manager-7bc7774745-77s29"
Apr 23 18:17:23.372644 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:17:23.372623 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddw8c\" (UniqueName: \"kubernetes.io/projected/942a95ab-2e00-4c6a-b0df-4c42d578f689-kube-api-access-ddw8c\") pod \"lws-controller-manager-7bc7774745-77s29\" (UID: \"942a95ab-2e00-4c6a-b0df-4c42d578f689\") " pod="openshift-lws-operator/lws-controller-manager-7bc7774745-77s29"
Apr 23 18:17:23.570032 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:17:23.569937 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7bc7774745-77s29"
Apr 23 18:17:23.695409 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:17:23.695385 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7bc7774745-77s29"]
Apr 23 18:17:23.698078 ip-10-0-143-218 kubenswrapper[2581]: W0423 18:17:23.698039 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod942a95ab_2e00_4c6a_b0df_4c42d578f689.slice/crio-04cdbcdc3f0d39bae5d5dc84adf665762505ed94dc05bbd1f1332de35d0e780f WatchSource:0}: Error finding container 04cdbcdc3f0d39bae5d5dc84adf665762505ed94dc05bbd1f1332de35d0e780f: Status 404 returned error can't find the container with id 04cdbcdc3f0d39bae5d5dc84adf665762505ed94dc05bbd1f1332de35d0e780f
Apr 23 18:17:23.699769 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:17:23.699749 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 18:17:23.723578 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:17:23.723548 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7bc7774745-77s29" event={"ID":"942a95ab-2e00-4c6a-b0df-4c42d578f689","Type":"ContainerStarted","Data":"04cdbcdc3f0d39bae5d5dc84adf665762505ed94dc05bbd1f1332de35d0e780f"}
Apr 23 18:17:26.732687 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:17:26.732650 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7bc7774745-77s29" event={"ID":"942a95ab-2e00-4c6a-b0df-4c42d578f689","Type":"ContainerStarted","Data":"e9ece03de4c6c82117c2cbed73649b8a96a5559da292e8cee9985febef080497"}
Apr 23 18:17:26.733177 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:17:26.732700 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-7bc7774745-77s29"
Apr 23 18:17:26.752615 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:17:26.752567 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-7bc7774745-77s29" podStartSLOduration=1.3601451820000001 podStartE2EDuration="3.752553923s" podCreationTimestamp="2026-04-23 18:17:23 +0000 UTC" firstStartedPulling="2026-04-23 18:17:23.699876934 +0000 UTC m=+2162.449318713" lastFinishedPulling="2026-04-23 18:17:26.092285674 +0000 UTC m=+2164.841727454" observedRunningTime="2026-04-23 18:17:26.751285417 +0000 UTC m=+2165.500727239" watchObservedRunningTime="2026-04-23 18:17:26.752553923 +0000 UTC m=+2165.501995723"
Apr 23 18:17:37.737975 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:17:37.737942 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-7bc7774745-77s29"
Apr 23 18:18:10.656411 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:18:10.656376 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-s54pc"]
Apr 23 18:18:10.659527 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:18:10.659506 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-s54pc"
Apr 23 18:18:10.662638 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:18:10.662600 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 23 18:18:10.662798 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:18:10.662776 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-qr7l9\""
Apr 23 18:18:10.662852 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:18:10.662790 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 23 18:18:10.669886 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:18:10.669867 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-s54pc"]
Apr 23 18:18:10.754534 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:18:10.754497 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxdmk\" (UniqueName: \"kubernetes.io/projected/9c520546-a736-40a3-8afe-578967729121-kube-api-access-mxdmk\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-s54pc\" (UID: \"9c520546-a736-40a3-8afe-578967729121\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-s54pc"
Apr 23 18:18:10.754689 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:18:10.754548 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9c520546-a736-40a3-8afe-578967729121-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-s54pc\" (UID: \"9c520546-a736-40a3-8afe-578967729121\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-s54pc"
Apr 23 18:18:10.855190 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:18:10.855155 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxdmk\" (UniqueName: \"kubernetes.io/projected/9c520546-a736-40a3-8afe-578967729121-kube-api-access-mxdmk\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-s54pc\" (UID: \"9c520546-a736-40a3-8afe-578967729121\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-s54pc"
Apr 23 18:18:10.855354 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:18:10.855197 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9c520546-a736-40a3-8afe-578967729121-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-s54pc\" (UID: \"9c520546-a736-40a3-8afe-578967729121\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-s54pc"
Apr 23 18:18:10.855578 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:18:10.855560 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9c520546-a736-40a3-8afe-578967729121-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-s54pc\" (UID: \"9c520546-a736-40a3-8afe-578967729121\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-s54pc"
Apr 23 18:18:10.865318 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:18:10.865290 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxdmk\" (UniqueName: \"kubernetes.io/projected/9c520546-a736-40a3-8afe-578967729121-kube-api-access-mxdmk\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-s54pc\" (UID: \"9c520546-a736-40a3-8afe-578967729121\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-s54pc"
Apr 23 18:18:10.970445 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:18:10.970360 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-s54pc"
Apr 23 18:18:11.128359 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:18:11.128324 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-s54pc"]
Apr 23 18:18:11.131573 ip-10-0-143-218 kubenswrapper[2581]: W0423 18:18:11.131547 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c520546_a736_40a3_8afe_578967729121.slice/crio-77a026ace63419ac68b2e8ecf871a6e8d40e2b72f46dadc5c215c6320d2fa25b WatchSource:0}: Error finding container 77a026ace63419ac68b2e8ecf871a6e8d40e2b72f46dadc5c215c6320d2fa25b: Status 404 returned error can't find the container with id 77a026ace63419ac68b2e8ecf871a6e8d40e2b72f46dadc5c215c6320d2fa25b
Apr 23 18:18:11.859540 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:18:11.859486 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-s54pc" event={"ID":"9c520546-a736-40a3-8afe-578967729121","Type":"ContainerStarted","Data":"77a026ace63419ac68b2e8ecf871a6e8d40e2b72f46dadc5c215c6320d2fa25b"}
Apr 23 18:18:15.874878 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:18:15.874841 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-s54pc" event={"ID":"9c520546-a736-40a3-8afe-578967729121","Type":"ContainerStarted","Data":"d4a96aaeb5beac00cd40af7fe68ed5d6708e631d286bc35d4bb2076c11fdca7e"}
Apr 23 18:18:15.875338 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:18:15.874943 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-s54pc"
Apr 23 18:18:15.898730 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:18:15.898682 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-s54pc" podStartSLOduration=1.8212192759999999 podStartE2EDuration="5.898668782s" podCreationTimestamp="2026-04-23 18:18:10 +0000 UTC" firstStartedPulling="2026-04-23 18:18:11.13400367 +0000 UTC m=+2209.883445464" lastFinishedPulling="2026-04-23 18:18:15.211453187 +0000 UTC m=+2213.960894970" observedRunningTime="2026-04-23 18:18:15.897218283 +0000 UTC m=+2214.646660085" watchObservedRunningTime="2026-04-23 18:18:15.898668782 +0000 UTC m=+2214.648110584"
Apr 23 18:18:26.880449 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:18:26.880412 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-s54pc"
Apr 23 18:19:01.288492 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:01.288456 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-lhq28"]
Apr 23 18:19:01.291792 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:01.291771 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-lhq28"
Apr 23 18:19:01.294586 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:01.294556 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-m4rdk\""
Apr 23 18:19:01.294688 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:01.294560 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 23 18:19:01.302213 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:01.302191 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-lhq28"]
Apr 23 18:19:01.360003 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:01.359955 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/fad9dbd1-783a-41d8-9ca4-1c23af66451b-config-file\") pod \"limitador-limitador-64c8f475fb-lhq28\" (UID: \"fad9dbd1-783a-41d8-9ca4-1c23af66451b\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-lhq28"
Apr 23 18:19:01.360162 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:01.360067 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8vsp\" (UniqueName: \"kubernetes.io/projected/fad9dbd1-783a-41d8-9ca4-1c23af66451b-kube-api-access-x8vsp\") pod \"limitador-limitador-64c8f475fb-lhq28\" (UID: \"fad9dbd1-783a-41d8-9ca4-1c23af66451b\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-lhq28"
Apr 23 18:19:01.380380 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:01.380354 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-lhq28"]
Apr 23 18:19:01.461103 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:01.461068 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/fad9dbd1-783a-41d8-9ca4-1c23af66451b-config-file\") pod \"limitador-limitador-64c8f475fb-lhq28\" (UID: \"fad9dbd1-783a-41d8-9ca4-1c23af66451b\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-lhq28"
Apr 23 18:19:01.461278 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:01.461128 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x8vsp\" (UniqueName: \"kubernetes.io/projected/fad9dbd1-783a-41d8-9ca4-1c23af66451b-kube-api-access-x8vsp\") pod \"limitador-limitador-64c8f475fb-lhq28\" (UID: \"fad9dbd1-783a-41d8-9ca4-1c23af66451b\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-lhq28"
Apr 23 18:19:01.461692 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:01.461674 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/fad9dbd1-783a-41d8-9ca4-1c23af66451b-config-file\") pod \"limitador-limitador-64c8f475fb-lhq28\" (UID: \"fad9dbd1-783a-41d8-9ca4-1c23af66451b\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-lhq28"
Apr 23 18:19:01.470572 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:01.470545 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8vsp\" (UniqueName: \"kubernetes.io/projected/fad9dbd1-783a-41d8-9ca4-1c23af66451b-kube-api-access-x8vsp\") pod \"limitador-limitador-64c8f475fb-lhq28\" (UID: \"fad9dbd1-783a-41d8-9ca4-1c23af66451b\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-lhq28"
Apr 23 18:19:01.602626 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:01.602552 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-lhq28"
Apr 23 18:19:01.734960 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:01.734928 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-lhq28"]
Apr 23 18:19:01.738287 ip-10-0-143-218 kubenswrapper[2581]: W0423 18:19:01.738261 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfad9dbd1_783a_41d8_9ca4_1c23af66451b.slice/crio-db4f613d837392c4287260971f99f9396c8ad29dfa12008989dd41f217d76f58 WatchSource:0}: Error finding container db4f613d837392c4287260971f99f9396c8ad29dfa12008989dd41f217d76f58: Status 404 returned error can't find the container with id db4f613d837392c4287260971f99f9396c8ad29dfa12008989dd41f217d76f58
Apr 23 18:19:02.011483 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:02.011444 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-lhq28" event={"ID":"fad9dbd1-783a-41d8-9ca4-1c23af66451b","Type":"ContainerStarted","Data":"db4f613d837392c4287260971f99f9396c8ad29dfa12008989dd41f217d76f58"}
Apr 23 18:19:06.025314 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:06.025277 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-lhq28" event={"ID":"fad9dbd1-783a-41d8-9ca4-1c23af66451b","Type":"ContainerStarted","Data":"af9c076350bd5c371ae8b7a95b0e7591776921f0434815e1c95433e2d1a5778c"}
Apr 23 18:19:06.025715 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:06.025396 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-lhq28"
Apr 23 18:19:06.044748 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:06.044707 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-64c8f475fb-lhq28" podStartSLOduration=1.681388771 podStartE2EDuration="5.044695261s" podCreationTimestamp="2026-04-23 18:19:01 +0000 UTC" firstStartedPulling="2026-04-23 18:19:01.7401044 +0000 UTC m=+2260.489546178" lastFinishedPulling="2026-04-23 18:19:05.103410888 +0000 UTC m=+2263.852852668" observedRunningTime="2026-04-23 18:19:06.042918863 +0000 UTC m=+2264.792360889" watchObservedRunningTime="2026-04-23 18:19:06.044695261 +0000 UTC m=+2264.794137062"
Apr 23 18:19:14.855407 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:14.855367 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-lhq28"]
Apr 23 18:19:14.855937 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:14.855613 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-64c8f475fb-lhq28" podUID="fad9dbd1-783a-41d8-9ca4-1c23af66451b" containerName="limitador" containerID="cri-o://af9c076350bd5c371ae8b7a95b0e7591776921f0434815e1c95433e2d1a5778c" gracePeriod=30
Apr 23 18:19:14.856292 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:14.856271 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-lhq28"
Apr 23 18:19:15.383798 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:15.383759 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-lhq28"
Apr 23 18:19:15.488424 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:15.488332 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/fad9dbd1-783a-41d8-9ca4-1c23af66451b-config-file\") pod \"fad9dbd1-783a-41d8-9ca4-1c23af66451b\" (UID: \"fad9dbd1-783a-41d8-9ca4-1c23af66451b\") "
Apr 23 18:19:15.488424 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:15.488408 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8vsp\" (UniqueName: \"kubernetes.io/projected/fad9dbd1-783a-41d8-9ca4-1c23af66451b-kube-api-access-x8vsp\") pod \"fad9dbd1-783a-41d8-9ca4-1c23af66451b\" (UID: \"fad9dbd1-783a-41d8-9ca4-1c23af66451b\") "
Apr 23 18:19:15.488744 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:15.488720 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fad9dbd1-783a-41d8-9ca4-1c23af66451b-config-file" (OuterVolumeSpecName: "config-file") pod "fad9dbd1-783a-41d8-9ca4-1c23af66451b" (UID: "fad9dbd1-783a-41d8-9ca4-1c23af66451b"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:19:15.490782 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:15.490751 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fad9dbd1-783a-41d8-9ca4-1c23af66451b-kube-api-access-x8vsp" (OuterVolumeSpecName: "kube-api-access-x8vsp") pod "fad9dbd1-783a-41d8-9ca4-1c23af66451b" (UID: "fad9dbd1-783a-41d8-9ca4-1c23af66451b"). InnerVolumeSpecName "kube-api-access-x8vsp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:19:15.589124 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:15.589091 2581 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/fad9dbd1-783a-41d8-9ca4-1c23af66451b-config-file\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\""
Apr 23 18:19:15.589124 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:15.589119 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x8vsp\" (UniqueName: \"kubernetes.io/projected/fad9dbd1-783a-41d8-9ca4-1c23af66451b-kube-api-access-x8vsp\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\""
Apr 23 18:19:16.057871 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:16.057784 2581 generic.go:358] "Generic (PLEG): container finished" podID="fad9dbd1-783a-41d8-9ca4-1c23af66451b" containerID="af9c076350bd5c371ae8b7a95b0e7591776921f0434815e1c95433e2d1a5778c" exitCode=0
Apr 23 18:19:16.057871 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:16.057866 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-lhq28"
Apr 23 18:19:16.058338 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:16.057861 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-lhq28" event={"ID":"fad9dbd1-783a-41d8-9ca4-1c23af66451b","Type":"ContainerDied","Data":"af9c076350bd5c371ae8b7a95b0e7591776921f0434815e1c95433e2d1a5778c"}
Apr 23 18:19:16.058338 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:16.057969 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-lhq28" event={"ID":"fad9dbd1-783a-41d8-9ca4-1c23af66451b","Type":"ContainerDied","Data":"db4f613d837392c4287260971f99f9396c8ad29dfa12008989dd41f217d76f58"}
Apr 23 18:19:16.058338 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:16.058012 2581 scope.go:117] "RemoveContainer" containerID="af9c076350bd5c371ae8b7a95b0e7591776921f0434815e1c95433e2d1a5778c"
Apr 23 18:19:16.066016 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:16.065977 2581 scope.go:117] "RemoveContainer" containerID="af9c076350bd5c371ae8b7a95b0e7591776921f0434815e1c95433e2d1a5778c"
Apr 23 18:19:16.066304 ip-10-0-143-218 kubenswrapper[2581]: E0423 18:19:16.066285 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af9c076350bd5c371ae8b7a95b0e7591776921f0434815e1c95433e2d1a5778c\": container with ID starting with af9c076350bd5c371ae8b7a95b0e7591776921f0434815e1c95433e2d1a5778c not found: ID does not exist" containerID="af9c076350bd5c371ae8b7a95b0e7591776921f0434815e1c95433e2d1a5778c"
Apr 23 18:19:16.066355 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:16.066317 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af9c076350bd5c371ae8b7a95b0e7591776921f0434815e1c95433e2d1a5778c"} err="failed to get container status \"af9c076350bd5c371ae8b7a95b0e7591776921f0434815e1c95433e2d1a5778c\": rpc error: code = NotFound desc = could not find container \"af9c076350bd5c371ae8b7a95b0e7591776921f0434815e1c95433e2d1a5778c\": container with ID starting with af9c076350bd5c371ae8b7a95b0e7591776921f0434815e1c95433e2d1a5778c not found: ID does not exist"
Apr 23 18:19:16.077605 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:16.077578 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-lhq28"]
Apr 23 18:19:16.081338 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:16.081317 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-lhq28"]
Apr 23 18:19:17.790862 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:17.790825 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fad9dbd1-783a-41d8-9ca4-1c23af66451b" path="/var/lib/kubelet/pods/fad9dbd1-783a-41d8-9ca4-1c23af66451b/volumes"
Apr 23 18:19:44.742318 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:44.742285 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-6fc5d867c5-rx7vk"]
Apr 23 18:19:44.742673 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:44.742572 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fad9dbd1-783a-41d8-9ca4-1c23af66451b" containerName="limitador"
Apr 23 18:19:44.742673 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:44.742583 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="fad9dbd1-783a-41d8-9ca4-1c23af66451b" containerName="limitador"
Apr 23 18:19:44.742673 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:44.742644 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="fad9dbd1-783a-41d8-9ca4-1c23af66451b" containerName="limitador"
Apr 23 18:19:44.760041 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:44.760010 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6fc5d867c5-rx7vk"]
Apr 23 18:19:44.760185 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:44.760118 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6fc5d867c5-rx7vk"
Apr 23 18:19:44.763091 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:44.763067 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 23 18:19:44.763225 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:44.763068 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-2zc2n\""
Apr 23 18:19:44.763464 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:44.763440 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-56c5fd6bcc-x6vxb"]
Apr 23 18:19:44.764255 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:44.764231 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 23 18:19:44.764255 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:44.764248 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\""
Apr 23 18:19:44.766915 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:44.766894 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-56c5fd6bcc-x6vxb"
Apr 23 18:19:44.769570 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:44.769548 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\""
Apr 23 18:19:44.769655 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:44.769598 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-kjrgr\""
Apr 23 18:19:44.777902 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:44.777882 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-56c5fd6bcc-x6vxb"]
Apr 23 18:19:44.804314 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:44.804278 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d51eebd9-fc6d-4f64-8766-0c8b4d541c76-cert\") pod \"kserve-controller-manager-6fc5d867c5-rx7vk\" (UID: \"d51eebd9-fc6d-4f64-8766-0c8b4d541c76\") " pod="kserve/kserve-controller-manager-6fc5d867c5-rx7vk"
Apr 23 18:19:44.804464 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:44.804327 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pr9g\" (UniqueName: \"kubernetes.io/projected/054b45bb-dfb8-45b7-a704-26b96dd9c2fe-kube-api-access-5pr9g\") pod \"llmisvc-controller-manager-56c5fd6bcc-x6vxb\" (UID: \"054b45bb-dfb8-45b7-a704-26b96dd9c2fe\") " pod="kserve/llmisvc-controller-manager-56c5fd6bcc-x6vxb"
Apr 23 18:19:44.804464 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:44.804354 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/054b45bb-dfb8-45b7-a704-26b96dd9c2fe-cert\") pod \"llmisvc-controller-manager-56c5fd6bcc-x6vxb\" (UID: \"054b45bb-dfb8-45b7-a704-26b96dd9c2fe\") " pod="kserve/llmisvc-controller-manager-56c5fd6bcc-x6vxb"
Apr 23 18:19:44.804464 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:44.804416 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx9hw\" (UniqueName: \"kubernetes.io/projected/d51eebd9-fc6d-4f64-8766-0c8b4d541c76-kube-api-access-dx9hw\") pod \"kserve-controller-manager-6fc5d867c5-rx7vk\" (UID: \"d51eebd9-fc6d-4f64-8766-0c8b4d541c76\") " pod="kserve/kserve-controller-manager-6fc5d867c5-rx7vk"
Apr 23 18:19:44.905793 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:44.905764 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5pr9g\" (UniqueName: \"kubernetes.io/projected/054b45bb-dfb8-45b7-a704-26b96dd9c2fe-kube-api-access-5pr9g\") pod \"llmisvc-controller-manager-56c5fd6bcc-x6vxb\" (UID: \"054b45bb-dfb8-45b7-a704-26b96dd9c2fe\") " pod="kserve/llmisvc-controller-manager-56c5fd6bcc-x6vxb"
Apr 23 18:19:44.905793 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:44.905798 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/054b45bb-dfb8-45b7-a704-26b96dd9c2fe-cert\") pod \"llmisvc-controller-manager-56c5fd6bcc-x6vxb\" (UID: \"054b45bb-dfb8-45b7-a704-26b96dd9c2fe\") " pod="kserve/llmisvc-controller-manager-56c5fd6bcc-x6vxb"
Apr 23 18:19:44.906034 ip-10-0-143-218 kubenswrapper[2581]: E0423 18:19:44.905963 2581 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found
Apr 23 18:19:44.906034 ip-10-0-143-218 kubenswrapper[2581]: E0423 18:19:44.906030 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/054b45bb-dfb8-45b7-a704-26b96dd9c2fe-cert podName:054b45bb-dfb8-45b7-a704-26b96dd9c2fe nodeName:}" failed. No retries permitted until 2026-04-23 18:19:45.406014829 +0000 UTC m=+2304.155456607 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/054b45bb-dfb8-45b7-a704-26b96dd9c2fe-cert") pod "llmisvc-controller-manager-56c5fd6bcc-x6vxb" (UID: "054b45bb-dfb8-45b7-a704-26b96dd9c2fe") : secret "llmisvc-webhook-server-cert" not found Apr 23 18:19:44.906147 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:44.906073 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dx9hw\" (UniqueName: \"kubernetes.io/projected/d51eebd9-fc6d-4f64-8766-0c8b4d541c76-kube-api-access-dx9hw\") pod \"kserve-controller-manager-6fc5d867c5-rx7vk\" (UID: \"d51eebd9-fc6d-4f64-8766-0c8b4d541c76\") " pod="kserve/kserve-controller-manager-6fc5d867c5-rx7vk" Apr 23 18:19:44.906184 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:44.906143 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d51eebd9-fc6d-4f64-8766-0c8b4d541c76-cert\") pod \"kserve-controller-manager-6fc5d867c5-rx7vk\" (UID: \"d51eebd9-fc6d-4f64-8766-0c8b4d541c76\") " pod="kserve/kserve-controller-manager-6fc5d867c5-rx7vk" Apr 23 18:19:44.906269 ip-10-0-143-218 kubenswrapper[2581]: E0423 18:19:44.906253 2581 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 23 18:19:44.906334 ip-10-0-143-218 kubenswrapper[2581]: E0423 18:19:44.906324 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d51eebd9-fc6d-4f64-8766-0c8b4d541c76-cert podName:d51eebd9-fc6d-4f64-8766-0c8b4d541c76 nodeName:}" failed. No retries permitted until 2026-04-23 18:19:45.406305654 +0000 UTC m=+2304.155747447 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d51eebd9-fc6d-4f64-8766-0c8b4d541c76-cert") pod "kserve-controller-manager-6fc5d867c5-rx7vk" (UID: "d51eebd9-fc6d-4f64-8766-0c8b4d541c76") : secret "kserve-webhook-server-cert" not found Apr 23 18:19:44.916433 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:44.916403 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx9hw\" (UniqueName: \"kubernetes.io/projected/d51eebd9-fc6d-4f64-8766-0c8b4d541c76-kube-api-access-dx9hw\") pod \"kserve-controller-manager-6fc5d867c5-rx7vk\" (UID: \"d51eebd9-fc6d-4f64-8766-0c8b4d541c76\") " pod="kserve/kserve-controller-manager-6fc5d867c5-rx7vk" Apr 23 18:19:44.916630 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:44.916607 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pr9g\" (UniqueName: \"kubernetes.io/projected/054b45bb-dfb8-45b7-a704-26b96dd9c2fe-kube-api-access-5pr9g\") pod \"llmisvc-controller-manager-56c5fd6bcc-x6vxb\" (UID: \"054b45bb-dfb8-45b7-a704-26b96dd9c2fe\") " pod="kserve/llmisvc-controller-manager-56c5fd6bcc-x6vxb" Apr 23 18:19:45.410878 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:45.410846 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/054b45bb-dfb8-45b7-a704-26b96dd9c2fe-cert\") pod \"llmisvc-controller-manager-56c5fd6bcc-x6vxb\" (UID: \"054b45bb-dfb8-45b7-a704-26b96dd9c2fe\") " pod="kserve/llmisvc-controller-manager-56c5fd6bcc-x6vxb" Apr 23 18:19:45.411059 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:45.410923 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d51eebd9-fc6d-4f64-8766-0c8b4d541c76-cert\") pod \"kserve-controller-manager-6fc5d867c5-rx7vk\" (UID: \"d51eebd9-fc6d-4f64-8766-0c8b4d541c76\") " pod="kserve/kserve-controller-manager-6fc5d867c5-rx7vk" Apr 23 18:19:45.413282 
ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:45.413252 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d51eebd9-fc6d-4f64-8766-0c8b4d541c76-cert\") pod \"kserve-controller-manager-6fc5d867c5-rx7vk\" (UID: \"d51eebd9-fc6d-4f64-8766-0c8b4d541c76\") " pod="kserve/kserve-controller-manager-6fc5d867c5-rx7vk" Apr 23 18:19:45.413389 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:45.413303 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/054b45bb-dfb8-45b7-a704-26b96dd9c2fe-cert\") pod \"llmisvc-controller-manager-56c5fd6bcc-x6vxb\" (UID: \"054b45bb-dfb8-45b7-a704-26b96dd9c2fe\") " pod="kserve/llmisvc-controller-manager-56c5fd6bcc-x6vxb" Apr 23 18:19:45.671250 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:45.671160 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6fc5d867c5-rx7vk" Apr 23 18:19:45.678915 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:45.678888 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-56c5fd6bcc-x6vxb" Apr 23 18:19:45.802746 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:45.802711 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6fc5d867c5-rx7vk"] Apr 23 18:19:45.805669 ip-10-0-143-218 kubenswrapper[2581]: W0423 18:19:45.805642 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd51eebd9_fc6d_4f64_8766_0c8b4d541c76.slice/crio-de2d0d57038488eb6fa307b5d9f8c4d58452c57896d0a3f4165095f89ffff25c WatchSource:0}: Error finding container de2d0d57038488eb6fa307b5d9f8c4d58452c57896d0a3f4165095f89ffff25c: Status 404 returned error can't find the container with id de2d0d57038488eb6fa307b5d9f8c4d58452c57896d0a3f4165095f89ffff25c Apr 23 18:19:45.830390 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:45.830323 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-56c5fd6bcc-x6vxb"] Apr 23 18:19:45.832553 ip-10-0-143-218 kubenswrapper[2581]: W0423 18:19:45.832527 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod054b45bb_dfb8_45b7_a704_26b96dd9c2fe.slice/crio-fae4fe523d6acc9e46096f5f59355d9b65176f223f8e476d4e69dd0c17d09007 WatchSource:0}: Error finding container fae4fe523d6acc9e46096f5f59355d9b65176f223f8e476d4e69dd0c17d09007: Status 404 returned error can't find the container with id fae4fe523d6acc9e46096f5f59355d9b65176f223f8e476d4e69dd0c17d09007 Apr 23 18:19:46.159785 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:46.159749 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6fc5d867c5-rx7vk" event={"ID":"d51eebd9-fc6d-4f64-8766-0c8b4d541c76","Type":"ContainerStarted","Data":"de2d0d57038488eb6fa307b5d9f8c4d58452c57896d0a3f4165095f89ffff25c"} Apr 23 18:19:46.160650 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:46.160622 2581 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-56c5fd6bcc-x6vxb" event={"ID":"054b45bb-dfb8-45b7-a704-26b96dd9c2fe","Type":"ContainerStarted","Data":"fae4fe523d6acc9e46096f5f59355d9b65176f223f8e476d4e69dd0c17d09007"} Apr 23 18:19:50.174285 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:50.174239 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6fc5d867c5-rx7vk" event={"ID":"d51eebd9-fc6d-4f64-8766-0c8b4d541c76","Type":"ContainerStarted","Data":"84a6ac0d1d1d00b3ea71876ce8643846283148126d9a02a18d7702b3c0f7a49e"} Apr 23 18:19:50.174748 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:50.174345 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-6fc5d867c5-rx7vk" Apr 23 18:19:50.175689 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:50.175663 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-56c5fd6bcc-x6vxb" event={"ID":"054b45bb-dfb8-45b7-a704-26b96dd9c2fe","Type":"ContainerStarted","Data":"add0226a0538bea08fbb1133f1093025a60ac2817ebaf4633f99c93482a259b7"} Apr 23 18:19:50.175870 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:50.175848 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-56c5fd6bcc-x6vxb" Apr 23 18:19:50.195635 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:50.195590 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-6fc5d867c5-rx7vk" podStartSLOduration=2.3469434590000002 podStartE2EDuration="6.195579592s" podCreationTimestamp="2026-04-23 18:19:44 +0000 UTC" firstStartedPulling="2026-04-23 18:19:45.807037203 +0000 UTC m=+2304.556478982" lastFinishedPulling="2026-04-23 18:19:49.655673321 +0000 UTC m=+2308.405115115" observedRunningTime="2026-04-23 18:19:50.194846927 +0000 UTC m=+2308.944288728" watchObservedRunningTime="2026-04-23 
18:19:50.195579592 +0000 UTC m=+2308.945021392" Apr 23 18:19:50.212371 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:19:50.212325 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-56c5fd6bcc-x6vxb" podStartSLOduration=2.390207129 podStartE2EDuration="6.212314557s" podCreationTimestamp="2026-04-23 18:19:44 +0000 UTC" firstStartedPulling="2026-04-23 18:19:45.833826264 +0000 UTC m=+2304.583268042" lastFinishedPulling="2026-04-23 18:19:49.65593369 +0000 UTC m=+2308.405375470" observedRunningTime="2026-04-23 18:19:50.211281005 +0000 UTC m=+2308.960722807" watchObservedRunningTime="2026-04-23 18:19:50.212314557 +0000 UTC m=+2308.961756358" Apr 23 18:20:21.181025 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:21.180925 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-56c5fd6bcc-x6vxb" Apr 23 18:20:21.183829 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:21.183805 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-6fc5d867c5-rx7vk" Apr 23 18:20:22.462890 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:22.462833 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-6fc5d867c5-rx7vk"] Apr 23 18:20:22.463328 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:22.463081 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-6fc5d867c5-rx7vk" podUID="d51eebd9-fc6d-4f64-8766-0c8b4d541c76" containerName="manager" containerID="cri-o://84a6ac0d1d1d00b3ea71876ce8643846283148126d9a02a18d7702b3c0f7a49e" gracePeriod=10 Apr 23 18:20:22.489715 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:22.489686 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-6fc5d867c5-mgfdv"] Apr 23 18:20:22.493053 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:22.493036 2581 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6fc5d867c5-mgfdv" Apr 23 18:20:22.503409 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:22.503384 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6fc5d867c5-mgfdv"] Apr 23 18:20:22.628467 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:22.628397 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg85x\" (UniqueName: \"kubernetes.io/projected/c73bb32b-bb03-4a22-9ddd-60b15154a275-kube-api-access-dg85x\") pod \"kserve-controller-manager-6fc5d867c5-mgfdv\" (UID: \"c73bb32b-bb03-4a22-9ddd-60b15154a275\") " pod="kserve/kserve-controller-manager-6fc5d867c5-mgfdv" Apr 23 18:20:22.628603 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:22.628501 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c73bb32b-bb03-4a22-9ddd-60b15154a275-cert\") pod \"kserve-controller-manager-6fc5d867c5-mgfdv\" (UID: \"c73bb32b-bb03-4a22-9ddd-60b15154a275\") " pod="kserve/kserve-controller-manager-6fc5d867c5-mgfdv" Apr 23 18:20:22.698061 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:22.698039 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6fc5d867c5-rx7vk" Apr 23 18:20:22.729743 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:22.729660 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dg85x\" (UniqueName: \"kubernetes.io/projected/c73bb32b-bb03-4a22-9ddd-60b15154a275-kube-api-access-dg85x\") pod \"kserve-controller-manager-6fc5d867c5-mgfdv\" (UID: \"c73bb32b-bb03-4a22-9ddd-60b15154a275\") " pod="kserve/kserve-controller-manager-6fc5d867c5-mgfdv" Apr 23 18:20:22.729743 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:22.729697 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c73bb32b-bb03-4a22-9ddd-60b15154a275-cert\") pod \"kserve-controller-manager-6fc5d867c5-mgfdv\" (UID: \"c73bb32b-bb03-4a22-9ddd-60b15154a275\") " pod="kserve/kserve-controller-manager-6fc5d867c5-mgfdv" Apr 23 18:20:22.732250 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:22.732226 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c73bb32b-bb03-4a22-9ddd-60b15154a275-cert\") pod \"kserve-controller-manager-6fc5d867c5-mgfdv\" (UID: \"c73bb32b-bb03-4a22-9ddd-60b15154a275\") " pod="kserve/kserve-controller-manager-6fc5d867c5-mgfdv" Apr 23 18:20:22.739391 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:22.739366 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg85x\" (UniqueName: \"kubernetes.io/projected/c73bb32b-bb03-4a22-9ddd-60b15154a275-kube-api-access-dg85x\") pod \"kserve-controller-manager-6fc5d867c5-mgfdv\" (UID: \"c73bb32b-bb03-4a22-9ddd-60b15154a275\") " pod="kserve/kserve-controller-manager-6fc5d867c5-mgfdv" Apr 23 18:20:22.830357 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:22.830320 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dx9hw\" (UniqueName: 
\"kubernetes.io/projected/d51eebd9-fc6d-4f64-8766-0c8b4d541c76-kube-api-access-dx9hw\") pod \"d51eebd9-fc6d-4f64-8766-0c8b4d541c76\" (UID: \"d51eebd9-fc6d-4f64-8766-0c8b4d541c76\") " Apr 23 18:20:22.830586 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:22.830465 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d51eebd9-fc6d-4f64-8766-0c8b4d541c76-cert\") pod \"d51eebd9-fc6d-4f64-8766-0c8b4d541c76\" (UID: \"d51eebd9-fc6d-4f64-8766-0c8b4d541c76\") " Apr 23 18:20:22.832687 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:22.832660 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d51eebd9-fc6d-4f64-8766-0c8b4d541c76-cert" (OuterVolumeSpecName: "cert") pod "d51eebd9-fc6d-4f64-8766-0c8b4d541c76" (UID: "d51eebd9-fc6d-4f64-8766-0c8b4d541c76"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:20:22.832687 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:22.832673 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d51eebd9-fc6d-4f64-8766-0c8b4d541c76-kube-api-access-dx9hw" (OuterVolumeSpecName: "kube-api-access-dx9hw") pod "d51eebd9-fc6d-4f64-8766-0c8b4d541c76" (UID: "d51eebd9-fc6d-4f64-8766-0c8b4d541c76"). InnerVolumeSpecName "kube-api-access-dx9hw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:20:22.839952 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:22.839933 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6fc5d867c5-mgfdv" Apr 23 18:20:22.931293 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:22.931251 2581 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d51eebd9-fc6d-4f64-8766-0c8b4d541c76-cert\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 18:20:22.931293 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:22.931287 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dx9hw\" (UniqueName: \"kubernetes.io/projected/d51eebd9-fc6d-4f64-8766-0c8b4d541c76-kube-api-access-dx9hw\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 18:20:22.962819 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:22.962796 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6fc5d867c5-mgfdv"] Apr 23 18:20:22.964928 ip-10-0-143-218 kubenswrapper[2581]: W0423 18:20:22.964902 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc73bb32b_bb03_4a22_9ddd_60b15154a275.slice/crio-ad512c7bfecc521aa8d910a4d0608d021e4d4a7220f549e9b28ff0f91010b921 WatchSource:0}: Error finding container ad512c7bfecc521aa8d910a4d0608d021e4d4a7220f549e9b28ff0f91010b921: Status 404 returned error can't find the container with id ad512c7bfecc521aa8d910a4d0608d021e4d4a7220f549e9b28ff0f91010b921 Apr 23 18:20:23.279073 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:23.279038 2581 generic.go:358] "Generic (PLEG): container finished" podID="d51eebd9-fc6d-4f64-8766-0c8b4d541c76" containerID="84a6ac0d1d1d00b3ea71876ce8643846283148126d9a02a18d7702b3c0f7a49e" exitCode=0 Apr 23 18:20:23.279169 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:23.279105 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6fc5d867c5-rx7vk" Apr 23 18:20:23.279169 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:23.279112 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6fc5d867c5-rx7vk" event={"ID":"d51eebd9-fc6d-4f64-8766-0c8b4d541c76","Type":"ContainerDied","Data":"84a6ac0d1d1d00b3ea71876ce8643846283148126d9a02a18d7702b3c0f7a49e"} Apr 23 18:20:23.279169 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:23.279156 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6fc5d867c5-rx7vk" event={"ID":"d51eebd9-fc6d-4f64-8766-0c8b4d541c76","Type":"ContainerDied","Data":"de2d0d57038488eb6fa307b5d9f8c4d58452c57896d0a3f4165095f89ffff25c"} Apr 23 18:20:23.279321 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:23.279173 2581 scope.go:117] "RemoveContainer" containerID="84a6ac0d1d1d00b3ea71876ce8643846283148126d9a02a18d7702b3c0f7a49e" Apr 23 18:20:23.280313 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:23.280293 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6fc5d867c5-mgfdv" event={"ID":"c73bb32b-bb03-4a22-9ddd-60b15154a275","Type":"ContainerStarted","Data":"ad512c7bfecc521aa8d910a4d0608d021e4d4a7220f549e9b28ff0f91010b921"} Apr 23 18:20:23.287572 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:23.287553 2581 scope.go:117] "RemoveContainer" containerID="84a6ac0d1d1d00b3ea71876ce8643846283148126d9a02a18d7702b3c0f7a49e" Apr 23 18:20:23.287835 ip-10-0-143-218 kubenswrapper[2581]: E0423 18:20:23.287816 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84a6ac0d1d1d00b3ea71876ce8643846283148126d9a02a18d7702b3c0f7a49e\": container with ID starting with 84a6ac0d1d1d00b3ea71876ce8643846283148126d9a02a18d7702b3c0f7a49e not found: ID does not exist" containerID="84a6ac0d1d1d00b3ea71876ce8643846283148126d9a02a18d7702b3c0f7a49e" Apr 23 
18:20:23.287897 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:23.287845 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84a6ac0d1d1d00b3ea71876ce8643846283148126d9a02a18d7702b3c0f7a49e"} err="failed to get container status \"84a6ac0d1d1d00b3ea71876ce8643846283148126d9a02a18d7702b3c0f7a49e\": rpc error: code = NotFound desc = could not find container \"84a6ac0d1d1d00b3ea71876ce8643846283148126d9a02a18d7702b3c0f7a49e\": container with ID starting with 84a6ac0d1d1d00b3ea71876ce8643846283148126d9a02a18d7702b3c0f7a49e not found: ID does not exist" Apr 23 18:20:23.322134 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:23.322117 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-6fc5d867c5-rx7vk"] Apr 23 18:20:23.327907 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:23.327886 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-6fc5d867c5-rx7vk"] Apr 23 18:20:23.790957 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:23.790923 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d51eebd9-fc6d-4f64-8766-0c8b4d541c76" path="/var/lib/kubelet/pods/d51eebd9-fc6d-4f64-8766-0c8b4d541c76/volumes" Apr 23 18:20:24.284317 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:24.284277 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6fc5d867c5-mgfdv" event={"ID":"c73bb32b-bb03-4a22-9ddd-60b15154a275","Type":"ContainerStarted","Data":"59e544c978208abdbafe555ed0e2e69a588548341f61f71a123c188452036df7"} Apr 23 18:20:24.284522 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:24.284354 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-6fc5d867c5-mgfdv" Apr 23 18:20:24.303546 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:24.303503 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve/kserve-controller-manager-6fc5d867c5-mgfdv" podStartSLOduration=1.999827334 podStartE2EDuration="2.303489769s" podCreationTimestamp="2026-04-23 18:20:22 +0000 UTC" firstStartedPulling="2026-04-23 18:20:22.966238838 +0000 UTC m=+2341.715680617" lastFinishedPulling="2026-04-23 18:20:23.269901273 +0000 UTC m=+2342.019343052" observedRunningTime="2026-04-23 18:20:24.302719949 +0000 UTC m=+2343.052161753" watchObservedRunningTime="2026-04-23 18:20:24.303489769 +0000 UTC m=+2343.052931569" Apr 23 18:20:55.293665 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:55.293629 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-6fc5d867c5-mgfdv" Apr 23 18:20:56.204507 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:56.204474 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-kptkb"] Apr 23 18:20:56.204805 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:56.204793 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d51eebd9-fc6d-4f64-8766-0c8b4d541c76" containerName="manager" Apr 23 18:20:56.204857 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:56.204808 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="d51eebd9-fc6d-4f64-8766-0c8b4d541c76" containerName="manager" Apr 23 18:20:56.204893 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:56.204877 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="d51eebd9-fc6d-4f64-8766-0c8b4d541c76" containerName="manager" Apr 23 18:20:56.207624 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:56.207609 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-kptkb" Apr 23 18:20:56.210799 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:56.210776 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 23 18:20:56.211044 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:56.211025 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-r5fnb\"" Apr 23 18:20:56.220740 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:56.220715 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-kptkb"] Apr 23 18:20:56.303407 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:56.303375 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/40ecb4f8-7561-4652-beab-d1fc66645999-cert\") pod \"odh-model-controller-696fc77849-kptkb\" (UID: \"40ecb4f8-7561-4652-beab-d1fc66645999\") " pod="kserve/odh-model-controller-696fc77849-kptkb" Apr 23 18:20:56.303759 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:56.303410 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d659t\" (UniqueName: \"kubernetes.io/projected/40ecb4f8-7561-4652-beab-d1fc66645999-kube-api-access-d659t\") pod \"odh-model-controller-696fc77849-kptkb\" (UID: \"40ecb4f8-7561-4652-beab-d1fc66645999\") " pod="kserve/odh-model-controller-696fc77849-kptkb" Apr 23 18:20:56.404860 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:56.404831 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/40ecb4f8-7561-4652-beab-d1fc66645999-cert\") pod \"odh-model-controller-696fc77849-kptkb\" (UID: \"40ecb4f8-7561-4652-beab-d1fc66645999\") " pod="kserve/odh-model-controller-696fc77849-kptkb" Apr 23 18:20:56.404860 
ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:56.404864 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d659t\" (UniqueName: \"kubernetes.io/projected/40ecb4f8-7561-4652-beab-d1fc66645999-kube-api-access-d659t\") pod \"odh-model-controller-696fc77849-kptkb\" (UID: \"40ecb4f8-7561-4652-beab-d1fc66645999\") " pod="kserve/odh-model-controller-696fc77849-kptkb" Apr 23 18:20:56.407335 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:56.407305 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/40ecb4f8-7561-4652-beab-d1fc66645999-cert\") pod \"odh-model-controller-696fc77849-kptkb\" (UID: \"40ecb4f8-7561-4652-beab-d1fc66645999\") " pod="kserve/odh-model-controller-696fc77849-kptkb" Apr 23 18:20:56.415446 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:56.415423 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d659t\" (UniqueName: \"kubernetes.io/projected/40ecb4f8-7561-4652-beab-d1fc66645999-kube-api-access-d659t\") pod \"odh-model-controller-696fc77849-kptkb\" (UID: \"40ecb4f8-7561-4652-beab-d1fc66645999\") " pod="kserve/odh-model-controller-696fc77849-kptkb" Apr 23 18:20:56.517724 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:56.517643 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-kptkb" Apr 23 18:20:56.643534 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:56.643458 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-kptkb"] Apr 23 18:20:57.394843 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:57.394790 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-kptkb" event={"ID":"40ecb4f8-7561-4652-beab-d1fc66645999","Type":"ContainerStarted","Data":"b5720683a812e2825f4a4dddde6424aee08ff8ffbbb19baa1ff1f81bd9d889ed"} Apr 23 18:20:59.404178 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:59.404137 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-kptkb" event={"ID":"40ecb4f8-7561-4652-beab-d1fc66645999","Type":"ContainerStarted","Data":"0cab4ed6fac1e5bbede84413f19ad05a21dab5266fcca9f8e916cf10c4d63424"} Apr 23 18:20:59.404556 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:59.404353 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-kptkb" Apr 23 18:20:59.426490 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:20:59.426437 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-kptkb" podStartSLOduration=0.825987761 podStartE2EDuration="3.426423617s" podCreationTimestamp="2026-04-23 18:20:56 +0000 UTC" firstStartedPulling="2026-04-23 18:20:56.648497013 +0000 UTC m=+2375.397938794" lastFinishedPulling="2026-04-23 18:20:59.248932871 +0000 UTC m=+2377.998374650" observedRunningTime="2026-04-23 18:20:59.425671162 +0000 UTC m=+2378.175112962" watchObservedRunningTime="2026-04-23 18:20:59.426423617 +0000 UTC m=+2378.175865467" Apr 23 18:21:10.410368 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:21:10.410336 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve/odh-model-controller-696fc77849-kptkb" Apr 23 18:21:39.344646 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:21:39.344558 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f"] Apr 23 18:21:39.351690 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:21:39.351666 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f" Apr 23 18:21:39.354674 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:21:39.354647 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 18:21:39.354674 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:21:39.354662 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-78dl4\"" Apr 23 18:21:39.354858 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:21:39.354659 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 18:21:39.356622 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:21:39.356602 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\"" Apr 23 18:21:39.358790 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:21:39.358770 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f"] Apr 23 18:21:39.473847 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:21:39.473811 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6-home\") pod \"scheduler-inline-config-test-kserve-78b8847666-26b5f\" (UID: \"bb2cd28d-0212-40d8-84e9-2c0cf44c45a6\") " 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f" Apr 23 18:21:39.474030 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:21:39.473901 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6-dshm\") pod \"scheduler-inline-config-test-kserve-78b8847666-26b5f\" (UID: \"bb2cd28d-0212-40d8-84e9-2c0cf44c45a6\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f" Apr 23 18:21:39.474030 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:21:39.473934 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6-model-cache\") pod \"scheduler-inline-config-test-kserve-78b8847666-26b5f\" (UID: \"bb2cd28d-0212-40d8-84e9-2c0cf44c45a6\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f" Apr 23 18:21:39.474030 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:21:39.473962 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6-tls-certs\") pod \"scheduler-inline-config-test-kserve-78b8847666-26b5f\" (UID: \"bb2cd28d-0212-40d8-84e9-2c0cf44c45a6\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f" Apr 23 18:21:39.474030 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:21:39.474020 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-78b8847666-26b5f\" (UID: \"bb2cd28d-0212-40d8-84e9-2c0cf44c45a6\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f" 
Apr 23 18:21:39.474177 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:21:39.474072 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msn94\" (UniqueName: \"kubernetes.io/projected/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6-kube-api-access-msn94\") pod \"scheduler-inline-config-test-kserve-78b8847666-26b5f\" (UID: \"bb2cd28d-0212-40d8-84e9-2c0cf44c45a6\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f" Apr 23 18:21:39.474177 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:21:39.474126 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6-tmp-dir\") pod \"scheduler-inline-config-test-kserve-78b8847666-26b5f\" (UID: \"bb2cd28d-0212-40d8-84e9-2c0cf44c45a6\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f" Apr 23 18:21:39.575479 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:21:39.575442 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6-home\") pod \"scheduler-inline-config-test-kserve-78b8847666-26b5f\" (UID: \"bb2cd28d-0212-40d8-84e9-2c0cf44c45a6\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f" Apr 23 18:21:39.575664 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:21:39.575524 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6-dshm\") pod \"scheduler-inline-config-test-kserve-78b8847666-26b5f\" (UID: \"bb2cd28d-0212-40d8-84e9-2c0cf44c45a6\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f" Apr 23 18:21:39.575664 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:21:39.575550 2581 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6-model-cache\") pod \"scheduler-inline-config-test-kserve-78b8847666-26b5f\" (UID: \"bb2cd28d-0212-40d8-84e9-2c0cf44c45a6\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f" Apr 23 18:21:39.575664 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:21:39.575578 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6-tls-certs\") pod \"scheduler-inline-config-test-kserve-78b8847666-26b5f\" (UID: \"bb2cd28d-0212-40d8-84e9-2c0cf44c45a6\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f" Apr 23 18:21:39.575833 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:21:39.575703 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-78b8847666-26b5f\" (UID: \"bb2cd28d-0212-40d8-84e9-2c0cf44c45a6\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f" Apr 23 18:21:39.575833 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:21:39.575752 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-msn94\" (UniqueName: \"kubernetes.io/projected/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6-kube-api-access-msn94\") pod \"scheduler-inline-config-test-kserve-78b8847666-26b5f\" (UID: \"bb2cd28d-0212-40d8-84e9-2c0cf44c45a6\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f" Apr 23 18:21:39.575833 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:21:39.575797 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6-tmp-dir\") pod \"scheduler-inline-config-test-kserve-78b8847666-26b5f\" (UID: \"bb2cd28d-0212-40d8-84e9-2c0cf44c45a6\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f" Apr 23 18:21:39.576065 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:21:39.576043 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6-home\") pod \"scheduler-inline-config-test-kserve-78b8847666-26b5f\" (UID: \"bb2cd28d-0212-40d8-84e9-2c0cf44c45a6\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f" Apr 23 18:21:39.576156 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:21:39.576133 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-78b8847666-26b5f\" (UID: \"bb2cd28d-0212-40d8-84e9-2c0cf44c45a6\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f" Apr 23 18:21:39.576215 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:21:39.576143 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6-tmp-dir\") pod \"scheduler-inline-config-test-kserve-78b8847666-26b5f\" (UID: \"bb2cd28d-0212-40d8-84e9-2c0cf44c45a6\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f" Apr 23 18:21:39.576383 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:21:39.576330 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6-model-cache\") pod \"scheduler-inline-config-test-kserve-78b8847666-26b5f\" (UID: \"bb2cd28d-0212-40d8-84e9-2c0cf44c45a6\") " 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f" Apr 23 18:21:39.578217 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:21:39.578196 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6-tls-certs\") pod \"scheduler-inline-config-test-kserve-78b8847666-26b5f\" (UID: \"bb2cd28d-0212-40d8-84e9-2c0cf44c45a6\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f" Apr 23 18:21:39.578333 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:21:39.578223 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6-dshm\") pod \"scheduler-inline-config-test-kserve-78b8847666-26b5f\" (UID: \"bb2cd28d-0212-40d8-84e9-2c0cf44c45a6\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f" Apr 23 18:21:39.584837 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:21:39.584815 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-msn94\" (UniqueName: \"kubernetes.io/projected/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6-kube-api-access-msn94\") pod \"scheduler-inline-config-test-kserve-78b8847666-26b5f\" (UID: \"bb2cd28d-0212-40d8-84e9-2c0cf44c45a6\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f" Apr 23 18:21:39.663584 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:21:39.663560 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f" Apr 23 18:21:39.794144 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:21:39.794116 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f"] Apr 23 18:21:39.797181 ip-10-0-143-218 kubenswrapper[2581]: W0423 18:21:39.797113 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb2cd28d_0212_40d8_84e9_2c0cf44c45a6.slice/crio-08b2ba903014799a346d1f676d8c4277c93d960be3d57c353453bd11a2d1822c WatchSource:0}: Error finding container 08b2ba903014799a346d1f676d8c4277c93d960be3d57c353453bd11a2d1822c: Status 404 returned error can't find the container with id 08b2ba903014799a346d1f676d8c4277c93d960be3d57c353453bd11a2d1822c Apr 23 18:21:40.542016 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:21:40.541966 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f" event={"ID":"bb2cd28d-0212-40d8-84e9-2c0cf44c45a6","Type":"ContainerStarted","Data":"08b2ba903014799a346d1f676d8c4277c93d960be3d57c353453bd11a2d1822c"} Apr 23 18:21:44.556377 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:21:44.556340 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f" event={"ID":"bb2cd28d-0212-40d8-84e9-2c0cf44c45a6","Type":"ContainerStarted","Data":"f45eda0eca9029d4d853af9dab6a952e619d33d48111c23dfbf98221a62f748c"} Apr 23 18:21:48.572972 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:21:48.572916 2581 generic.go:358] "Generic (PLEG): container finished" podID="bb2cd28d-0212-40d8-84e9-2c0cf44c45a6" containerID="f45eda0eca9029d4d853af9dab6a952e619d33d48111c23dfbf98221a62f748c" exitCode=0 Apr 23 18:21:48.573449 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:21:48.573020 2581 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f" event={"ID":"bb2cd28d-0212-40d8-84e9-2c0cf44c45a6","Type":"ContainerDied","Data":"f45eda0eca9029d4d853af9dab6a952e619d33d48111c23dfbf98221a62f748c"} Apr 23 18:21:50.581872 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:21:50.581833 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f" event={"ID":"bb2cd28d-0212-40d8-84e9-2c0cf44c45a6","Type":"ContainerStarted","Data":"61e30f5ff2dd8e03bb300b8479a06bab9a48949aa2daa75f33c934db35139640"} Apr 23 18:21:50.602416 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:21:50.602370 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f" podStartSLOduration=1.733695901 podStartE2EDuration="11.602356253s" podCreationTimestamp="2026-04-23 18:21:39 +0000 UTC" firstStartedPulling="2026-04-23 18:21:39.799528454 +0000 UTC m=+2418.548970234" lastFinishedPulling="2026-04-23 18:21:49.668188804 +0000 UTC m=+2428.417630586" observedRunningTime="2026-04-23 18:21:50.601307053 +0000 UTC m=+2429.350748854" watchObservedRunningTime="2026-04-23 18:21:50.602356253 +0000 UTC m=+2429.351798051" Apr 23 18:21:59.664443 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:21:59.664403 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f" Apr 23 18:21:59.664965 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:21:59.664459 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f" Apr 23 18:21:59.677141 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:21:59.677113 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f" Apr 23 18:22:00.623940 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:22:00.623908 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f" Apr 23 18:40:56.848333 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:40:56.848300 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f"] Apr 23 18:40:56.848851 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:40:56.848595 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f" podUID="bb2cd28d-0212-40d8-84e9-2c0cf44c45a6" containerName="main" containerID="cri-o://61e30f5ff2dd8e03bb300b8479a06bab9a48949aa2daa75f33c934db35139640" gracePeriod=30 Apr 23 18:40:57.102250 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:40:57.102194 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f" Apr 23 18:40:57.203556 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:40:57.203506 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msn94\" (UniqueName: \"kubernetes.io/projected/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6-kube-api-access-msn94\") pod \"bb2cd28d-0212-40d8-84e9-2c0cf44c45a6\" (UID: \"bb2cd28d-0212-40d8-84e9-2c0cf44c45a6\") " Apr 23 18:40:57.203773 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:40:57.203565 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6-tmp-dir\") pod \"bb2cd28d-0212-40d8-84e9-2c0cf44c45a6\" (UID: \"bb2cd28d-0212-40d8-84e9-2c0cf44c45a6\") " Apr 23 18:40:57.203773 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:40:57.203617 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6-home\") pod \"bb2cd28d-0212-40d8-84e9-2c0cf44c45a6\" (UID: \"bb2cd28d-0212-40d8-84e9-2c0cf44c45a6\") " Apr 23 18:40:57.203773 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:40:57.203645 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6-kserve-provision-location\") pod \"bb2cd28d-0212-40d8-84e9-2c0cf44c45a6\" (UID: \"bb2cd28d-0212-40d8-84e9-2c0cf44c45a6\") " Apr 23 18:40:57.203773 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:40:57.203683 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6-dshm\") pod \"bb2cd28d-0212-40d8-84e9-2c0cf44c45a6\" (UID: \"bb2cd28d-0212-40d8-84e9-2c0cf44c45a6\") " Apr 23 18:40:57.203773 ip-10-0-143-218 
kubenswrapper[2581]: I0423 18:40:57.203712 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6-model-cache\") pod \"bb2cd28d-0212-40d8-84e9-2c0cf44c45a6\" (UID: \"bb2cd28d-0212-40d8-84e9-2c0cf44c45a6\") " Apr 23 18:40:57.203773 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:40:57.203735 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6-tls-certs\") pod \"bb2cd28d-0212-40d8-84e9-2c0cf44c45a6\" (UID: \"bb2cd28d-0212-40d8-84e9-2c0cf44c45a6\") " Apr 23 18:40:57.204161 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:40:57.203848 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "bb2cd28d-0212-40d8-84e9-2c0cf44c45a6" (UID: "bb2cd28d-0212-40d8-84e9-2c0cf44c45a6"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:40:57.204161 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:40:57.203916 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6-home" (OuterVolumeSpecName: "home") pod "bb2cd28d-0212-40d8-84e9-2c0cf44c45a6" (UID: "bb2cd28d-0212-40d8-84e9-2c0cf44c45a6"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:40:57.204161 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:40:57.204085 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6-model-cache" (OuterVolumeSpecName: "model-cache") pod "bb2cd28d-0212-40d8-84e9-2c0cf44c45a6" (UID: "bb2cd28d-0212-40d8-84e9-2c0cf44c45a6"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:40:57.204161 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:40:57.204141 2581 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6-home\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 18:40:57.204161 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:40:57.204162 2581 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6-tmp-dir\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 18:40:57.206173 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:40:57.206149 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6-kube-api-access-msn94" (OuterVolumeSpecName: "kube-api-access-msn94") pod "bb2cd28d-0212-40d8-84e9-2c0cf44c45a6" (UID: "bb2cd28d-0212-40d8-84e9-2c0cf44c45a6"). InnerVolumeSpecName "kube-api-access-msn94". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:40:57.206266 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:40:57.206215 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6-dshm" (OuterVolumeSpecName: "dshm") pod "bb2cd28d-0212-40d8-84e9-2c0cf44c45a6" (UID: "bb2cd28d-0212-40d8-84e9-2c0cf44c45a6"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:40:57.206351 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:40:57.206330 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "bb2cd28d-0212-40d8-84e9-2c0cf44c45a6" (UID: "bb2cd28d-0212-40d8-84e9-2c0cf44c45a6"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:40:57.229574 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:40:57.229539 2581 generic.go:358] "Generic (PLEG): container finished" podID="bb2cd28d-0212-40d8-84e9-2c0cf44c45a6" containerID="61e30f5ff2dd8e03bb300b8479a06bab9a48949aa2daa75f33c934db35139640" exitCode=0 Apr 23 18:40:57.229693 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:40:57.229618 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f" Apr 23 18:40:57.229693 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:40:57.229618 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f" event={"ID":"bb2cd28d-0212-40d8-84e9-2c0cf44c45a6","Type":"ContainerDied","Data":"61e30f5ff2dd8e03bb300b8479a06bab9a48949aa2daa75f33c934db35139640"} Apr 23 18:40:57.229693 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:40:57.229663 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f" event={"ID":"bb2cd28d-0212-40d8-84e9-2c0cf44c45a6","Type":"ContainerDied","Data":"08b2ba903014799a346d1f676d8c4277c93d960be3d57c353453bd11a2d1822c"} Apr 23 18:40:57.229693 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:40:57.229680 2581 scope.go:117] "RemoveContainer" containerID="61e30f5ff2dd8e03bb300b8479a06bab9a48949aa2daa75f33c934db35139640" Apr 23 18:40:57.237902 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:40:57.237881 2581 scope.go:117] "RemoveContainer" containerID="f45eda0eca9029d4d853af9dab6a952e619d33d48111c23dfbf98221a62f748c" Apr 23 18:40:57.262778 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:40:57.262742 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"bb2cd28d-0212-40d8-84e9-2c0cf44c45a6" (UID: "bb2cd28d-0212-40d8-84e9-2c0cf44c45a6"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:40:57.299707 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:40:57.299686 2581 scope.go:117] "RemoveContainer" containerID="61e30f5ff2dd8e03bb300b8479a06bab9a48949aa2daa75f33c934db35139640" Apr 23 18:40:57.300181 ip-10-0-143-218 kubenswrapper[2581]: E0423 18:40:57.300152 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61e30f5ff2dd8e03bb300b8479a06bab9a48949aa2daa75f33c934db35139640\": container with ID starting with 61e30f5ff2dd8e03bb300b8479a06bab9a48949aa2daa75f33c934db35139640 not found: ID does not exist" containerID="61e30f5ff2dd8e03bb300b8479a06bab9a48949aa2daa75f33c934db35139640" Apr 23 18:40:57.300293 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:40:57.300186 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61e30f5ff2dd8e03bb300b8479a06bab9a48949aa2daa75f33c934db35139640"} err="failed to get container status \"61e30f5ff2dd8e03bb300b8479a06bab9a48949aa2daa75f33c934db35139640\": rpc error: code = NotFound desc = could not find container \"61e30f5ff2dd8e03bb300b8479a06bab9a48949aa2daa75f33c934db35139640\": container with ID starting with 61e30f5ff2dd8e03bb300b8479a06bab9a48949aa2daa75f33c934db35139640 not found: ID does not exist" Apr 23 18:40:57.300293 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:40:57.300207 2581 scope.go:117] "RemoveContainer" containerID="f45eda0eca9029d4d853af9dab6a952e619d33d48111c23dfbf98221a62f748c" Apr 23 18:40:57.300477 ip-10-0-143-218 kubenswrapper[2581]: E0423 18:40:57.300453 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f45eda0eca9029d4d853af9dab6a952e619d33d48111c23dfbf98221a62f748c\": container with ID starting with 
f45eda0eca9029d4d853af9dab6a952e619d33d48111c23dfbf98221a62f748c not found: ID does not exist" containerID="f45eda0eca9029d4d853af9dab6a952e619d33d48111c23dfbf98221a62f748c" Apr 23 18:40:57.300526 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:40:57.300485 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f45eda0eca9029d4d853af9dab6a952e619d33d48111c23dfbf98221a62f748c"} err="failed to get container status \"f45eda0eca9029d4d853af9dab6a952e619d33d48111c23dfbf98221a62f748c\": rpc error: code = NotFound desc = could not find container \"f45eda0eca9029d4d853af9dab6a952e619d33d48111c23dfbf98221a62f748c\": container with ID starting with f45eda0eca9029d4d853af9dab6a952e619d33d48111c23dfbf98221a62f748c not found: ID does not exist" Apr 23 18:40:57.305048 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:40:57.305031 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-msn94\" (UniqueName: \"kubernetes.io/projected/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6-kube-api-access-msn94\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 18:40:57.305108 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:40:57.305050 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6-kserve-provision-location\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 18:40:57.305108 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:40:57.305061 2581 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6-dshm\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 18:40:57.305108 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:40:57.305070 2581 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6-model-cache\") on node 
\"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 18:40:57.305108 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:40:57.305081 2581 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6-tls-certs\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 18:40:57.552238 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:40:57.552210 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f"] Apr 23 18:40:57.558899 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:40:57.558870 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-78b8847666-26b5f"] Apr 23 18:40:57.791084 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:40:57.790977 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb2cd28d-0212-40d8-84e9-2c0cf44c45a6" path="/var/lib/kubelet/pods/bb2cd28d-0212-40d8-84e9-2c0cf44c45a6/volumes" Apr 23 18:41:06.124005 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:06.123955 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54"] Apr 23 18:41:06.124460 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:06.124439 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb2cd28d-0212-40d8-84e9-2c0cf44c45a6" containerName="storage-initializer" Apr 23 18:41:06.124538 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:06.124465 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb2cd28d-0212-40d8-84e9-2c0cf44c45a6" containerName="storage-initializer" Apr 23 18:41:06.124538 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:06.124488 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb2cd28d-0212-40d8-84e9-2c0cf44c45a6" containerName="main" Apr 23 18:41:06.124538 ip-10-0-143-218 
kubenswrapper[2581]: I0423 18:41:06.124497 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb2cd28d-0212-40d8-84e9-2c0cf44c45a6" containerName="main" Apr 23 18:41:06.124686 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:06.124587 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="bb2cd28d-0212-40d8-84e9-2c0cf44c45a6" containerName="main" Apr 23 18:41:06.127833 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:06.127810 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54" Apr 23 18:41:06.132192 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:06.132168 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-78dl4\"" Apr 23 18:41:06.132285 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:06.132190 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 18:41:06.132285 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:06.132206 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 18:41:06.132285 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:06.132173 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\"" Apr 23 18:41:06.140420 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:06.140350 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54"] Apr 23 18:41:06.180417 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:06.180392 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a61c01c1-5ff8-442b-8e97-5a9d4e446af0-model-cache\") pod 
\"scheduler-configmap-ref-test-kserve-88df5d794-kmp54\" (UID: \"a61c01c1-5ff8-442b-8e97-5a9d4e446af0\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54" Apr 23 18:41:06.180583 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:06.180424 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a61c01c1-5ff8-442b-8e97-5a9d4e446af0-dshm\") pod \"scheduler-configmap-ref-test-kserve-88df5d794-kmp54\" (UID: \"a61c01c1-5ff8-442b-8e97-5a9d4e446af0\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54" Apr 23 18:41:06.180583 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:06.180443 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a61c01c1-5ff8-442b-8e97-5a9d4e446af0-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-88df5d794-kmp54\" (UID: \"a61c01c1-5ff8-442b-8e97-5a9d4e446af0\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54" Apr 23 18:41:06.180583 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:06.180476 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a61c01c1-5ff8-442b-8e97-5a9d4e446af0-home\") pod \"scheduler-configmap-ref-test-kserve-88df5d794-kmp54\" (UID: \"a61c01c1-5ff8-442b-8e97-5a9d4e446af0\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54" Apr 23 18:41:06.180583 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:06.180526 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a61c01c1-5ff8-442b-8e97-5a9d4e446af0-tmp-dir\") pod \"scheduler-configmap-ref-test-kserve-88df5d794-kmp54\" (UID: \"a61c01c1-5ff8-442b-8e97-5a9d4e446af0\") 
" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54" Apr 23 18:41:06.180713 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:06.180594 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpn7p\" (UniqueName: \"kubernetes.io/projected/a61c01c1-5ff8-442b-8e97-5a9d4e446af0-kube-api-access-gpn7p\") pod \"scheduler-configmap-ref-test-kserve-88df5d794-kmp54\" (UID: \"a61c01c1-5ff8-442b-8e97-5a9d4e446af0\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54" Apr 23 18:41:06.180713 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:06.180620 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a61c01c1-5ff8-442b-8e97-5a9d4e446af0-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-88df5d794-kmp54\" (UID: \"a61c01c1-5ff8-442b-8e97-5a9d4e446af0\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54" Apr 23 18:41:06.281055 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:06.281024 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a61c01c1-5ff8-442b-8e97-5a9d4e446af0-tmp-dir\") pod \"scheduler-configmap-ref-test-kserve-88df5d794-kmp54\" (UID: \"a61c01c1-5ff8-442b-8e97-5a9d4e446af0\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54" Apr 23 18:41:06.281238 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:06.281074 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gpn7p\" (UniqueName: \"kubernetes.io/projected/a61c01c1-5ff8-442b-8e97-5a9d4e446af0-kube-api-access-gpn7p\") pod \"scheduler-configmap-ref-test-kserve-88df5d794-kmp54\" (UID: \"a61c01c1-5ff8-442b-8e97-5a9d4e446af0\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54" Apr 23 18:41:06.281238 
ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:06.281092 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a61c01c1-5ff8-442b-8e97-5a9d4e446af0-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-88df5d794-kmp54\" (UID: \"a61c01c1-5ff8-442b-8e97-5a9d4e446af0\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54" Apr 23 18:41:06.281238 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:06.281124 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a61c01c1-5ff8-442b-8e97-5a9d4e446af0-model-cache\") pod \"scheduler-configmap-ref-test-kserve-88df5d794-kmp54\" (UID: \"a61c01c1-5ff8-442b-8e97-5a9d4e446af0\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54" Apr 23 18:41:06.281238 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:06.281146 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a61c01c1-5ff8-442b-8e97-5a9d4e446af0-dshm\") pod \"scheduler-configmap-ref-test-kserve-88df5d794-kmp54\" (UID: \"a61c01c1-5ff8-442b-8e97-5a9d4e446af0\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54" Apr 23 18:41:06.281238 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:06.281162 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a61c01c1-5ff8-442b-8e97-5a9d4e446af0-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-88df5d794-kmp54\" (UID: \"a61c01c1-5ff8-442b-8e97-5a9d4e446af0\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54" Apr 23 18:41:06.281238 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:06.281192 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/a61c01c1-5ff8-442b-8e97-5a9d4e446af0-home\") pod \"scheduler-configmap-ref-test-kserve-88df5d794-kmp54\" (UID: \"a61c01c1-5ff8-442b-8e97-5a9d4e446af0\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54" Apr 23 18:41:06.281560 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:06.281534 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a61c01c1-5ff8-442b-8e97-5a9d4e446af0-tmp-dir\") pod \"scheduler-configmap-ref-test-kserve-88df5d794-kmp54\" (UID: \"a61c01c1-5ff8-442b-8e97-5a9d4e446af0\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54" Apr 23 18:41:06.281615 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:06.281557 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a61c01c1-5ff8-442b-8e97-5a9d4e446af0-model-cache\") pod \"scheduler-configmap-ref-test-kserve-88df5d794-kmp54\" (UID: \"a61c01c1-5ff8-442b-8e97-5a9d4e446af0\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54" Apr 23 18:41:06.281678 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:06.281655 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a61c01c1-5ff8-442b-8e97-5a9d4e446af0-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-88df5d794-kmp54\" (UID: \"a61c01c1-5ff8-442b-8e97-5a9d4e446af0\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54" Apr 23 18:41:06.281713 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:06.281682 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a61c01c1-5ff8-442b-8e97-5a9d4e446af0-home\") pod \"scheduler-configmap-ref-test-kserve-88df5d794-kmp54\" (UID: \"a61c01c1-5ff8-442b-8e97-5a9d4e446af0\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54" Apr 23 18:41:06.283616 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:06.283594 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a61c01c1-5ff8-442b-8e97-5a9d4e446af0-dshm\") pod \"scheduler-configmap-ref-test-kserve-88df5d794-kmp54\" (UID: \"a61c01c1-5ff8-442b-8e97-5a9d4e446af0\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54" Apr 23 18:41:06.283817 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:06.283797 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a61c01c1-5ff8-442b-8e97-5a9d4e446af0-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-88df5d794-kmp54\" (UID: \"a61c01c1-5ff8-442b-8e97-5a9d4e446af0\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54" Apr 23 18:41:06.292379 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:06.292358 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpn7p\" (UniqueName: \"kubernetes.io/projected/a61c01c1-5ff8-442b-8e97-5a9d4e446af0-kube-api-access-gpn7p\") pod \"scheduler-configmap-ref-test-kserve-88df5d794-kmp54\" (UID: \"a61c01c1-5ff8-442b-8e97-5a9d4e446af0\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54" Apr 23 18:41:06.437724 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:06.437692 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54" Apr 23 18:41:06.567918 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:06.567740 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54"] Apr 23 18:41:06.570830 ip-10-0-143-218 kubenswrapper[2581]: W0423 18:41:06.570801 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda61c01c1_5ff8_442b_8e97_5a9d4e446af0.slice/crio-cbbd6e65e59716d7bbbd9d8b83703cc8d43b22ed81160b493be0d2cac7cad1ae WatchSource:0}: Error finding container cbbd6e65e59716d7bbbd9d8b83703cc8d43b22ed81160b493be0d2cac7cad1ae: Status 404 returned error can't find the container with id cbbd6e65e59716d7bbbd9d8b83703cc8d43b22ed81160b493be0d2cac7cad1ae Apr 23 18:41:06.573043 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:06.573025 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 18:41:07.263954 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:07.263922 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54" event={"ID":"a61c01c1-5ff8-442b-8e97-5a9d4e446af0","Type":"ContainerStarted","Data":"87bd367c79e2f47458db6744f4f4de3fb104b5a9f623f4bc25d3db0c208b1acc"} Apr 23 18:41:07.263954 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:07.263958 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54" event={"ID":"a61c01c1-5ff8-442b-8e97-5a9d4e446af0","Type":"ContainerStarted","Data":"cbbd6e65e59716d7bbbd9d8b83703cc8d43b22ed81160b493be0d2cac7cad1ae"} Apr 23 18:41:11.277857 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:11.277812 2581 generic.go:358] "Generic (PLEG): container finished" podID="a61c01c1-5ff8-442b-8e97-5a9d4e446af0" 
containerID="87bd367c79e2f47458db6744f4f4de3fb104b5a9f623f4bc25d3db0c208b1acc" exitCode=0 Apr 23 18:41:11.278423 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:11.277871 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54" event={"ID":"a61c01c1-5ff8-442b-8e97-5a9d4e446af0","Type":"ContainerDied","Data":"87bd367c79e2f47458db6744f4f4de3fb104b5a9f623f4bc25d3db0c208b1acc"} Apr 23 18:41:12.282173 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:12.282139 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54" event={"ID":"a61c01c1-5ff8-442b-8e97-5a9d4e446af0","Type":"ContainerStarted","Data":"37c4b8f3bebc0d6858d6d44bb75f63de153641d60010640595c62cf40ada8f17"} Apr 23 18:41:12.306735 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:12.306685 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54" podStartSLOduration=6.306670518 podStartE2EDuration="6.306670518s" podCreationTimestamp="2026-04-23 18:41:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:41:12.305113573 +0000 UTC m=+3591.054555368" watchObservedRunningTime="2026-04-23 18:41:12.306670518 +0000 UTC m=+3591.056112346" Apr 23 18:41:16.438805 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:16.438774 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54" Apr 23 18:41:16.439236 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:16.438862 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54" Apr 23 18:41:16.451196 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:16.451176 2581 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54" Apr 23 18:41:17.310451 ip-10-0-143-218 kubenswrapper[2581]: I0423 18:41:17.310425 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54" Apr 23 19:00:03.967870 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:03.967832 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54"] Apr 23 19:00:03.968483 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:03.968219 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54" podUID="a61c01c1-5ff8-442b-8e97-5a9d4e446af0" containerName="main" containerID="cri-o://37c4b8f3bebc0d6858d6d44bb75f63de153641d60010640595c62cf40ada8f17" gracePeriod=30 Apr 23 19:00:04.206912 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:04.206888 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54" Apr 23 19:00:04.329016 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:04.328911 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a61c01c1-5ff8-442b-8e97-5a9d4e446af0-model-cache\") pod \"a61c01c1-5ff8-442b-8e97-5a9d4e446af0\" (UID: \"a61c01c1-5ff8-442b-8e97-5a9d4e446af0\") " Apr 23 19:00:04.329016 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:04.328958 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a61c01c1-5ff8-442b-8e97-5a9d4e446af0-home\") pod \"a61c01c1-5ff8-442b-8e97-5a9d4e446af0\" (UID: \"a61c01c1-5ff8-442b-8e97-5a9d4e446af0\") " Apr 23 19:00:04.329309 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:04.329020 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a61c01c1-5ff8-442b-8e97-5a9d4e446af0-kserve-provision-location\") pod \"a61c01c1-5ff8-442b-8e97-5a9d4e446af0\" (UID: \"a61c01c1-5ff8-442b-8e97-5a9d4e446af0\") " Apr 23 19:00:04.329309 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:04.329046 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a61c01c1-5ff8-442b-8e97-5a9d4e446af0-tls-certs\") pod \"a61c01c1-5ff8-442b-8e97-5a9d4e446af0\" (UID: \"a61c01c1-5ff8-442b-8e97-5a9d4e446af0\") " Apr 23 19:00:04.329309 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:04.329072 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpn7p\" (UniqueName: \"kubernetes.io/projected/a61c01c1-5ff8-442b-8e97-5a9d4e446af0-kube-api-access-gpn7p\") pod \"a61c01c1-5ff8-442b-8e97-5a9d4e446af0\" (UID: \"a61c01c1-5ff8-442b-8e97-5a9d4e446af0\") " Apr 23 19:00:04.329309 
ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:04.329100 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a61c01c1-5ff8-442b-8e97-5a9d4e446af0-dshm\") pod \"a61c01c1-5ff8-442b-8e97-5a9d4e446af0\" (UID: \"a61c01c1-5ff8-442b-8e97-5a9d4e446af0\") " Apr 23 19:00:04.329309 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:04.329136 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a61c01c1-5ff8-442b-8e97-5a9d4e446af0-tmp-dir\") pod \"a61c01c1-5ff8-442b-8e97-5a9d4e446af0\" (UID: \"a61c01c1-5ff8-442b-8e97-5a9d4e446af0\") " Apr 23 19:00:04.329309 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:04.329253 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a61c01c1-5ff8-442b-8e97-5a9d4e446af0-model-cache" (OuterVolumeSpecName: "model-cache") pod "a61c01c1-5ff8-442b-8e97-5a9d4e446af0" (UID: "a61c01c1-5ff8-442b-8e97-5a9d4e446af0"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:00:04.329625 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:04.329408 2581 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a61c01c1-5ff8-442b-8e97-5a9d4e446af0-model-cache\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:00:04.329625 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:04.329464 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a61c01c1-5ff8-442b-8e97-5a9d4e446af0-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "a61c01c1-5ff8-442b-8e97-5a9d4e446af0" (UID: "a61c01c1-5ff8-442b-8e97-5a9d4e446af0"). InnerVolumeSpecName "tmp-dir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:00:04.329625 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:04.329545 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a61c01c1-5ff8-442b-8e97-5a9d4e446af0-home" (OuterVolumeSpecName: "home") pod "a61c01c1-5ff8-442b-8e97-5a9d4e446af0" (UID: "a61c01c1-5ff8-442b-8e97-5a9d4e446af0"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:00:04.331296 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:04.331272 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a61c01c1-5ff8-442b-8e97-5a9d4e446af0-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "a61c01c1-5ff8-442b-8e97-5a9d4e446af0" (UID: "a61c01c1-5ff8-442b-8e97-5a9d4e446af0"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 19:00:04.331794 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:04.331768 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a61c01c1-5ff8-442b-8e97-5a9d4e446af0-dshm" (OuterVolumeSpecName: "dshm") pod "a61c01c1-5ff8-442b-8e97-5a9d4e446af0" (UID: "a61c01c1-5ff8-442b-8e97-5a9d4e446af0"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:00:04.331883 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:04.331770 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a61c01c1-5ff8-442b-8e97-5a9d4e446af0-kube-api-access-gpn7p" (OuterVolumeSpecName: "kube-api-access-gpn7p") pod "a61c01c1-5ff8-442b-8e97-5a9d4e446af0" (UID: "a61c01c1-5ff8-442b-8e97-5a9d4e446af0"). InnerVolumeSpecName "kube-api-access-gpn7p". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 19:00:04.391727 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:04.391681 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a61c01c1-5ff8-442b-8e97-5a9d4e446af0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a61c01c1-5ff8-442b-8e97-5a9d4e446af0" (UID: "a61c01c1-5ff8-442b-8e97-5a9d4e446af0"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:00:04.430101 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:04.430064 2581 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a61c01c1-5ff8-442b-8e97-5a9d4e446af0-home\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:00:04.430101 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:04.430091 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a61c01c1-5ff8-442b-8e97-5a9d4e446af0-kserve-provision-location\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:00:04.430101 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:04.430103 2581 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a61c01c1-5ff8-442b-8e97-5a9d4e446af0-tls-certs\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:00:04.430313 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:04.430118 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gpn7p\" (UniqueName: \"kubernetes.io/projected/a61c01c1-5ff8-442b-8e97-5a9d4e446af0-kube-api-access-gpn7p\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:00:04.430313 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:04.430128 2581 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/a61c01c1-5ff8-442b-8e97-5a9d4e446af0-dshm\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:00:04.430313 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:04.430136 2581 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a61c01c1-5ff8-442b-8e97-5a9d4e446af0-tmp-dir\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:00:04.894226 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:04.894186 2581 generic.go:358] "Generic (PLEG): container finished" podID="a61c01c1-5ff8-442b-8e97-5a9d4e446af0" containerID="37c4b8f3bebc0d6858d6d44bb75f63de153641d60010640595c62cf40ada8f17" exitCode=0 Apr 23 19:00:04.894397 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:04.894247 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54" Apr 23 19:00:04.894397 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:04.894273 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54" event={"ID":"a61c01c1-5ff8-442b-8e97-5a9d4e446af0","Type":"ContainerDied","Data":"37c4b8f3bebc0d6858d6d44bb75f63de153641d60010640595c62cf40ada8f17"} Apr 23 19:00:04.894397 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:04.894305 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54" event={"ID":"a61c01c1-5ff8-442b-8e97-5a9d4e446af0","Type":"ContainerDied","Data":"cbbd6e65e59716d7bbbd9d8b83703cc8d43b22ed81160b493be0d2cac7cad1ae"} Apr 23 19:00:04.894397 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:04.894321 2581 scope.go:117] "RemoveContainer" containerID="37c4b8f3bebc0d6858d6d44bb75f63de153641d60010640595c62cf40ada8f17" Apr 23 19:00:04.903422 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:04.903406 2581 scope.go:117] "RemoveContainer" 
containerID="87bd367c79e2f47458db6744f4f4de3fb104b5a9f623f4bc25d3db0c208b1acc" Apr 23 19:00:04.916703 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:04.916677 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54"] Apr 23 19:00:04.918734 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:04.918717 2581 scope.go:117] "RemoveContainer" containerID="37c4b8f3bebc0d6858d6d44bb75f63de153641d60010640595c62cf40ada8f17" Apr 23 19:00:04.918870 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:04.918853 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-88df5d794-kmp54"] Apr 23 19:00:04.919075 ip-10-0-143-218 kubenswrapper[2581]: E0423 19:00:04.919057 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37c4b8f3bebc0d6858d6d44bb75f63de153641d60010640595c62cf40ada8f17\": container with ID starting with 37c4b8f3bebc0d6858d6d44bb75f63de153641d60010640595c62cf40ada8f17 not found: ID does not exist" containerID="37c4b8f3bebc0d6858d6d44bb75f63de153641d60010640595c62cf40ada8f17" Apr 23 19:00:04.919142 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:04.919083 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37c4b8f3bebc0d6858d6d44bb75f63de153641d60010640595c62cf40ada8f17"} err="failed to get container status \"37c4b8f3bebc0d6858d6d44bb75f63de153641d60010640595c62cf40ada8f17\": rpc error: code = NotFound desc = could not find container \"37c4b8f3bebc0d6858d6d44bb75f63de153641d60010640595c62cf40ada8f17\": container with ID starting with 37c4b8f3bebc0d6858d6d44bb75f63de153641d60010640595c62cf40ada8f17 not found: ID does not exist" Apr 23 19:00:04.919142 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:04.919102 2581 scope.go:117] "RemoveContainer" 
containerID="87bd367c79e2f47458db6744f4f4de3fb104b5a9f623f4bc25d3db0c208b1acc" Apr 23 19:00:04.919343 ip-10-0-143-218 kubenswrapper[2581]: E0423 19:00:04.919327 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87bd367c79e2f47458db6744f4f4de3fb104b5a9f623f4bc25d3db0c208b1acc\": container with ID starting with 87bd367c79e2f47458db6744f4f4de3fb104b5a9f623f4bc25d3db0c208b1acc not found: ID does not exist" containerID="87bd367c79e2f47458db6744f4f4de3fb104b5a9f623f4bc25d3db0c208b1acc" Apr 23 19:00:04.919387 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:04.919349 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87bd367c79e2f47458db6744f4f4de3fb104b5a9f623f4bc25d3db0c208b1acc"} err="failed to get container status \"87bd367c79e2f47458db6744f4f4de3fb104b5a9f623f4bc25d3db0c208b1acc\": rpc error: code = NotFound desc = could not find container \"87bd367c79e2f47458db6744f4f4de3fb104b5a9f623f4bc25d3db0c208b1acc\": container with ID starting with 87bd367c79e2f47458db6744f4f4de3fb104b5a9f623f4bc25d3db0c208b1acc not found: ID does not exist" Apr 23 19:00:05.790844 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:05.790811 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a61c01c1-5ff8-442b-8e97-5a9d4e446af0" path="/var/lib/kubelet/pods/a61c01c1-5ff8-442b-8e97-5a9d4e446af0/volumes" Apr 23 19:00:21.548540 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.548508 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q"] Apr 23 19:00:21.549171 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.549001 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a61c01c1-5ff8-442b-8e97-5a9d4e446af0" containerName="storage-initializer" Apr 23 19:00:21.549171 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.549021 2581 
state_mem.go:107] "Deleted CPUSet assignment" podUID="a61c01c1-5ff8-442b-8e97-5a9d4e446af0" containerName="storage-initializer"
Apr 23 19:00:21.549171 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.549044 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a61c01c1-5ff8-442b-8e97-5a9d4e446af0" containerName="main"
Apr 23 19:00:21.549171 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.549075 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="a61c01c1-5ff8-442b-8e97-5a9d4e446af0" containerName="main"
Apr 23 19:00:21.549363 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.549185 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="a61c01c1-5ff8-442b-8e97-5a9d4e446af0" containerName="main"
Apr 23 19:00:21.552560 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.552537 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q"
Apr 23 19:00:21.555431 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.555410 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 23 19:00:21.555536 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.555466 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-78dl4\""
Apr 23 19:00:21.556744 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.556725 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 23 19:00:21.556847 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.556766 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\""
Apr 23 19:00:21.562269 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.562249 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q"]
Apr 23 19:00:21.571472 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.571446 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a1d211b9-e371-4f1f-ad81-0759309624e1-tmp-dir\") pod \"scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q\" (UID: \"a1d211b9-e371-4f1f-ad81-0759309624e1\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q"
Apr 23 19:00:21.571573 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.571486 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a1d211b9-e371-4f1f-ad81-0759309624e1-home\") pod \"scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q\" (UID: \"a1d211b9-e371-4f1f-ad81-0759309624e1\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q"
Apr 23 19:00:21.571573 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.571526 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a1d211b9-e371-4f1f-ad81-0759309624e1-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q\" (UID: \"a1d211b9-e371-4f1f-ad81-0759309624e1\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q"
Apr 23 19:00:21.571656 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.571635 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a1d211b9-e371-4f1f-ad81-0759309624e1-dshm\") pod \"scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q\" (UID: \"a1d211b9-e371-4f1f-ad81-0759309624e1\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q"
Apr 23 19:00:21.571702 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.571677 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a1d211b9-e371-4f1f-ad81-0759309624e1-model-cache\") pod \"scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q\" (UID: \"a1d211b9-e371-4f1f-ad81-0759309624e1\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q"
Apr 23 19:00:21.571738 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.571710 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a1d211b9-e371-4f1f-ad81-0759309624e1-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q\" (UID: \"a1d211b9-e371-4f1f-ad81-0759309624e1\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q"
Apr 23 19:00:21.571774 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.571745 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ppvw\" (UniqueName: \"kubernetes.io/projected/a1d211b9-e371-4f1f-ad81-0759309624e1-kube-api-access-5ppvw\") pod \"scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q\" (UID: \"a1d211b9-e371-4f1f-ad81-0759309624e1\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q"
Apr 23 19:00:21.672338 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.672300 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a1d211b9-e371-4f1f-ad81-0759309624e1-dshm\") pod \"scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q\" (UID: \"a1d211b9-e371-4f1f-ad81-0759309624e1\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q"
Apr 23 19:00:21.672338 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.672336 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a1d211b9-e371-4f1f-ad81-0759309624e1-model-cache\") pod \"scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q\" (UID: \"a1d211b9-e371-4f1f-ad81-0759309624e1\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q"
Apr 23 19:00:21.672544 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.672357 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a1d211b9-e371-4f1f-ad81-0759309624e1-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q\" (UID: \"a1d211b9-e371-4f1f-ad81-0759309624e1\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q"
Apr 23 19:00:21.672544 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.672377 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ppvw\" (UniqueName: \"kubernetes.io/projected/a1d211b9-e371-4f1f-ad81-0759309624e1-kube-api-access-5ppvw\") pod \"scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q\" (UID: \"a1d211b9-e371-4f1f-ad81-0759309624e1\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q"
Apr 23 19:00:21.672544 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.672494 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a1d211b9-e371-4f1f-ad81-0759309624e1-tmp-dir\") pod \"scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q\" (UID: \"a1d211b9-e371-4f1f-ad81-0759309624e1\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q"
Apr 23 19:00:21.672674 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.672546 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a1d211b9-e371-4f1f-ad81-0759309624e1-home\") pod \"scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q\" (UID: \"a1d211b9-e371-4f1f-ad81-0759309624e1\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q"
Apr 23 19:00:21.672674 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.672611 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a1d211b9-e371-4f1f-ad81-0759309624e1-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q\" (UID: \"a1d211b9-e371-4f1f-ad81-0759309624e1\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q"
Apr 23 19:00:21.672842 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.672811 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a1d211b9-e371-4f1f-ad81-0759309624e1-model-cache\") pod \"scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q\" (UID: \"a1d211b9-e371-4f1f-ad81-0759309624e1\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q"
Apr 23 19:00:21.672842 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.672836 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a1d211b9-e371-4f1f-ad81-0759309624e1-tmp-dir\") pod \"scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q\" (UID: \"a1d211b9-e371-4f1f-ad81-0759309624e1\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q"
Apr 23 19:00:21.673036 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.673010 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a1d211b9-e371-4f1f-ad81-0759309624e1-home\") pod \"scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q\" (UID: \"a1d211b9-e371-4f1f-ad81-0759309624e1\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q"
Apr 23 19:00:21.673083 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.673066 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a1d211b9-e371-4f1f-ad81-0759309624e1-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q\" (UID: \"a1d211b9-e371-4f1f-ad81-0759309624e1\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q"
Apr 23 19:00:21.674713 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.674693 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a1d211b9-e371-4f1f-ad81-0759309624e1-dshm\") pod \"scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q\" (UID: \"a1d211b9-e371-4f1f-ad81-0759309624e1\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q"
Apr 23 19:00:21.674944 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.674927 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a1d211b9-e371-4f1f-ad81-0759309624e1-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q\" (UID: \"a1d211b9-e371-4f1f-ad81-0759309624e1\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q"
Apr 23 19:00:21.681841 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.681818 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ppvw\" (UniqueName: \"kubernetes.io/projected/a1d211b9-e371-4f1f-ad81-0759309624e1-kube-api-access-5ppvw\") pod \"scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q\" (UID: \"a1d211b9-e371-4f1f-ad81-0759309624e1\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q"
Apr 23 19:00:21.834724 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.834641 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s"]
Apr 23 19:00:21.838430 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.838408 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s"
Apr 23 19:00:21.841332 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.841312 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-ltz4s\""
Apr 23 19:00:21.849188 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.849165 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s"]
Apr 23 19:00:21.867907 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.867883 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-78dl4\""
Apr 23 19:00:21.874320 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.874299 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/de33bead-ac0e-480d-997e-c66803a4d14c-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s\" (UID: \"de33bead-ac0e-480d-997e-c66803a4d14c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s"
Apr 23 19:00:21.874427 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.874330 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/de33bead-ac0e-480d-997e-c66803a4d14c-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s\" (UID: \"de33bead-ac0e-480d-997e-c66803a4d14c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s"
Apr 23 19:00:21.874427 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.874359 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmss7\" (UniqueName: \"kubernetes.io/projected/de33bead-ac0e-480d-997e-c66803a4d14c-kube-api-access-gmss7\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s\" (UID: \"de33bead-ac0e-480d-997e-c66803a4d14c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s"
Apr 23 19:00:21.874525 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.874495 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/de33bead-ac0e-480d-997e-c66803a4d14c-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s\" (UID: \"de33bead-ac0e-480d-997e-c66803a4d14c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s"
Apr 23 19:00:21.874574 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.874528 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/de33bead-ac0e-480d-997e-c66803a4d14c-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s\" (UID: \"de33bead-ac0e-480d-997e-c66803a4d14c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s"
Apr 23 19:00:21.874574 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.874554 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de33bead-ac0e-480d-997e-c66803a4d14c-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s\" (UID: \"de33bead-ac0e-480d-997e-c66803a4d14c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s"
Apr 23 19:00:21.875230 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.875206 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q"
Apr 23 19:00:21.975143 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.975109 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/de33bead-ac0e-480d-997e-c66803a4d14c-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s\" (UID: \"de33bead-ac0e-480d-997e-c66803a4d14c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s"
Apr 23 19:00:21.975368 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.975347 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/de33bead-ac0e-480d-997e-c66803a4d14c-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s\" (UID: \"de33bead-ac0e-480d-997e-c66803a4d14c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s"
Apr 23 19:00:21.975496 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.975481 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gmss7\" (UniqueName: \"kubernetes.io/projected/de33bead-ac0e-480d-997e-c66803a4d14c-kube-api-access-gmss7\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s\" (UID: \"de33bead-ac0e-480d-997e-c66803a4d14c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s"
Apr 23 19:00:21.975662 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.975643 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/de33bead-ac0e-480d-997e-c66803a4d14c-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s\" (UID: \"de33bead-ac0e-480d-997e-c66803a4d14c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s"
Apr 23 19:00:21.975738 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.975683 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/de33bead-ac0e-480d-997e-c66803a4d14c-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s\" (UID: \"de33bead-ac0e-480d-997e-c66803a4d14c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s"
Apr 23 19:00:21.975738 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.975721 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de33bead-ac0e-480d-997e-c66803a4d14c-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s\" (UID: \"de33bead-ac0e-480d-997e-c66803a4d14c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s"
Apr 23 19:00:21.975849 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.975756 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/de33bead-ac0e-480d-997e-c66803a4d14c-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s\" (UID: \"de33bead-ac0e-480d-997e-c66803a4d14c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s"
Apr 23 19:00:21.976123 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.976102 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de33bead-ac0e-480d-997e-c66803a4d14c-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s\" (UID: \"de33bead-ac0e-480d-997e-c66803a4d14c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s"
Apr 23 19:00:21.976461 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.976208 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/de33bead-ac0e-480d-997e-c66803a4d14c-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s\" (UID: \"de33bead-ac0e-480d-997e-c66803a4d14c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s"
Apr 23 19:00:21.976592 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.976375 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/de33bead-ac0e-480d-997e-c66803a4d14c-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s\" (UID: \"de33bead-ac0e-480d-997e-c66803a4d14c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s"
Apr 23 19:00:21.978958 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.978933 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/de33bead-ac0e-480d-997e-c66803a4d14c-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s\" (UID: \"de33bead-ac0e-480d-997e-c66803a4d14c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s"
Apr 23 19:00:21.984977 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:21.984949 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmss7\" (UniqueName: \"kubernetes.io/projected/de33bead-ac0e-480d-997e-c66803a4d14c-kube-api-access-gmss7\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s\" (UID: \"de33bead-ac0e-480d-997e-c66803a4d14c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s"
Apr 23 19:00:22.006378 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:22.006240 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q"]
Apr 23 19:00:22.009158 ip-10-0-143-218 kubenswrapper[2581]: W0423 19:00:22.009131 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1d211b9_e371_4f1f_ad81_0759309624e1.slice/crio-a9ec032ccb2db40655b2bc615ce8dac14a9a95982976912fd6a81f33ce2ed56d WatchSource:0}: Error finding container a9ec032ccb2db40655b2bc615ce8dac14a9a95982976912fd6a81f33ce2ed56d: Status 404 returned error can't find the container with id a9ec032ccb2db40655b2bc615ce8dac14a9a95982976912fd6a81f33ce2ed56d
Apr 23 19:00:22.011022 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:22.011005 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 19:00:22.148489 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:22.148462 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s"
Apr 23 19:00:22.288335 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:22.288296 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s"]
Apr 23 19:00:22.289471 ip-10-0-143-218 kubenswrapper[2581]: W0423 19:00:22.289445 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde33bead_ac0e_480d_997e_c66803a4d14c.slice/crio-488f1af3d02378a197dafad737dac6ef7b181f79cbd5f10efe3fd2c33dea698c WatchSource:0}: Error finding container 488f1af3d02378a197dafad737dac6ef7b181f79cbd5f10efe3fd2c33dea698c: Status 404 returned error can't find the container with id 488f1af3d02378a197dafad737dac6ef7b181f79cbd5f10efe3fd2c33dea698c
Apr 23 19:00:22.957692 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:22.957628 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s" event={"ID":"de33bead-ac0e-480d-997e-c66803a4d14c","Type":"ContainerStarted","Data":"6033134c3157771e566bdfb66493d297ede9a35b3b8b88895f8c014dffc20855"}
Apr 23 19:00:22.957692 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:22.957674 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s" event={"ID":"de33bead-ac0e-480d-997e-c66803a4d14c","Type":"ContainerStarted","Data":"488f1af3d02378a197dafad737dac6ef7b181f79cbd5f10efe3fd2c33dea698c"}
Apr 23 19:00:22.959227 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:22.959200 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q" event={"ID":"a1d211b9-e371-4f1f-ad81-0759309624e1","Type":"ContainerStarted","Data":"4d020937c81b1d5dc7b7b0ad9a4e3acb25025cc2d5be89ba095d41d1f5ab8558"}
Apr 23 19:00:22.959349 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:22.959233 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q" event={"ID":"a1d211b9-e371-4f1f-ad81-0759309624e1","Type":"ContainerStarted","Data":"a9ec032ccb2db40655b2bc615ce8dac14a9a95982976912fd6a81f33ce2ed56d"}
Apr 23 19:00:23.963739 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:23.963701 2581 generic.go:358] "Generic (PLEG): container finished" podID="de33bead-ac0e-480d-997e-c66803a4d14c" containerID="6033134c3157771e566bdfb66493d297ede9a35b3b8b88895f8c014dffc20855" exitCode=0
Apr 23 19:00:23.964209 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:23.963782 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s" event={"ID":"de33bead-ac0e-480d-997e-c66803a4d14c","Type":"ContainerDied","Data":"6033134c3157771e566bdfb66493d297ede9a35b3b8b88895f8c014dffc20855"}
Apr 23 19:00:26.975127 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:26.975098 2581 generic.go:358] "Generic (PLEG): container finished" podID="a1d211b9-e371-4f1f-ad81-0759309624e1" containerID="4d020937c81b1d5dc7b7b0ad9a4e3acb25025cc2d5be89ba095d41d1f5ab8558" exitCode=0
Apr 23 19:00:26.975558 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:26.975132 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q" event={"ID":"a1d211b9-e371-4f1f-ad81-0759309624e1","Type":"ContainerDied","Data":"4d020937c81b1d5dc7b7b0ad9a4e3acb25025cc2d5be89ba095d41d1f5ab8558"}
Apr 23 19:00:27.980075 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:27.980034 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q" event={"ID":"a1d211b9-e371-4f1f-ad81-0759309624e1","Type":"ContainerStarted","Data":"ccd830c291ad6b7f7001fd8183a0d89c04e9abca3eeeb9b92657f287c410ca21"}
Apr 23 19:00:28.002058 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:28.002004 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q" podStartSLOduration=7.001970245 podStartE2EDuration="7.001970245s" podCreationTimestamp="2026-04-23 19:00:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 19:00:28.000670395 +0000 UTC m=+4746.750112216" watchObservedRunningTime="2026-04-23 19:00:28.001970245 +0000 UTC m=+4746.751412045"
Apr 23 19:00:31.875523 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:31.875489 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q"
Apr 23 19:00:31.875929 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:31.875636 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q"
Apr 23 19:00:31.888150 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:31.888130 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q"
Apr 23 19:00:32.005248 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:32.005220 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q"
Apr 23 19:00:34.001839 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:34.001795 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s" event={"ID":"de33bead-ac0e-480d-997e-c66803a4d14c","Type":"ContainerStarted","Data":"5a649412d608b3936f2dd52d02575b1362fbde6c271cf866c39f8bbde5410760"}
Apr 23 19:00:54.726747 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:54.726624 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q"]
Apr 23 19:00:54.727840 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:54.727023 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q" podUID="a1d211b9-e371-4f1f-ad81-0759309624e1" containerName="main" containerID="cri-o://ccd830c291ad6b7f7001fd8183a0d89c04e9abca3eeeb9b92657f287c410ca21" gracePeriod=30
Apr 23 19:00:54.730468 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:00:54.730411 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s"]
Apr 23 19:01:01.994897 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:01.994852 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q" podUID="a1d211b9-e371-4f1f-ad81-0759309624e1" containerName="main" probeResult="failure" output="Get \"https://10.132.0.30:8000/health\": dial tcp 10.132.0.30:8000: connect: connection refused"
Apr 23 19:01:05.695713 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:05.695655 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp"]
Apr 23 19:01:05.753268 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:05.753233 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp"]
Apr 23 19:01:05.753449 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:05.753407 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp"
Apr 23 19:01:05.757236 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:05.757210 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\""
Apr 23 19:01:05.873499 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:05.873460 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/199e66b8-1c1c-4703-899e-7301b1d285d4-home\") pod \"precise-prefix-cache-test-kserve-65747bccd4-b5kpp\" (UID: \"199e66b8-1c1c-4703-899e-7301b1d285d4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp"
Apr 23 19:01:05.873499 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:05.873513 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/199e66b8-1c1c-4703-899e-7301b1d285d4-model-cache\") pod \"precise-prefix-cache-test-kserve-65747bccd4-b5kpp\" (UID: \"199e66b8-1c1c-4703-899e-7301b1d285d4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp"
Apr 23 19:01:05.873770 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:05.873577 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/199e66b8-1c1c-4703-899e-7301b1d285d4-tls-certs\") pod \"precise-prefix-cache-test-kserve-65747bccd4-b5kpp\" (UID: \"199e66b8-1c1c-4703-899e-7301b1d285d4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp"
Apr 23 19:01:05.873770 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:05.873634 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/199e66b8-1c1c-4703-899e-7301b1d285d4-tmp-dir\") pod \"precise-prefix-cache-test-kserve-65747bccd4-b5kpp\" (UID: \"199e66b8-1c1c-4703-899e-7301b1d285d4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp"
Apr 23 19:01:05.873770 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:05.873714 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcc2b\" (UniqueName: \"kubernetes.io/projected/199e66b8-1c1c-4703-899e-7301b1d285d4-kube-api-access-fcc2b\") pod \"precise-prefix-cache-test-kserve-65747bccd4-b5kpp\" (UID: \"199e66b8-1c1c-4703-899e-7301b1d285d4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp"
Apr 23 19:01:05.873934 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:05.873777 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/199e66b8-1c1c-4703-899e-7301b1d285d4-dshm\") pod \"precise-prefix-cache-test-kserve-65747bccd4-b5kpp\" (UID: \"199e66b8-1c1c-4703-899e-7301b1d285d4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp"
Apr 23 19:01:05.873934 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:05.873802 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/199e66b8-1c1c-4703-899e-7301b1d285d4-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-65747bccd4-b5kpp\" (UID: \"199e66b8-1c1c-4703-899e-7301b1d285d4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp"
Apr 23 19:01:05.974912 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:05.974822 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fcc2b\" (UniqueName: \"kubernetes.io/projected/199e66b8-1c1c-4703-899e-7301b1d285d4-kube-api-access-fcc2b\") pod \"precise-prefix-cache-test-kserve-65747bccd4-b5kpp\" (UID: \"199e66b8-1c1c-4703-899e-7301b1d285d4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp"
Apr 23 19:01:05.974912 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:05.974890 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/199e66b8-1c1c-4703-899e-7301b1d285d4-dshm\") pod \"precise-prefix-cache-test-kserve-65747bccd4-b5kpp\" (UID: \"199e66b8-1c1c-4703-899e-7301b1d285d4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp"
Apr 23 19:01:05.975156 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:05.974923 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/199e66b8-1c1c-4703-899e-7301b1d285d4-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-65747bccd4-b5kpp\" (UID: \"199e66b8-1c1c-4703-899e-7301b1d285d4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp"
Apr 23 19:01:05.975156 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:05.974968 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/199e66b8-1c1c-4703-899e-7301b1d285d4-home\") pod \"precise-prefix-cache-test-kserve-65747bccd4-b5kpp\" (UID: \"199e66b8-1c1c-4703-899e-7301b1d285d4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp"
Apr 23 19:01:05.975156 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:05.975026 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/199e66b8-1c1c-4703-899e-7301b1d285d4-model-cache\") pod \"precise-prefix-cache-test-kserve-65747bccd4-b5kpp\" (UID: \"199e66b8-1c1c-4703-899e-7301b1d285d4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp"
Apr 23 19:01:05.975156 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:05.975050 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/199e66b8-1c1c-4703-899e-7301b1d285d4-tls-certs\") pod \"precise-prefix-cache-test-kserve-65747bccd4-b5kpp\" (UID: \"199e66b8-1c1c-4703-899e-7301b1d285d4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp"
Apr 23 19:01:05.975156 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:05.975089 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/199e66b8-1c1c-4703-899e-7301b1d285d4-tmp-dir\") pod \"precise-prefix-cache-test-kserve-65747bccd4-b5kpp\" (UID: \"199e66b8-1c1c-4703-899e-7301b1d285d4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp"
Apr 23 19:01:05.975467 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:05.975441 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/199e66b8-1c1c-4703-899e-7301b1d285d4-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-65747bccd4-b5kpp\" (UID: \"199e66b8-1c1c-4703-899e-7301b1d285d4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp"
Apr 23 19:01:05.975467 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:05.975462 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/199e66b8-1c1c-4703-899e-7301b1d285d4-tmp-dir\") pod \"precise-prefix-cache-test-kserve-65747bccd4-b5kpp\" (UID: \"199e66b8-1c1c-4703-899e-7301b1d285d4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp"
Apr 23 19:01:05.975723 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:05.975697 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/199e66b8-1c1c-4703-899e-7301b1d285d4-model-cache\") pod \"precise-prefix-cache-test-kserve-65747bccd4-b5kpp\" (UID: \"199e66b8-1c1c-4703-899e-7301b1d285d4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp"
Apr 23 19:01:05.975810 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:05.975717 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/199e66b8-1c1c-4703-899e-7301b1d285d4-home\") pod \"precise-prefix-cache-test-kserve-65747bccd4-b5kpp\" (UID: \"199e66b8-1c1c-4703-899e-7301b1d285d4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp"
Apr 23 19:01:05.977851 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:05.977822 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/199e66b8-1c1c-4703-899e-7301b1d285d4-dshm\") pod \"precise-prefix-cache-test-kserve-65747bccd4-b5kpp\" (UID: \"199e66b8-1c1c-4703-899e-7301b1d285d4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp"
Apr 23 19:01:05.978213 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:05.978178 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/199e66b8-1c1c-4703-899e-7301b1d285d4-tls-certs\") pod \"precise-prefix-cache-test-kserve-65747bccd4-b5kpp\" (UID: \"199e66b8-1c1c-4703-899e-7301b1d285d4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp"
Apr 23 19:01:05.984760 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:05.984732 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcc2b\" (UniqueName: \"kubernetes.io/projected/199e66b8-1c1c-4703-899e-7301b1d285d4-kube-api-access-fcc2b\") pod \"precise-prefix-cache-test-kserve-65747bccd4-b5kpp\" (UID: \"199e66b8-1c1c-4703-899e-7301b1d285d4\") "
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp" Apr 23 19:01:06.066216 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:06.066170 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp" Apr 23 19:01:08.122415 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:08.122388 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q" Apr 23 19:01:08.126498 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:08.126449 2581 generic.go:358] "Generic (PLEG): container finished" podID="a1d211b9-e371-4f1f-ad81-0759309624e1" containerID="ccd830c291ad6b7f7001fd8183a0d89c04e9abca3eeeb9b92657f287c410ca21" exitCode=0 Apr 23 19:01:08.126625 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:08.126551 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q" event={"ID":"a1d211b9-e371-4f1f-ad81-0759309624e1","Type":"ContainerDied","Data":"ccd830c291ad6b7f7001fd8183a0d89c04e9abca3eeeb9b92657f287c410ca21"} Apr 23 19:01:08.126625 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:08.126583 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q" event={"ID":"a1d211b9-e371-4f1f-ad81-0759309624e1","Type":"ContainerDied","Data":"a9ec032ccb2db40655b2bc615ce8dac14a9a95982976912fd6a81f33ce2ed56d"} Apr 23 19:01:08.126625 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:08.126596 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q" Apr 23 19:01:08.126742 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:08.126604 2581 scope.go:117] "RemoveContainer" containerID="ccd830c291ad6b7f7001fd8183a0d89c04e9abca3eeeb9b92657f287c410ca21" Apr 23 19:01:08.195947 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:08.195379 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a1d211b9-e371-4f1f-ad81-0759309624e1-tmp-dir\") pod \"a1d211b9-e371-4f1f-ad81-0759309624e1\" (UID: \"a1d211b9-e371-4f1f-ad81-0759309624e1\") " Apr 23 19:01:08.195947 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:08.195437 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a1d211b9-e371-4f1f-ad81-0759309624e1-kserve-provision-location\") pod \"a1d211b9-e371-4f1f-ad81-0759309624e1\" (UID: \"a1d211b9-e371-4f1f-ad81-0759309624e1\") " Apr 23 19:01:08.195947 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:08.195453 2581 scope.go:117] "RemoveContainer" containerID="4d020937c81b1d5dc7b7b0ad9a4e3acb25025cc2d5be89ba095d41d1f5ab8558" Apr 23 19:01:08.195947 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:08.195466 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a1d211b9-e371-4f1f-ad81-0759309624e1-model-cache\") pod \"a1d211b9-e371-4f1f-ad81-0759309624e1\" (UID: \"a1d211b9-e371-4f1f-ad81-0759309624e1\") " Apr 23 19:01:08.195947 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:08.195474 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1d211b9-e371-4f1f-ad81-0759309624e1-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "a1d211b9-e371-4f1f-ad81-0759309624e1" (UID: "a1d211b9-e371-4f1f-ad81-0759309624e1"). 
InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:01:08.195947 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:08.195499 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ppvw\" (UniqueName: \"kubernetes.io/projected/a1d211b9-e371-4f1f-ad81-0759309624e1-kube-api-access-5ppvw\") pod \"a1d211b9-e371-4f1f-ad81-0759309624e1\" (UID: \"a1d211b9-e371-4f1f-ad81-0759309624e1\") " Apr 23 19:01:08.195947 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:08.195539 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a1d211b9-e371-4f1f-ad81-0759309624e1-tls-certs\") pod \"a1d211b9-e371-4f1f-ad81-0759309624e1\" (UID: \"a1d211b9-e371-4f1f-ad81-0759309624e1\") " Apr 23 19:01:08.195947 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:08.195566 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a1d211b9-e371-4f1f-ad81-0759309624e1-dshm\") pod \"a1d211b9-e371-4f1f-ad81-0759309624e1\" (UID: \"a1d211b9-e371-4f1f-ad81-0759309624e1\") " Apr 23 19:01:08.195947 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:08.195641 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a1d211b9-e371-4f1f-ad81-0759309624e1-home\") pod \"a1d211b9-e371-4f1f-ad81-0759309624e1\" (UID: \"a1d211b9-e371-4f1f-ad81-0759309624e1\") " Apr 23 19:01:08.195947 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:08.195739 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1d211b9-e371-4f1f-ad81-0759309624e1-model-cache" (OuterVolumeSpecName: "model-cache") pod "a1d211b9-e371-4f1f-ad81-0759309624e1" (UID: "a1d211b9-e371-4f1f-ad81-0759309624e1"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:01:08.195947 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:08.195916 2581 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a1d211b9-e371-4f1f-ad81-0759309624e1-tmp-dir\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:01:08.195947 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:08.195936 2581 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a1d211b9-e371-4f1f-ad81-0759309624e1-model-cache\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:01:08.196616 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:08.196139 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1d211b9-e371-4f1f-ad81-0759309624e1-home" (OuterVolumeSpecName: "home") pod "a1d211b9-e371-4f1f-ad81-0759309624e1" (UID: "a1d211b9-e371-4f1f-ad81-0759309624e1"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:01:08.200417 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:08.200377 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1d211b9-e371-4f1f-ad81-0759309624e1-dshm" (OuterVolumeSpecName: "dshm") pod "a1d211b9-e371-4f1f-ad81-0759309624e1" (UID: "a1d211b9-e371-4f1f-ad81-0759309624e1"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:01:08.200648 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:08.200616 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1d211b9-e371-4f1f-ad81-0759309624e1-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "a1d211b9-e371-4f1f-ad81-0759309624e1" (UID: "a1d211b9-e371-4f1f-ad81-0759309624e1"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 19:01:08.203607 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:08.203572 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1d211b9-e371-4f1f-ad81-0759309624e1-kube-api-access-5ppvw" (OuterVolumeSpecName: "kube-api-access-5ppvw") pod "a1d211b9-e371-4f1f-ad81-0759309624e1" (UID: "a1d211b9-e371-4f1f-ad81-0759309624e1"). InnerVolumeSpecName "kube-api-access-5ppvw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 19:01:08.267344 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:08.267283 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1d211b9-e371-4f1f-ad81-0759309624e1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a1d211b9-e371-4f1f-ad81-0759309624e1" (UID: "a1d211b9-e371-4f1f-ad81-0759309624e1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:01:08.283630 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:08.283600 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp"] Apr 23 19:01:08.296672 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:08.296644 2581 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a1d211b9-e371-4f1f-ad81-0759309624e1-tls-certs\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:01:08.296672 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:08.296672 2581 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a1d211b9-e371-4f1f-ad81-0759309624e1-dshm\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:01:08.296839 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:08.296684 2581 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/a1d211b9-e371-4f1f-ad81-0759309624e1-home\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:01:08.296839 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:08.296698 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a1d211b9-e371-4f1f-ad81-0759309624e1-kserve-provision-location\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:01:08.296839 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:08.296712 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5ppvw\" (UniqueName: \"kubernetes.io/projected/a1d211b9-e371-4f1f-ad81-0759309624e1-kube-api-access-5ppvw\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:01:08.347101 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:08.347077 2581 scope.go:117] "RemoveContainer" containerID="ccd830c291ad6b7f7001fd8183a0d89c04e9abca3eeeb9b92657f287c410ca21" Apr 23 19:01:08.347447 ip-10-0-143-218 kubenswrapper[2581]: E0423 19:01:08.347419 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccd830c291ad6b7f7001fd8183a0d89c04e9abca3eeeb9b92657f287c410ca21\": container with ID starting with ccd830c291ad6b7f7001fd8183a0d89c04e9abca3eeeb9b92657f287c410ca21 not found: ID does not exist" containerID="ccd830c291ad6b7f7001fd8183a0d89c04e9abca3eeeb9b92657f287c410ca21" Apr 23 19:01:08.347545 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:08.347460 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccd830c291ad6b7f7001fd8183a0d89c04e9abca3eeeb9b92657f287c410ca21"} err="failed to get container status \"ccd830c291ad6b7f7001fd8183a0d89c04e9abca3eeeb9b92657f287c410ca21\": rpc error: code = NotFound desc = could not find container \"ccd830c291ad6b7f7001fd8183a0d89c04e9abca3eeeb9b92657f287c410ca21\": container with ID starting with 
ccd830c291ad6b7f7001fd8183a0d89c04e9abca3eeeb9b92657f287c410ca21 not found: ID does not exist" Apr 23 19:01:08.347545 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:08.347482 2581 scope.go:117] "RemoveContainer" containerID="4d020937c81b1d5dc7b7b0ad9a4e3acb25025cc2d5be89ba095d41d1f5ab8558" Apr 23 19:01:08.347760 ip-10-0-143-218 kubenswrapper[2581]: E0423 19:01:08.347741 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d020937c81b1d5dc7b7b0ad9a4e3acb25025cc2d5be89ba095d41d1f5ab8558\": container with ID starting with 4d020937c81b1d5dc7b7b0ad9a4e3acb25025cc2d5be89ba095d41d1f5ab8558 not found: ID does not exist" containerID="4d020937c81b1d5dc7b7b0ad9a4e3acb25025cc2d5be89ba095d41d1f5ab8558" Apr 23 19:01:08.347823 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:08.347767 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d020937c81b1d5dc7b7b0ad9a4e3acb25025cc2d5be89ba095d41d1f5ab8558"} err="failed to get container status \"4d020937c81b1d5dc7b7b0ad9a4e3acb25025cc2d5be89ba095d41d1f5ab8558\": rpc error: code = NotFound desc = could not find container \"4d020937c81b1d5dc7b7b0ad9a4e3acb25025cc2d5be89ba095d41d1f5ab8558\": container with ID starting with 4d020937c81b1d5dc7b7b0ad9a4e3acb25025cc2d5be89ba095d41d1f5ab8558 not found: ID does not exist" Apr 23 19:01:08.349020 ip-10-0-143-218 kubenswrapper[2581]: W0423 19:01:08.348973 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod199e66b8_1c1c_4703_899e_7301b1d285d4.slice/crio-def4b6e121159f67396fa192ec3bb889b79b6a9cdfd2f5c2abb5d7d5488716c6 WatchSource:0}: Error finding container def4b6e121159f67396fa192ec3bb889b79b6a9cdfd2f5c2abb5d7d5488716c6: Status 404 returned error can't find the container with id def4b6e121159f67396fa192ec3bb889b79b6a9cdfd2f5c2abb5d7d5488716c6 Apr 23 19:01:08.465120 ip-10-0-143-218 
kubenswrapper[2581]: I0423 19:01:08.465088 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q"] Apr 23 19:01:08.468184 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:08.468162 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-554847cfc4-8zh9q"] Apr 23 19:01:09.132775 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:09.132688 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s" event={"ID":"de33bead-ac0e-480d-997e-c66803a4d14c","Type":"ContainerStarted","Data":"2963f869ab1a07117f425f61d209d2a235967cb07e93b95197e403c6a2c5e26f"} Apr 23 19:01:09.133306 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:09.132820 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s" podUID="de33bead-ac0e-480d-997e-c66803a4d14c" containerName="main" containerID="cri-o://5a649412d608b3936f2dd52d02575b1362fbde6c271cf866c39f8bbde5410760" gracePeriod=30 Apr 23 19:01:09.133306 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:09.132915 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s" podUID="de33bead-ac0e-480d-997e-c66803a4d14c" containerName="tokenizer" containerID="cri-o://2963f869ab1a07117f425f61d209d2a235967cb07e93b95197e403c6a2c5e26f" gracePeriod=30 Apr 23 19:01:09.133306 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:09.132938 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s" Apr 23 19:01:09.135963 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:09.135933 2581 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s" podUID="de33bead-ac0e-480d-997e-c66803a4d14c" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 23 19:01:09.136243 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:09.136222 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp" event={"ID":"199e66b8-1c1c-4703-899e-7301b1d285d4","Type":"ContainerStarted","Data":"e29ec02af48f5d5f5b875924136298545d5d770aac0921d0163f866bf0d5e4b1"} Apr 23 19:01:09.136338 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:09.136250 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp" event={"ID":"199e66b8-1c1c-4703-899e-7301b1d285d4","Type":"ContainerStarted","Data":"def4b6e121159f67396fa192ec3bb889b79b6a9cdfd2f5c2abb5d7d5488716c6"} Apr 23 19:01:09.160635 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:09.160572 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s" podStartSLOduration=3.477767757 podStartE2EDuration="48.160551227s" podCreationTimestamp="2026-04-23 19:00:21 +0000 UTC" firstStartedPulling="2026-04-23 19:00:23.964966551 +0000 UTC m=+4742.714408330" lastFinishedPulling="2026-04-23 19:01:08.647750017 +0000 UTC m=+4787.397191800" observedRunningTime="2026-04-23 19:01:09.157404648 +0000 UTC m=+4787.906846462" watchObservedRunningTime="2026-04-23 19:01:09.160551227 +0000 UTC m=+4787.909993029" Apr 23 19:01:09.792463 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:09.792416 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1d211b9-e371-4f1f-ad81-0759309624e1" path="/var/lib/kubelet/pods/a1d211b9-e371-4f1f-ad81-0759309624e1/volumes" Apr 23 19:01:10.142292 ip-10-0-143-218 kubenswrapper[2581]: I0423 
19:01:10.142264 2581 generic.go:358] "Generic (PLEG): container finished" podID="de33bead-ac0e-480d-997e-c66803a4d14c" containerID="5a649412d608b3936f2dd52d02575b1362fbde6c271cf866c39f8bbde5410760" exitCode=0 Apr 23 19:01:10.142661 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:10.142348 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s" event={"ID":"de33bead-ac0e-480d-997e-c66803a4d14c","Type":"ContainerDied","Data":"5a649412d608b3936f2dd52d02575b1362fbde6c271cf866c39f8bbde5410760"} Apr 23 19:01:12.151568 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:12.151535 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s" Apr 23 19:01:13.165407 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:13.165366 2581 generic.go:358] "Generic (PLEG): container finished" podID="199e66b8-1c1c-4703-899e-7301b1d285d4" containerID="e29ec02af48f5d5f5b875924136298545d5d770aac0921d0163f866bf0d5e4b1" exitCode=0 Apr 23 19:01:13.165913 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:13.165434 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp" event={"ID":"199e66b8-1c1c-4703-899e-7301b1d285d4","Type":"ContainerDied","Data":"e29ec02af48f5d5f5b875924136298545d5d770aac0921d0163f866bf0d5e4b1"} Apr 23 19:01:14.170973 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:14.170934 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp" event={"ID":"199e66b8-1c1c-4703-899e-7301b1d285d4","Type":"ContainerStarted","Data":"7518040a5ea26b1d1323df5d8f3a4672f05dd1e9187981b76d716d232c7fd3c2"} Apr 23 19:01:14.194332 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:14.194284 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp" podStartSLOduration=9.194267975 podStartE2EDuration="9.194267975s" podCreationTimestamp="2026-04-23 19:01:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 19:01:14.192907421 +0000 UTC m=+4792.942349221" watchObservedRunningTime="2026-04-23 19:01:14.194267975 +0000 UTC m=+4792.943709772" Apr 23 19:01:16.066549 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:16.066503 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp" Apr 23 19:01:16.066549 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:16.066560 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp" Apr 23 19:01:16.079228 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:16.079201 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp" Apr 23 19:01:16.188666 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:16.188638 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp" Apr 23 19:01:19.134175 ip-10-0-143-218 kubenswrapper[2581]: W0423 19:01:19.134144 2581 logging.go:55] [core] [Channel #20 SubChannel #21]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.31:9003", ServerName: "10.132.0.31:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.31:9003: connect: connection refused" Apr 23 19:01:20.134178 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:20.134132 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s" podUID="de33bead-ac0e-480d-997e-c66803a4d14c" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.31:9003\" within 1s: context deadline exceeded" Apr 23 19:01:29.133637 ip-10-0-143-218 kubenswrapper[2581]: W0423 19:01:29.133606 2581 logging.go:55] [core] [Channel #22 SubChannel #23]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.31:9003", ServerName: "10.132.0.31:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.31:9003: connect: connection refused" Apr 23 19:01:30.133544 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:30.133498 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s" podUID="de33bead-ac0e-480d-997e-c66803a4d14c" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.31:9003\" within 1s: context deadline exceeded" Apr 23 19:01:38.306646 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:38.306610 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp"] Apr 23 19:01:38.307153 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:38.306978 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp" podUID="199e66b8-1c1c-4703-899e-7301b1d285d4" containerName="main" containerID="cri-o://7518040a5ea26b1d1323df5d8f3a4672f05dd1e9187981b76d716d232c7fd3c2" gracePeriod=30 Apr 23 19:01:39.076068 ip-10-0-143-218 
kubenswrapper[2581]: I0423 19:01:39.076044 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp" Apr 23 19:01:39.133825 ip-10-0-143-218 kubenswrapper[2581]: W0423 19:01:39.133795 2581 logging.go:55] [core] [Channel #24 SubChannel #25]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.31:9003", ServerName: "10.132.0.31:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.31:9003: connect: connection refused" Apr 23 19:01:39.167609 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.167488 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/199e66b8-1c1c-4703-899e-7301b1d285d4-home\") pod \"199e66b8-1c1c-4703-899e-7301b1d285d4\" (UID: \"199e66b8-1c1c-4703-899e-7301b1d285d4\") " Apr 23 19:01:39.167784 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.167632 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcc2b\" (UniqueName: \"kubernetes.io/projected/199e66b8-1c1c-4703-899e-7301b1d285d4-kube-api-access-fcc2b\") pod \"199e66b8-1c1c-4703-899e-7301b1d285d4\" (UID: \"199e66b8-1c1c-4703-899e-7301b1d285d4\") " Apr 23 19:01:39.167881 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.167736 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/199e66b8-1c1c-4703-899e-7301b1d285d4-dshm\") pod \"199e66b8-1c1c-4703-899e-7301b1d285d4\" (UID: \"199e66b8-1c1c-4703-899e-7301b1d285d4\") " Apr 23 19:01:39.167974 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.167922 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/199e66b8-1c1c-4703-899e-7301b1d285d4-kserve-provision-location\") pod 
\"199e66b8-1c1c-4703-899e-7301b1d285d4\" (UID: \"199e66b8-1c1c-4703-899e-7301b1d285d4\") " Apr 23 19:01:39.168063 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.168023 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/199e66b8-1c1c-4703-899e-7301b1d285d4-tmp-dir\") pod \"199e66b8-1c1c-4703-899e-7301b1d285d4\" (UID: \"199e66b8-1c1c-4703-899e-7301b1d285d4\") " Apr 23 19:01:39.168117 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.168088 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/199e66b8-1c1c-4703-899e-7301b1d285d4-tls-certs\") pod \"199e66b8-1c1c-4703-899e-7301b1d285d4\" (UID: \"199e66b8-1c1c-4703-899e-7301b1d285d4\") " Apr 23 19:01:39.168193 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.168174 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/199e66b8-1c1c-4703-899e-7301b1d285d4-model-cache\") pod \"199e66b8-1c1c-4703-899e-7301b1d285d4\" (UID: \"199e66b8-1c1c-4703-899e-7301b1d285d4\") " Apr 23 19:01:39.169670 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.169620 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/199e66b8-1c1c-4703-899e-7301b1d285d4-model-cache" (OuterVolumeSpecName: "model-cache") pod "199e66b8-1c1c-4703-899e-7301b1d285d4" (UID: "199e66b8-1c1c-4703-899e-7301b1d285d4"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:01:39.169874 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.169845 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/199e66b8-1c1c-4703-899e-7301b1d285d4-home" (OuterVolumeSpecName: "home") pod "199e66b8-1c1c-4703-899e-7301b1d285d4" (UID: "199e66b8-1c1c-4703-899e-7301b1d285d4"). 
InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:01:39.170731 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.170662 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/199e66b8-1c1c-4703-899e-7301b1d285d4-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "199e66b8-1c1c-4703-899e-7301b1d285d4" (UID: "199e66b8-1c1c-4703-899e-7301b1d285d4"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:01:39.173656 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.173618 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/199e66b8-1c1c-4703-899e-7301b1d285d4-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "199e66b8-1c1c-4703-899e-7301b1d285d4" (UID: "199e66b8-1c1c-4703-899e-7301b1d285d4"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 19:01:39.173931 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.173907 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/199e66b8-1c1c-4703-899e-7301b1d285d4-dshm" (OuterVolumeSpecName: "dshm") pod "199e66b8-1c1c-4703-899e-7301b1d285d4" (UID: "199e66b8-1c1c-4703-899e-7301b1d285d4"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:01:39.174336 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.174316 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/199e66b8-1c1c-4703-899e-7301b1d285d4-kube-api-access-fcc2b" (OuterVolumeSpecName: "kube-api-access-fcc2b") pod "199e66b8-1c1c-4703-899e-7301b1d285d4" (UID: "199e66b8-1c1c-4703-899e-7301b1d285d4"). InnerVolumeSpecName "kube-api-access-fcc2b". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 19:01:39.232994 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.232942 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/199e66b8-1c1c-4703-899e-7301b1d285d4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "199e66b8-1c1c-4703-899e-7301b1d285d4" (UID: "199e66b8-1c1c-4703-899e-7301b1d285d4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:01:39.252855 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.252829 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s_de33bead-ac0e-480d-997e-c66803a4d14c/tokenizer/0.log" Apr 23 19:01:39.253454 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.253429 2581 generic.go:358] "Generic (PLEG): container finished" podID="de33bead-ac0e-480d-997e-c66803a4d14c" containerID="2963f869ab1a07117f425f61d209d2a235967cb07e93b95197e403c6a2c5e26f" exitCode=137 Apr 23 19:01:39.253544 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.253491 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s" event={"ID":"de33bead-ac0e-480d-997e-c66803a4d14c","Type":"ContainerDied","Data":"2963f869ab1a07117f425f61d209d2a235967cb07e93b95197e403c6a2c5e26f"} Apr 23 19:01:39.254919 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.254885 2581 generic.go:358] "Generic (PLEG): container finished" podID="199e66b8-1c1c-4703-899e-7301b1d285d4" containerID="7518040a5ea26b1d1323df5d8f3a4672f05dd1e9187981b76d716d232c7fd3c2" exitCode=0 Apr 23 19:01:39.255031 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.254922 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp" 
event={"ID":"199e66b8-1c1c-4703-899e-7301b1d285d4","Type":"ContainerDied","Data":"7518040a5ea26b1d1323df5d8f3a4672f05dd1e9187981b76d716d232c7fd3c2"} Apr 23 19:01:39.255031 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.254944 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp" event={"ID":"199e66b8-1c1c-4703-899e-7301b1d285d4","Type":"ContainerDied","Data":"def4b6e121159f67396fa192ec3bb889b79b6a9cdfd2f5c2abb5d7d5488716c6"} Apr 23 19:01:39.255031 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.254964 2581 scope.go:117] "RemoveContainer" containerID="7518040a5ea26b1d1323df5d8f3a4672f05dd1e9187981b76d716d232c7fd3c2" Apr 23 19:01:39.255143 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.255036 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp" Apr 23 19:01:39.265420 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.265401 2581 scope.go:117] "RemoveContainer" containerID="e29ec02af48f5d5f5b875924136298545d5d770aac0921d0163f866bf0d5e4b1" Apr 23 19:01:39.270554 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.270532 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/199e66b8-1c1c-4703-899e-7301b1d285d4-kserve-provision-location\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:01:39.270554 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.270553 2581 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/199e66b8-1c1c-4703-899e-7301b1d285d4-tmp-dir\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:01:39.270710 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.270563 2581 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/199e66b8-1c1c-4703-899e-7301b1d285d4-tls-certs\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:01:39.270710 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.270572 2581 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/199e66b8-1c1c-4703-899e-7301b1d285d4-model-cache\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:01:39.270710 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.270581 2581 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/199e66b8-1c1c-4703-899e-7301b1d285d4-home\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:01:39.270710 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.270589 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fcc2b\" (UniqueName: \"kubernetes.io/projected/199e66b8-1c1c-4703-899e-7301b1d285d4-kube-api-access-fcc2b\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:01:39.270710 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.270596 2581 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/199e66b8-1c1c-4703-899e-7301b1d285d4-dshm\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:01:39.277080 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.277062 2581 scope.go:117] "RemoveContainer" containerID="7518040a5ea26b1d1323df5d8f3a4672f05dd1e9187981b76d716d232c7fd3c2" Apr 23 19:01:39.277302 ip-10-0-143-218 kubenswrapper[2581]: E0423 19:01:39.277285 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7518040a5ea26b1d1323df5d8f3a4672f05dd1e9187981b76d716d232c7fd3c2\": container with ID starting with 7518040a5ea26b1d1323df5d8f3a4672f05dd1e9187981b76d716d232c7fd3c2 not found: ID does not exist" 
containerID="7518040a5ea26b1d1323df5d8f3a4672f05dd1e9187981b76d716d232c7fd3c2" Apr 23 19:01:39.277354 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.277309 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7518040a5ea26b1d1323df5d8f3a4672f05dd1e9187981b76d716d232c7fd3c2"} err="failed to get container status \"7518040a5ea26b1d1323df5d8f3a4672f05dd1e9187981b76d716d232c7fd3c2\": rpc error: code = NotFound desc = could not find container \"7518040a5ea26b1d1323df5d8f3a4672f05dd1e9187981b76d716d232c7fd3c2\": container with ID starting with 7518040a5ea26b1d1323df5d8f3a4672f05dd1e9187981b76d716d232c7fd3c2 not found: ID does not exist" Apr 23 19:01:39.277354 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.277325 2581 scope.go:117] "RemoveContainer" containerID="e29ec02af48f5d5f5b875924136298545d5d770aac0921d0163f866bf0d5e4b1" Apr 23 19:01:39.277554 ip-10-0-143-218 kubenswrapper[2581]: E0423 19:01:39.277526 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e29ec02af48f5d5f5b875924136298545d5d770aac0921d0163f866bf0d5e4b1\": container with ID starting with e29ec02af48f5d5f5b875924136298545d5d770aac0921d0163f866bf0d5e4b1 not found: ID does not exist" containerID="e29ec02af48f5d5f5b875924136298545d5d770aac0921d0163f866bf0d5e4b1" Apr 23 19:01:39.277612 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.277559 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e29ec02af48f5d5f5b875924136298545d5d770aac0921d0163f866bf0d5e4b1"} err="failed to get container status \"e29ec02af48f5d5f5b875924136298545d5d770aac0921d0163f866bf0d5e4b1\": rpc error: code = NotFound desc = could not find container \"e29ec02af48f5d5f5b875924136298545d5d770aac0921d0163f866bf0d5e4b1\": container with ID starting with e29ec02af48f5d5f5b875924136298545d5d770aac0921d0163f866bf0d5e4b1 not found: ID does not exist" Apr 23 
19:01:39.281571 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.281549 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp"] Apr 23 19:01:39.286084 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.286057 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65747bccd4-b5kpp"] Apr 23 19:01:39.785326 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.785285 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s_de33bead-ac0e-480d-997e-c66803a4d14c/tokenizer/0.log" Apr 23 19:01:39.786203 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.786172 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s" Apr 23 19:01:39.790303 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.790283 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="199e66b8-1c1c-4703-899e-7301b1d285d4" path="/var/lib/kubelet/pods/199e66b8-1c1c-4703-899e-7301b1d285d4/volumes" Apr 23 19:01:39.876918 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.876888 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/de33bead-ac0e-480d-997e-c66803a4d14c-tokenizer-tmp\") pod \"de33bead-ac0e-480d-997e-c66803a4d14c\" (UID: \"de33bead-ac0e-480d-997e-c66803a4d14c\") " Apr 23 19:01:39.877134 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.876967 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/de33bead-ac0e-480d-997e-c66803a4d14c-tokenizer-cache\") pod \"de33bead-ac0e-480d-997e-c66803a4d14c\" (UID: \"de33bead-ac0e-480d-997e-c66803a4d14c\") " Apr 23 19:01:39.877134 
ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.877030 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/de33bead-ac0e-480d-997e-c66803a4d14c-tls-certs\") pod \"de33bead-ac0e-480d-997e-c66803a4d14c\" (UID: \"de33bead-ac0e-480d-997e-c66803a4d14c\") " Apr 23 19:01:39.877134 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.877077 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/de33bead-ac0e-480d-997e-c66803a4d14c-tokenizer-uds\") pod \"de33bead-ac0e-480d-997e-c66803a4d14c\" (UID: \"de33bead-ac0e-480d-997e-c66803a4d14c\") " Apr 23 19:01:39.877134 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.877108 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmss7\" (UniqueName: \"kubernetes.io/projected/de33bead-ac0e-480d-997e-c66803a4d14c-kube-api-access-gmss7\") pod \"de33bead-ac0e-480d-997e-c66803a4d14c\" (UID: \"de33bead-ac0e-480d-997e-c66803a4d14c\") " Apr 23 19:01:39.877338 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.877147 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de33bead-ac0e-480d-997e-c66803a4d14c-kserve-provision-location\") pod \"de33bead-ac0e-480d-997e-c66803a4d14c\" (UID: \"de33bead-ac0e-480d-997e-c66803a4d14c\") " Apr 23 19:01:39.877338 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.877213 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de33bead-ac0e-480d-997e-c66803a4d14c-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "de33bead-ac0e-480d-997e-c66803a4d14c" (UID: "de33bead-ac0e-480d-997e-c66803a4d14c"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:01:39.877338 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.877314 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de33bead-ac0e-480d-997e-c66803a4d14c-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "de33bead-ac0e-480d-997e-c66803a4d14c" (UID: "de33bead-ac0e-480d-997e-c66803a4d14c"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:01:39.877480 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.877369 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de33bead-ac0e-480d-997e-c66803a4d14c-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "de33bead-ac0e-480d-997e-c66803a4d14c" (UID: "de33bead-ac0e-480d-997e-c66803a4d14c"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:01:39.877536 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.877484 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/de33bead-ac0e-480d-997e-c66803a4d14c-tokenizer-uds\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:01:39.877536 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.877504 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/de33bead-ac0e-480d-997e-c66803a4d14c-tokenizer-tmp\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:01:39.877536 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.877517 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/de33bead-ac0e-480d-997e-c66803a4d14c-tokenizer-cache\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:01:39.877885 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.877861 2581 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de33bead-ac0e-480d-997e-c66803a4d14c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "de33bead-ac0e-480d-997e-c66803a4d14c" (UID: "de33bead-ac0e-480d-997e-c66803a4d14c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:01:39.879301 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.879283 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de33bead-ac0e-480d-997e-c66803a4d14c-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "de33bead-ac0e-480d-997e-c66803a4d14c" (UID: "de33bead-ac0e-480d-997e-c66803a4d14c"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 19:01:39.879416 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.879397 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de33bead-ac0e-480d-997e-c66803a4d14c-kube-api-access-gmss7" (OuterVolumeSpecName: "kube-api-access-gmss7") pod "de33bead-ac0e-480d-997e-c66803a4d14c" (UID: "de33bead-ac0e-480d-997e-c66803a4d14c"). InnerVolumeSpecName "kube-api-access-gmss7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 19:01:39.978229 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.978146 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gmss7\" (UniqueName: \"kubernetes.io/projected/de33bead-ac0e-480d-997e-c66803a4d14c-kube-api-access-gmss7\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:01:39.978229 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.978176 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de33bead-ac0e-480d-997e-c66803a4d14c-kserve-provision-location\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:01:39.978229 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:39.978192 2581 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/de33bead-ac0e-480d-997e-c66803a4d14c-tls-certs\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:01:40.133944 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:40.133901 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s" podUID="de33bead-ac0e-480d-997e-c66803a4d14c" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.31:9003\" within 1s: context deadline exceeded" Apr 23 19:01:40.260414 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:40.260340 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s_de33bead-ac0e-480d-997e-c66803a4d14c/tokenizer/0.log" Apr 23 19:01:40.261005 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:40.260960 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s" 
event={"ID":"de33bead-ac0e-480d-997e-c66803a4d14c","Type":"ContainerDied","Data":"488f1af3d02378a197dafad737dac6ef7b181f79cbd5f10efe3fd2c33dea698c"} Apr 23 19:01:40.261117 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:40.261024 2581 scope.go:117] "RemoveContainer" containerID="2963f869ab1a07117f425f61d209d2a235967cb07e93b95197e403c6a2c5e26f" Apr 23 19:01:40.261117 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:40.261029 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s" Apr 23 19:01:40.269385 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:40.269320 2581 scope.go:117] "RemoveContainer" containerID="5a649412d608b3936f2dd52d02575b1362fbde6c271cf866c39f8bbde5410760" Apr 23 19:01:40.276529 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:40.276514 2581 scope.go:117] "RemoveContainer" containerID="6033134c3157771e566bdfb66493d297ede9a35b3b8b88895f8c014dffc20855" Apr 23 19:01:40.287329 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:40.287307 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s"] Apr 23 19:01:40.293946 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:40.293927 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59lqc2s"] Apr 23 19:01:41.790235 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:41.790201 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de33bead-ac0e-480d-997e-c66803a4d14c" path="/var/lib/kubelet/pods/de33bead-ac0e-480d-997e-c66803a4d14c/volumes" Apr 23 19:01:49.931327 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:49.931294 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk"] Apr 23 19:01:49.931740 ip-10-0-143-218 kubenswrapper[2581]: 
I0423 19:01:49.931608 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de33bead-ac0e-480d-997e-c66803a4d14c" containerName="storage-initializer" Apr 23 19:01:49.931740 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:49.931618 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="de33bead-ac0e-480d-997e-c66803a4d14c" containerName="storage-initializer" Apr 23 19:01:49.931740 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:49.931630 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="199e66b8-1c1c-4703-899e-7301b1d285d4" containerName="storage-initializer" Apr 23 19:01:49.931740 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:49.931638 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="199e66b8-1c1c-4703-899e-7301b1d285d4" containerName="storage-initializer" Apr 23 19:01:49.931740 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:49.931652 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de33bead-ac0e-480d-997e-c66803a4d14c" containerName="main" Apr 23 19:01:49.931740 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:49.931657 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="de33bead-ac0e-480d-997e-c66803a4d14c" containerName="main" Apr 23 19:01:49.931740 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:49.931668 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a1d211b9-e371-4f1f-ad81-0759309624e1" containerName="main" Apr 23 19:01:49.931740 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:49.931673 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1d211b9-e371-4f1f-ad81-0759309624e1" containerName="main" Apr 23 19:01:49.931740 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:49.931679 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a1d211b9-e371-4f1f-ad81-0759309624e1" containerName="storage-initializer" Apr 23 19:01:49.931740 ip-10-0-143-218 
kubenswrapper[2581]: I0423 19:01:49.931684 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1d211b9-e371-4f1f-ad81-0759309624e1" containerName="storage-initializer" Apr 23 19:01:49.931740 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:49.931690 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de33bead-ac0e-480d-997e-c66803a4d14c" containerName="tokenizer" Apr 23 19:01:49.931740 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:49.931695 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="de33bead-ac0e-480d-997e-c66803a4d14c" containerName="tokenizer" Apr 23 19:01:49.931740 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:49.931700 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="199e66b8-1c1c-4703-899e-7301b1d285d4" containerName="main" Apr 23 19:01:49.931740 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:49.931707 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="199e66b8-1c1c-4703-899e-7301b1d285d4" containerName="main" Apr 23 19:01:49.932188 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:49.931754 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="a1d211b9-e371-4f1f-ad81-0759309624e1" containerName="main" Apr 23 19:01:49.932188 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:49.931761 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="de33bead-ac0e-480d-997e-c66803a4d14c" containerName="main" Apr 23 19:01:49.932188 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:49.931768 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="de33bead-ac0e-480d-997e-c66803a4d14c" containerName="tokenizer" Apr 23 19:01:49.932188 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:49.931775 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="199e66b8-1c1c-4703-899e-7301b1d285d4" containerName="main" Apr 23 19:01:49.935725 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:49.935708 2581 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk" Apr 23 19:01:49.938910 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:49.938882 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 19:01:49.939051 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:49.938932 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-9dwsf\"" Apr 23 19:01:49.940198 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:49.940178 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 23 19:01:49.940310 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:49.940199 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 19:01:49.940310 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:49.940209 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-78dl4\"" Apr 23 19:01:49.945538 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:49.945517 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk"] Apr 23 19:01:50.063849 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:50.063812 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c00348f8-d733-4d03-8fa1-5bc16e1a2cee-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-d547597f-fs6kk\" (UID: \"c00348f8-d733-4d03-8fa1-5bc16e1a2cee\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk" Apr 23 19:01:50.064044 ip-10-0-143-218 kubenswrapper[2581]: 
I0423 19:01:50.063867 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c00348f8-d733-4d03-8fa1-5bc16e1a2cee-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-d547597f-fs6kk\" (UID: \"c00348f8-d733-4d03-8fa1-5bc16e1a2cee\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk" Apr 23 19:01:50.064044 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:50.063928 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c00348f8-d733-4d03-8fa1-5bc16e1a2cee-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-d547597f-fs6kk\" (UID: \"c00348f8-d733-4d03-8fa1-5bc16e1a2cee\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk" Apr 23 19:01:50.064044 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:50.063972 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkbtf\" (UniqueName: \"kubernetes.io/projected/c00348f8-d733-4d03-8fa1-5bc16e1a2cee-kube-api-access-mkbtf\") pod \"stop-feature-test-kserve-router-scheduler-d547597f-fs6kk\" (UID: \"c00348f8-d733-4d03-8fa1-5bc16e1a2cee\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk" Apr 23 19:01:50.064044 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:50.064028 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c00348f8-d733-4d03-8fa1-5bc16e1a2cee-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-d547597f-fs6kk\" (UID: \"c00348f8-d733-4d03-8fa1-5bc16e1a2cee\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk" Apr 23 19:01:50.064196 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:50.064057 2581 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c00348f8-d733-4d03-8fa1-5bc16e1a2cee-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-d547597f-fs6kk\" (UID: \"c00348f8-d733-4d03-8fa1-5bc16e1a2cee\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk" Apr 23 19:01:50.165330 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:50.165290 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c00348f8-d733-4d03-8fa1-5bc16e1a2cee-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-d547597f-fs6kk\" (UID: \"c00348f8-d733-4d03-8fa1-5bc16e1a2cee\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk" Apr 23 19:01:50.165505 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:50.165355 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c00348f8-d733-4d03-8fa1-5bc16e1a2cee-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-d547597f-fs6kk\" (UID: \"c00348f8-d733-4d03-8fa1-5bc16e1a2cee\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk" Apr 23 19:01:50.165505 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:50.165389 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c00348f8-d733-4d03-8fa1-5bc16e1a2cee-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-d547597f-fs6kk\" (UID: \"c00348f8-d733-4d03-8fa1-5bc16e1a2cee\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk" Apr 23 19:01:50.165505 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:50.165413 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-mkbtf\" (UniqueName: \"kubernetes.io/projected/c00348f8-d733-4d03-8fa1-5bc16e1a2cee-kube-api-access-mkbtf\") pod \"stop-feature-test-kserve-router-scheduler-d547597f-fs6kk\" (UID: \"c00348f8-d733-4d03-8fa1-5bc16e1a2cee\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk" Apr 23 19:01:50.165505 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:50.165441 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c00348f8-d733-4d03-8fa1-5bc16e1a2cee-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-d547597f-fs6kk\" (UID: \"c00348f8-d733-4d03-8fa1-5bc16e1a2cee\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk" Apr 23 19:01:50.165505 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:50.165466 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c00348f8-d733-4d03-8fa1-5bc16e1a2cee-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-d547597f-fs6kk\" (UID: \"c00348f8-d733-4d03-8fa1-5bc16e1a2cee\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk" Apr 23 19:01:50.165918 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:50.165709 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c00348f8-d733-4d03-8fa1-5bc16e1a2cee-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-d547597f-fs6kk\" (UID: \"c00348f8-d733-4d03-8fa1-5bc16e1a2cee\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk" Apr 23 19:01:50.165918 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:50.165836 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/c00348f8-d733-4d03-8fa1-5bc16e1a2cee-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-d547597f-fs6kk\" (UID: \"c00348f8-d733-4d03-8fa1-5bc16e1a2cee\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk" Apr 23 19:01:50.165918 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:50.165890 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c00348f8-d733-4d03-8fa1-5bc16e1a2cee-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-d547597f-fs6kk\" (UID: \"c00348f8-d733-4d03-8fa1-5bc16e1a2cee\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk" Apr 23 19:01:50.166071 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:50.165931 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c00348f8-d733-4d03-8fa1-5bc16e1a2cee-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-d547597f-fs6kk\" (UID: \"c00348f8-d733-4d03-8fa1-5bc16e1a2cee\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk" Apr 23 19:01:50.167977 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:50.167955 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c00348f8-d733-4d03-8fa1-5bc16e1a2cee-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-d547597f-fs6kk\" (UID: \"c00348f8-d733-4d03-8fa1-5bc16e1a2cee\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk" Apr 23 19:01:50.175206 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:50.175184 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkbtf\" (UniqueName: \"kubernetes.io/projected/c00348f8-d733-4d03-8fa1-5bc16e1a2cee-kube-api-access-mkbtf\") pod 
\"stop-feature-test-kserve-router-scheduler-d547597f-fs6kk\" (UID: \"c00348f8-d733-4d03-8fa1-5bc16e1a2cee\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk" Apr 23 19:01:50.246218 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:50.246134 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk" Apr 23 19:01:50.375879 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:50.375853 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk"] Apr 23 19:01:50.378444 ip-10-0-143-218 kubenswrapper[2581]: W0423 19:01:50.378415 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc00348f8_d733_4d03_8fa1_5bc16e1a2cee.slice/crio-b728ba3ecabe3b65710411dcfb5ed2db3896398c353e372d2f68b9f4a33a6494 WatchSource:0}: Error finding container b728ba3ecabe3b65710411dcfb5ed2db3896398c353e372d2f68b9f4a33a6494: Status 404 returned error can't find the container with id b728ba3ecabe3b65710411dcfb5ed2db3896398c353e372d2f68b9f4a33a6494 Apr 23 19:01:51.300680 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:51.300642 2581 generic.go:358] "Generic (PLEG): container finished" podID="c00348f8-d733-4d03-8fa1-5bc16e1a2cee" containerID="d6b295218740f60e12caca64a777559a6a4f1e774f797f0432dd675f8d31383e" exitCode=0 Apr 23 19:01:51.301081 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:51.300738 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk" event={"ID":"c00348f8-d733-4d03-8fa1-5bc16e1a2cee","Type":"ContainerDied","Data":"d6b295218740f60e12caca64a777559a6a4f1e774f797f0432dd675f8d31383e"} Apr 23 19:01:51.301081 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:51.300779 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk" event={"ID":"c00348f8-d733-4d03-8fa1-5bc16e1a2cee","Type":"ContainerStarted","Data":"b728ba3ecabe3b65710411dcfb5ed2db3896398c353e372d2f68b9f4a33a6494"} Apr 23 19:01:52.306127 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:52.306091 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk" event={"ID":"c00348f8-d733-4d03-8fa1-5bc16e1a2cee","Type":"ContainerStarted","Data":"5b74a6ac4a96c90644e6d57779884831a213d731258064eabe29c898be36aedc"} Apr 23 19:01:52.306127 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:52.306125 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk" event={"ID":"c00348f8-d733-4d03-8fa1-5bc16e1a2cee","Type":"ContainerStarted","Data":"0c7795bfbcadb90dcf096ef7f95f29962cd76b7a25a0fccbec186ceef3d5ec5a"} Apr 23 19:01:52.306584 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:52.306247 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk" Apr 23 19:01:52.330528 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:01:52.330486 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk" podStartSLOduration=3.330472536 podStartE2EDuration="3.330472536s" podCreationTimestamp="2026-04-23 19:01:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 19:01:52.329439537 +0000 UTC m=+4831.078881349" watchObservedRunningTime="2026-04-23 19:01:52.330472536 +0000 UTC m=+4831.079914336" Apr 23 19:02:00.246753 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:02:00.246717 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not 
ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk" Apr 23 19:02:00.247206 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:02:00.246887 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk" Apr 23 19:02:00.249490 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:02:00.249464 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk" Apr 23 19:02:00.340114 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:02:00.340088 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk" Apr 23 19:02:22.347078 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:02:22.346978 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk" Apr 23 19:04:12.806284 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:12.806245 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk"] Apr 23 19:04:12.806760 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:12.806600 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk" podUID="c00348f8-d733-4d03-8fa1-5bc16e1a2cee" containerName="main" containerID="cri-o://0c7795bfbcadb90dcf096ef7f95f29962cd76b7a25a0fccbec186ceef3d5ec5a" gracePeriod=30 Apr 23 19:04:12.806760 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:12.806634 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk" podUID="c00348f8-d733-4d03-8fa1-5bc16e1a2cee" containerName="tokenizer" 
containerID="cri-o://5b74a6ac4a96c90644e6d57779884831a213d731258064eabe29c898be36aedc" gracePeriod=30 Apr 23 19:04:13.770475 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:13.770439 2581 generic.go:358] "Generic (PLEG): container finished" podID="c00348f8-d733-4d03-8fa1-5bc16e1a2cee" containerID="0c7795bfbcadb90dcf096ef7f95f29962cd76b7a25a0fccbec186ceef3d5ec5a" exitCode=0 Apr 23 19:04:13.770679 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:13.770508 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk" event={"ID":"c00348f8-d733-4d03-8fa1-5bc16e1a2cee","Type":"ContainerDied","Data":"0c7795bfbcadb90dcf096ef7f95f29962cd76b7a25a0fccbec186ceef3d5ec5a"} Apr 23 19:04:14.055945 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:14.055914 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk" Apr 23 19:04:14.194188 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:14.194147 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkbtf\" (UniqueName: \"kubernetes.io/projected/c00348f8-d733-4d03-8fa1-5bc16e1a2cee-kube-api-access-mkbtf\") pod \"c00348f8-d733-4d03-8fa1-5bc16e1a2cee\" (UID: \"c00348f8-d733-4d03-8fa1-5bc16e1a2cee\") " Apr 23 19:04:14.194356 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:14.194199 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c00348f8-d733-4d03-8fa1-5bc16e1a2cee-tls-certs\") pod \"c00348f8-d733-4d03-8fa1-5bc16e1a2cee\" (UID: \"c00348f8-d733-4d03-8fa1-5bc16e1a2cee\") " Apr 23 19:04:14.194356 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:14.194226 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/c00348f8-d733-4d03-8fa1-5bc16e1a2cee-tokenizer-cache\") pod \"c00348f8-d733-4d03-8fa1-5bc16e1a2cee\" (UID: \"c00348f8-d733-4d03-8fa1-5bc16e1a2cee\") " Apr 23 19:04:14.194356 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:14.194248 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c00348f8-d733-4d03-8fa1-5bc16e1a2cee-tokenizer-uds\") pod \"c00348f8-d733-4d03-8fa1-5bc16e1a2cee\" (UID: \"c00348f8-d733-4d03-8fa1-5bc16e1a2cee\") " Apr 23 19:04:14.194356 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:14.194290 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c00348f8-d733-4d03-8fa1-5bc16e1a2cee-tokenizer-tmp\") pod \"c00348f8-d733-4d03-8fa1-5bc16e1a2cee\" (UID: \"c00348f8-d733-4d03-8fa1-5bc16e1a2cee\") " Apr 23 19:04:14.194356 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:14.194308 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c00348f8-d733-4d03-8fa1-5bc16e1a2cee-kserve-provision-location\") pod \"c00348f8-d733-4d03-8fa1-5bc16e1a2cee\" (UID: \"c00348f8-d733-4d03-8fa1-5bc16e1a2cee\") " Apr 23 19:04:14.194838 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:14.194546 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c00348f8-d733-4d03-8fa1-5bc16e1a2cee-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "c00348f8-d733-4d03-8fa1-5bc16e1a2cee" (UID: "c00348f8-d733-4d03-8fa1-5bc16e1a2cee"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:04:14.194838 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:14.194579 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c00348f8-d733-4d03-8fa1-5bc16e1a2cee-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "c00348f8-d733-4d03-8fa1-5bc16e1a2cee" (UID: "c00348f8-d733-4d03-8fa1-5bc16e1a2cee"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:04:14.194838 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:14.194686 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c00348f8-d733-4d03-8fa1-5bc16e1a2cee-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "c00348f8-d733-4d03-8fa1-5bc16e1a2cee" (UID: "c00348f8-d733-4d03-8fa1-5bc16e1a2cee"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:04:14.195203 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:14.195175 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c00348f8-d733-4d03-8fa1-5bc16e1a2cee-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c00348f8-d733-4d03-8fa1-5bc16e1a2cee" (UID: "c00348f8-d733-4d03-8fa1-5bc16e1a2cee"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:04:14.196485 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:14.196464 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c00348f8-d733-4d03-8fa1-5bc16e1a2cee-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c00348f8-d733-4d03-8fa1-5bc16e1a2cee" (UID: "c00348f8-d733-4d03-8fa1-5bc16e1a2cee"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 19:04:14.196570 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:14.196488 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c00348f8-d733-4d03-8fa1-5bc16e1a2cee-kube-api-access-mkbtf" (OuterVolumeSpecName: "kube-api-access-mkbtf") pod "c00348f8-d733-4d03-8fa1-5bc16e1a2cee" (UID: "c00348f8-d733-4d03-8fa1-5bc16e1a2cee"). InnerVolumeSpecName "kube-api-access-mkbtf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 19:04:14.295633 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:14.295558 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c00348f8-d733-4d03-8fa1-5bc16e1a2cee-tokenizer-tmp\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:04:14.295633 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:14.295590 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c00348f8-d733-4d03-8fa1-5bc16e1a2cee-kserve-provision-location\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:04:14.295633 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:14.295601 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mkbtf\" (UniqueName: \"kubernetes.io/projected/c00348f8-d733-4d03-8fa1-5bc16e1a2cee-kube-api-access-mkbtf\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:04:14.295633 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:14.295610 2581 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c00348f8-d733-4d03-8fa1-5bc16e1a2cee-tls-certs\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:04:14.295633 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:14.295619 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/c00348f8-d733-4d03-8fa1-5bc16e1a2cee-tokenizer-cache\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:04:14.295633 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:14.295627 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c00348f8-d733-4d03-8fa1-5bc16e1a2cee-tokenizer-uds\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:04:14.775528 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:14.775492 2581 generic.go:358] "Generic (PLEG): container finished" podID="c00348f8-d733-4d03-8fa1-5bc16e1a2cee" containerID="5b74a6ac4a96c90644e6d57779884831a213d731258064eabe29c898be36aedc" exitCode=0 Apr 23 19:04:14.775694 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:14.775558 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk" event={"ID":"c00348f8-d733-4d03-8fa1-5bc16e1a2cee","Type":"ContainerDied","Data":"5b74a6ac4a96c90644e6d57779884831a213d731258064eabe29c898be36aedc"} Apr 23 19:04:14.775694 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:14.775576 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk" Apr 23 19:04:14.775694 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:14.775606 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk" event={"ID":"c00348f8-d733-4d03-8fa1-5bc16e1a2cee","Type":"ContainerDied","Data":"b728ba3ecabe3b65710411dcfb5ed2db3896398c353e372d2f68b9f4a33a6494"} Apr 23 19:04:14.775694 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:14.775630 2581 scope.go:117] "RemoveContainer" containerID="5b74a6ac4a96c90644e6d57779884831a213d731258064eabe29c898be36aedc" Apr 23 19:04:14.784721 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:14.784704 2581 scope.go:117] "RemoveContainer" containerID="0c7795bfbcadb90dcf096ef7f95f29962cd76b7a25a0fccbec186ceef3d5ec5a" Apr 23 19:04:14.791794 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:14.791776 2581 scope.go:117] "RemoveContainer" containerID="d6b295218740f60e12caca64a777559a6a4f1e774f797f0432dd675f8d31383e" Apr 23 19:04:14.798853 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:14.798831 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk"] Apr 23 19:04:14.799073 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:14.798961 2581 scope.go:117] "RemoveContainer" containerID="5b74a6ac4a96c90644e6d57779884831a213d731258064eabe29c898be36aedc" Apr 23 19:04:14.799372 ip-10-0-143-218 kubenswrapper[2581]: E0423 19:04:14.799351 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b74a6ac4a96c90644e6d57779884831a213d731258064eabe29c898be36aedc\": container with ID starting with 5b74a6ac4a96c90644e6d57779884831a213d731258064eabe29c898be36aedc not found: ID does not exist" containerID="5b74a6ac4a96c90644e6d57779884831a213d731258064eabe29c898be36aedc" Apr 23 19:04:14.799472 
ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:14.799386 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b74a6ac4a96c90644e6d57779884831a213d731258064eabe29c898be36aedc"} err="failed to get container status \"5b74a6ac4a96c90644e6d57779884831a213d731258064eabe29c898be36aedc\": rpc error: code = NotFound desc = could not find container \"5b74a6ac4a96c90644e6d57779884831a213d731258064eabe29c898be36aedc\": container with ID starting with 5b74a6ac4a96c90644e6d57779884831a213d731258064eabe29c898be36aedc not found: ID does not exist" Apr 23 19:04:14.799472 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:14.799410 2581 scope.go:117] "RemoveContainer" containerID="0c7795bfbcadb90dcf096ef7f95f29962cd76b7a25a0fccbec186ceef3d5ec5a" Apr 23 19:04:14.799700 ip-10-0-143-218 kubenswrapper[2581]: E0423 19:04:14.799672 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c7795bfbcadb90dcf096ef7f95f29962cd76b7a25a0fccbec186ceef3d5ec5a\": container with ID starting with 0c7795bfbcadb90dcf096ef7f95f29962cd76b7a25a0fccbec186ceef3d5ec5a not found: ID does not exist" containerID="0c7795bfbcadb90dcf096ef7f95f29962cd76b7a25a0fccbec186ceef3d5ec5a" Apr 23 19:04:14.799745 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:14.799710 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c7795bfbcadb90dcf096ef7f95f29962cd76b7a25a0fccbec186ceef3d5ec5a"} err="failed to get container status \"0c7795bfbcadb90dcf096ef7f95f29962cd76b7a25a0fccbec186ceef3d5ec5a\": rpc error: code = NotFound desc = could not find container \"0c7795bfbcadb90dcf096ef7f95f29962cd76b7a25a0fccbec186ceef3d5ec5a\": container with ID starting with 0c7795bfbcadb90dcf096ef7f95f29962cd76b7a25a0fccbec186ceef3d5ec5a not found: ID does not exist" Apr 23 19:04:14.799781 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:14.799748 2581 scope.go:117] 
"RemoveContainer" containerID="d6b295218740f60e12caca64a777559a6a4f1e774f797f0432dd675f8d31383e" Apr 23 19:04:14.800020 ip-10-0-143-218 kubenswrapper[2581]: E0423 19:04:14.799975 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6b295218740f60e12caca64a777559a6a4f1e774f797f0432dd675f8d31383e\": container with ID starting with d6b295218740f60e12caca64a777559a6a4f1e774f797f0432dd675f8d31383e not found: ID does not exist" containerID="d6b295218740f60e12caca64a777559a6a4f1e774f797f0432dd675f8d31383e" Apr 23 19:04:14.800110 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:14.800031 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6b295218740f60e12caca64a777559a6a4f1e774f797f0432dd675f8d31383e"} err="failed to get container status \"d6b295218740f60e12caca64a777559a6a4f1e774f797f0432dd675f8d31383e\": rpc error: code = NotFound desc = could not find container \"d6b295218740f60e12caca64a777559a6a4f1e774f797f0432dd675f8d31383e\": container with ID starting with d6b295218740f60e12caca64a777559a6a4f1e774f797f0432dd675f8d31383e not found: ID does not exist" Apr 23 19:04:14.802633 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:14.802614 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-fs6kk"] Apr 23 19:04:15.790073 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:15.790032 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c00348f8-d733-4d03-8fa1-5bc16e1a2cee" path="/var/lib/kubelet/pods/c00348f8-d733-4d03-8fa1-5bc16e1a2cee/volumes" Apr 23 19:04:46.392033 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:46.391997 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs"] Apr 23 19:04:46.392502 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:46.392350 2581 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c00348f8-d733-4d03-8fa1-5bc16e1a2cee" containerName="main" Apr 23 19:04:46.392502 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:46.392362 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="c00348f8-d733-4d03-8fa1-5bc16e1a2cee" containerName="main" Apr 23 19:04:46.392502 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:46.392377 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c00348f8-d733-4d03-8fa1-5bc16e1a2cee" containerName="storage-initializer" Apr 23 19:04:46.392502 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:46.392385 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="c00348f8-d733-4d03-8fa1-5bc16e1a2cee" containerName="storage-initializer" Apr 23 19:04:46.392502 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:46.392400 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c00348f8-d733-4d03-8fa1-5bc16e1a2cee" containerName="tokenizer" Apr 23 19:04:46.392502 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:46.392406 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="c00348f8-d733-4d03-8fa1-5bc16e1a2cee" containerName="tokenizer" Apr 23 19:04:46.392502 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:46.392459 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="c00348f8-d733-4d03-8fa1-5bc16e1a2cee" containerName="tokenizer" Apr 23 19:04:46.392502 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:46.392468 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="c00348f8-d733-4d03-8fa1-5bc16e1a2cee" containerName="main" Apr 23 19:04:46.395703 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:46.395679 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs" Apr 23 19:04:46.398819 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:46.398795 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 23 19:04:46.398943 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:46.398829 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-78dl4\"" Apr 23 19:04:46.400053 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:46.400031 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-tf66j\"" Apr 23 19:04:46.400142 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:46.400060 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 19:04:46.400142 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:46.400088 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 19:04:46.412617 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:46.412596 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs"] Apr 23 19:04:46.473819 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:46.473779 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/128bf23c-8ee4-49e3-90d8-8d30f3a3da3f-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-d547597f-zmfgs\" (UID: \"128bf23c-8ee4-49e3-90d8-8d30f3a3da3f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs" Apr 23 19:04:46.474005 ip-10-0-143-218 kubenswrapper[2581]: 
I0423 19:04:46.473834 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/128bf23c-8ee4-49e3-90d8-8d30f3a3da3f-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-d547597f-zmfgs\" (UID: \"128bf23c-8ee4-49e3-90d8-8d30f3a3da3f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs" Apr 23 19:04:46.474005 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:46.473861 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/128bf23c-8ee4-49e3-90d8-8d30f3a3da3f-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-d547597f-zmfgs\" (UID: \"128bf23c-8ee4-49e3-90d8-8d30f3a3da3f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs" Apr 23 19:04:46.474152 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:46.474000 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/128bf23c-8ee4-49e3-90d8-8d30f3a3da3f-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-d547597f-zmfgs\" (UID: \"128bf23c-8ee4-49e3-90d8-8d30f3a3da3f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs" Apr 23 19:04:46.474152 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:46.474061 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/128bf23c-8ee4-49e3-90d8-8d30f3a3da3f-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-d547597f-zmfgs\" (UID: \"128bf23c-8ee4-49e3-90d8-8d30f3a3da3f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs" Apr 23 19:04:46.474152 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:46.474111 2581 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trgn4\" (UniqueName: \"kubernetes.io/projected/128bf23c-8ee4-49e3-90d8-8d30f3a3da3f-kube-api-access-trgn4\") pod \"stop-feature-test-kserve-router-scheduler-d547597f-zmfgs\" (UID: \"128bf23c-8ee4-49e3-90d8-8d30f3a3da3f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs" Apr 23 19:04:46.575161 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:46.575121 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/128bf23c-8ee4-49e3-90d8-8d30f3a3da3f-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-d547597f-zmfgs\" (UID: \"128bf23c-8ee4-49e3-90d8-8d30f3a3da3f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs" Apr 23 19:04:46.575354 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:46.575188 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/128bf23c-8ee4-49e3-90d8-8d30f3a3da3f-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-d547597f-zmfgs\" (UID: \"128bf23c-8ee4-49e3-90d8-8d30f3a3da3f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs" Apr 23 19:04:46.575354 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:46.575234 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-trgn4\" (UniqueName: \"kubernetes.io/projected/128bf23c-8ee4-49e3-90d8-8d30f3a3da3f-kube-api-access-trgn4\") pod \"stop-feature-test-kserve-router-scheduler-d547597f-zmfgs\" (UID: \"128bf23c-8ee4-49e3-90d8-8d30f3a3da3f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs" Apr 23 19:04:46.575354 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:46.575301 2581 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/128bf23c-8ee4-49e3-90d8-8d30f3a3da3f-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-d547597f-zmfgs\" (UID: \"128bf23c-8ee4-49e3-90d8-8d30f3a3da3f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs" Apr 23 19:04:46.575354 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:46.575323 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/128bf23c-8ee4-49e3-90d8-8d30f3a3da3f-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-d547597f-zmfgs\" (UID: \"128bf23c-8ee4-49e3-90d8-8d30f3a3da3f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs" Apr 23 19:04:46.575354 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:46.575344 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/128bf23c-8ee4-49e3-90d8-8d30f3a3da3f-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-d547597f-zmfgs\" (UID: \"128bf23c-8ee4-49e3-90d8-8d30f3a3da3f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs" Apr 23 19:04:46.575661 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:46.575631 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/128bf23c-8ee4-49e3-90d8-8d30f3a3da3f-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-d547597f-zmfgs\" (UID: \"128bf23c-8ee4-49e3-90d8-8d30f3a3da3f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs" Apr 23 19:04:46.575728 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:46.575666 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/128bf23c-8ee4-49e3-90d8-8d30f3a3da3f-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-d547597f-zmfgs\" (UID: \"128bf23c-8ee4-49e3-90d8-8d30f3a3da3f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs" Apr 23 19:04:46.575728 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:46.575715 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/128bf23c-8ee4-49e3-90d8-8d30f3a3da3f-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-d547597f-zmfgs\" (UID: \"128bf23c-8ee4-49e3-90d8-8d30f3a3da3f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs" Apr 23 19:04:46.575812 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:46.575758 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/128bf23c-8ee4-49e3-90d8-8d30f3a3da3f-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-d547597f-zmfgs\" (UID: \"128bf23c-8ee4-49e3-90d8-8d30f3a3da3f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs" Apr 23 19:04:46.578069 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:46.578040 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/128bf23c-8ee4-49e3-90d8-8d30f3a3da3f-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-d547597f-zmfgs\" (UID: \"128bf23c-8ee4-49e3-90d8-8d30f3a3da3f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs" Apr 23 19:04:46.584649 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:46.584625 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-trgn4\" (UniqueName: \"kubernetes.io/projected/128bf23c-8ee4-49e3-90d8-8d30f3a3da3f-kube-api-access-trgn4\") pod 
\"stop-feature-test-kserve-router-scheduler-d547597f-zmfgs\" (UID: \"128bf23c-8ee4-49e3-90d8-8d30f3a3da3f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs" Apr 23 19:04:46.706664 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:46.706578 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs" Apr 23 19:04:46.846198 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:46.846105 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs"] Apr 23 19:04:46.848926 ip-10-0-143-218 kubenswrapper[2581]: W0423 19:04:46.848898 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod128bf23c_8ee4_49e3_90d8_8d30f3a3da3f.slice/crio-23b0e128402d9aa166ceaf81de7324c932905bec2f8ee6bc4912aacd69d84897 WatchSource:0}: Error finding container 23b0e128402d9aa166ceaf81de7324c932905bec2f8ee6bc4912aacd69d84897: Status 404 returned error can't find the container with id 23b0e128402d9aa166ceaf81de7324c932905bec2f8ee6bc4912aacd69d84897 Apr 23 19:04:46.878524 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:46.878493 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs" event={"ID":"128bf23c-8ee4-49e3-90d8-8d30f3a3da3f","Type":"ContainerStarted","Data":"23b0e128402d9aa166ceaf81de7324c932905bec2f8ee6bc4912aacd69d84897"} Apr 23 19:04:47.882554 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:47.882472 2581 generic.go:358] "Generic (PLEG): container finished" podID="128bf23c-8ee4-49e3-90d8-8d30f3a3da3f" containerID="b6d8757b5e5706eb74f685e9432cb7bf4ac871c2d2e0104e97e3447f6313fee1" exitCode=0 Apr 23 19:04:47.882906 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:47.882566 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs" event={"ID":"128bf23c-8ee4-49e3-90d8-8d30f3a3da3f","Type":"ContainerDied","Data":"b6d8757b5e5706eb74f685e9432cb7bf4ac871c2d2e0104e97e3447f6313fee1"} Apr 23 19:04:48.888230 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:48.888194 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs" event={"ID":"128bf23c-8ee4-49e3-90d8-8d30f3a3da3f","Type":"ContainerStarted","Data":"c81abc8b6c89a9ad1c35f45238e9cd1a3c0beac2881913fcfd5c4035c4bf344d"} Apr 23 19:04:48.888230 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:48.888230 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs" event={"ID":"128bf23c-8ee4-49e3-90d8-8d30f3a3da3f","Type":"ContainerStarted","Data":"92adb9595033ab5f1701a1b62c4591c6a699159f837fa3a956c3ab4d44982fe7"} Apr 23 19:04:48.888742 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:48.888316 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs" Apr 23 19:04:48.924025 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:48.923958 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs" podStartSLOduration=2.923940683 podStartE2EDuration="2.923940683s" podCreationTimestamp="2026-04-23 19:04:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 19:04:48.91902565 +0000 UTC m=+5007.668467451" watchObservedRunningTime="2026-04-23 19:04:48.923940683 +0000 UTC m=+5007.673382483" Apr 23 19:04:56.706891 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:56.706852 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs" Apr 23 19:04:56.707409 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:56.707029 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs" Apr 23 19:04:56.709795 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:56.709774 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs" Apr 23 19:04:56.915882 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:04:56.915845 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs" Apr 23 19:05:18.922376 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:05:18.922296 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs" Apr 23 19:07:09.165370 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:09.165338 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs"] Apr 23 19:07:09.165829 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:09.165663 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs" podUID="128bf23c-8ee4-49e3-90d8-8d30f3a3da3f" containerName="main" containerID="cri-o://92adb9595033ab5f1701a1b62c4591c6a699159f837fa3a956c3ab4d44982fe7" gracePeriod=30 Apr 23 19:07:09.165829 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:09.165707 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs" podUID="128bf23c-8ee4-49e3-90d8-8d30f3a3da3f" containerName="tokenizer" 
containerID="cri-o://c81abc8b6c89a9ad1c35f45238e9cd1a3c0beac2881913fcfd5c4035c4bf344d" gracePeriod=30 Apr 23 19:07:09.348569 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:09.348534 2581 generic.go:358] "Generic (PLEG): container finished" podID="128bf23c-8ee4-49e3-90d8-8d30f3a3da3f" containerID="92adb9595033ab5f1701a1b62c4591c6a699159f837fa3a956c3ab4d44982fe7" exitCode=0 Apr 23 19:07:09.348735 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:09.348578 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs" event={"ID":"128bf23c-8ee4-49e3-90d8-8d30f3a3da3f","Type":"ContainerDied","Data":"92adb9595033ab5f1701a1b62c4591c6a699159f837fa3a956c3ab4d44982fe7"} Apr 23 19:07:10.354430 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:10.354389 2581 generic.go:358] "Generic (PLEG): container finished" podID="128bf23c-8ee4-49e3-90d8-8d30f3a3da3f" containerID="c81abc8b6c89a9ad1c35f45238e9cd1a3c0beac2881913fcfd5c4035c4bf344d" exitCode=0 Apr 23 19:07:10.354751 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:10.354467 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs" event={"ID":"128bf23c-8ee4-49e3-90d8-8d30f3a3da3f","Type":"ContainerDied","Data":"c81abc8b6c89a9ad1c35f45238e9cd1a3c0beac2881913fcfd5c4035c4bf344d"} Apr 23 19:07:10.402632 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:10.402611 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs" Apr 23 19:07:10.496707 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:10.496625 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/128bf23c-8ee4-49e3-90d8-8d30f3a3da3f-tokenizer-uds\") pod \"128bf23c-8ee4-49e3-90d8-8d30f3a3da3f\" (UID: \"128bf23c-8ee4-49e3-90d8-8d30f3a3da3f\") " Apr 23 19:07:10.496707 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:10.496667 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/128bf23c-8ee4-49e3-90d8-8d30f3a3da3f-tls-certs\") pod \"128bf23c-8ee4-49e3-90d8-8d30f3a3da3f\" (UID: \"128bf23c-8ee4-49e3-90d8-8d30f3a3da3f\") " Apr 23 19:07:10.496891 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:10.496729 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/128bf23c-8ee4-49e3-90d8-8d30f3a3da3f-tokenizer-cache\") pod \"128bf23c-8ee4-49e3-90d8-8d30f3a3da3f\" (UID: \"128bf23c-8ee4-49e3-90d8-8d30f3a3da3f\") " Apr 23 19:07:10.496891 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:10.496774 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/128bf23c-8ee4-49e3-90d8-8d30f3a3da3f-tokenizer-tmp\") pod \"128bf23c-8ee4-49e3-90d8-8d30f3a3da3f\" (UID: \"128bf23c-8ee4-49e3-90d8-8d30f3a3da3f\") " Apr 23 19:07:10.496960 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:10.496906 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trgn4\" (UniqueName: \"kubernetes.io/projected/128bf23c-8ee4-49e3-90d8-8d30f3a3da3f-kube-api-access-trgn4\") pod \"128bf23c-8ee4-49e3-90d8-8d30f3a3da3f\" (UID: \"128bf23c-8ee4-49e3-90d8-8d30f3a3da3f\") " Apr 23 19:07:10.496960 
ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:10.496949 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/128bf23c-8ee4-49e3-90d8-8d30f3a3da3f-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "128bf23c-8ee4-49e3-90d8-8d30f3a3da3f" (UID: "128bf23c-8ee4-49e3-90d8-8d30f3a3da3f"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:07:10.497067 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:10.496954 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/128bf23c-8ee4-49e3-90d8-8d30f3a3da3f-kserve-provision-location\") pod \"128bf23c-8ee4-49e3-90d8-8d30f3a3da3f\" (UID: \"128bf23c-8ee4-49e3-90d8-8d30f3a3da3f\") " Apr 23 19:07:10.497067 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:10.497034 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/128bf23c-8ee4-49e3-90d8-8d30f3a3da3f-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "128bf23c-8ee4-49e3-90d8-8d30f3a3da3f" (UID: "128bf23c-8ee4-49e3-90d8-8d30f3a3da3f"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:07:10.497169 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:10.497101 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/128bf23c-8ee4-49e3-90d8-8d30f3a3da3f-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "128bf23c-8ee4-49e3-90d8-8d30f3a3da3f" (UID: "128bf23c-8ee4-49e3-90d8-8d30f3a3da3f"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:07:10.497282 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:10.497261 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/128bf23c-8ee4-49e3-90d8-8d30f3a3da3f-tokenizer-tmp\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:07:10.497282 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:10.497285 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/128bf23c-8ee4-49e3-90d8-8d30f3a3da3f-tokenizer-uds\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:07:10.497493 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:10.497298 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/128bf23c-8ee4-49e3-90d8-8d30f3a3da3f-tokenizer-cache\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:07:10.497679 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:10.497656 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/128bf23c-8ee4-49e3-90d8-8d30f3a3da3f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "128bf23c-8ee4-49e3-90d8-8d30f3a3da3f" (UID: "128bf23c-8ee4-49e3-90d8-8d30f3a3da3f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:07:10.499182 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:10.499160 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/128bf23c-8ee4-49e3-90d8-8d30f3a3da3f-kube-api-access-trgn4" (OuterVolumeSpecName: "kube-api-access-trgn4") pod "128bf23c-8ee4-49e3-90d8-8d30f3a3da3f" (UID: "128bf23c-8ee4-49e3-90d8-8d30f3a3da3f"). InnerVolumeSpecName "kube-api-access-trgn4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 19:07:10.499268 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:10.499165 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/128bf23c-8ee4-49e3-90d8-8d30f3a3da3f-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "128bf23c-8ee4-49e3-90d8-8d30f3a3da3f" (UID: "128bf23c-8ee4-49e3-90d8-8d30f3a3da3f"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 19:07:10.598399 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:10.598356 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-trgn4\" (UniqueName: \"kubernetes.io/projected/128bf23c-8ee4-49e3-90d8-8d30f3a3da3f-kube-api-access-trgn4\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:07:10.598399 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:10.598393 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/128bf23c-8ee4-49e3-90d8-8d30f3a3da3f-kserve-provision-location\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:07:10.598399 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:10.598405 2581 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/128bf23c-8ee4-49e3-90d8-8d30f3a3da3f-tls-certs\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:07:11.359644 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:11.359608 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs" event={"ID":"128bf23c-8ee4-49e3-90d8-8d30f3a3da3f","Type":"ContainerDied","Data":"23b0e128402d9aa166ceaf81de7324c932905bec2f8ee6bc4912aacd69d84897"} Apr 23 19:07:11.360107 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:11.359657 2581 scope.go:117] "RemoveContainer" 
containerID="c81abc8b6c89a9ad1c35f45238e9cd1a3c0beac2881913fcfd5c4035c4bf344d" Apr 23 19:07:11.360107 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:11.359673 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs" Apr 23 19:07:11.370353 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:11.370334 2581 scope.go:117] "RemoveContainer" containerID="92adb9595033ab5f1701a1b62c4591c6a699159f837fa3a956c3ab4d44982fe7" Apr 23 19:07:11.377678 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:11.377658 2581 scope.go:117] "RemoveContainer" containerID="b6d8757b5e5706eb74f685e9432cb7bf4ac871c2d2e0104e97e3447f6313fee1" Apr 23 19:07:11.389617 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:11.389587 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs"] Apr 23 19:07:11.394761 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:11.394740 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d547597f-zmfgs"] Apr 23 19:07:11.761063 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:11.761028 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-857fbd48cc-6tbhz"] Apr 23 19:07:11.761363 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:11.761352 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="128bf23c-8ee4-49e3-90d8-8d30f3a3da3f" containerName="tokenizer" Apr 23 19:07:11.761416 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:11.761365 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="128bf23c-8ee4-49e3-90d8-8d30f3a3da3f" containerName="tokenizer" Apr 23 19:07:11.761416 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:11.761401 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="128bf23c-8ee4-49e3-90d8-8d30f3a3da3f" 
containerName="main" Apr 23 19:07:11.761416 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:11.761407 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="128bf23c-8ee4-49e3-90d8-8d30f3a3da3f" containerName="main" Apr 23 19:07:11.761416 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:11.761415 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="128bf23c-8ee4-49e3-90d8-8d30f3a3da3f" containerName="storage-initializer" Apr 23 19:07:11.761564 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:11.761421 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="128bf23c-8ee4-49e3-90d8-8d30f3a3da3f" containerName="storage-initializer" Apr 23 19:07:11.761564 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:11.761477 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="128bf23c-8ee4-49e3-90d8-8d30f3a3da3f" containerName="main" Apr 23 19:07:11.761564 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:11.761484 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="128bf23c-8ee4-49e3-90d8-8d30f3a3da3f" containerName="tokenizer" Apr 23 19:07:11.764204 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:11.764183 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-857fbd48cc-6tbhz" Apr 23 19:07:11.776607 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:11.776585 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-857fbd48cc-6tbhz"] Apr 23 19:07:11.791008 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:11.790966 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="128bf23c-8ee4-49e3-90d8-8d30f3a3da3f" path="/var/lib/kubelet/pods/128bf23c-8ee4-49e3-90d8-8d30f3a3da3f/volumes" Apr 23 19:07:11.907886 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:11.907851 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx7sr\" (UniqueName: \"kubernetes.io/projected/e83bff5e-9e58-46e3-9cfb-1489a575966f-kube-api-access-lx7sr\") pod \"llmisvc-controller-manager-857fbd48cc-6tbhz\" (UID: \"e83bff5e-9e58-46e3-9cfb-1489a575966f\") " pod="kserve/llmisvc-controller-manager-857fbd48cc-6tbhz" Apr 23 19:07:11.908087 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:11.907927 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e83bff5e-9e58-46e3-9cfb-1489a575966f-cert\") pod \"llmisvc-controller-manager-857fbd48cc-6tbhz\" (UID: \"e83bff5e-9e58-46e3-9cfb-1489a575966f\") " pod="kserve/llmisvc-controller-manager-857fbd48cc-6tbhz" Apr 23 19:07:12.008354 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:12.008319 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e83bff5e-9e58-46e3-9cfb-1489a575966f-cert\") pod \"llmisvc-controller-manager-857fbd48cc-6tbhz\" (UID: \"e83bff5e-9e58-46e3-9cfb-1489a575966f\") " pod="kserve/llmisvc-controller-manager-857fbd48cc-6tbhz" Apr 23 19:07:12.008494 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:12.008360 2581 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lx7sr\" (UniqueName: \"kubernetes.io/projected/e83bff5e-9e58-46e3-9cfb-1489a575966f-kube-api-access-lx7sr\") pod \"llmisvc-controller-manager-857fbd48cc-6tbhz\" (UID: \"e83bff5e-9e58-46e3-9cfb-1489a575966f\") " pod="kserve/llmisvc-controller-manager-857fbd48cc-6tbhz" Apr 23 19:07:12.010755 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:12.010737 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e83bff5e-9e58-46e3-9cfb-1489a575966f-cert\") pod \"llmisvc-controller-manager-857fbd48cc-6tbhz\" (UID: \"e83bff5e-9e58-46e3-9cfb-1489a575966f\") " pod="kserve/llmisvc-controller-manager-857fbd48cc-6tbhz" Apr 23 19:07:12.017520 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:12.017462 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx7sr\" (UniqueName: \"kubernetes.io/projected/e83bff5e-9e58-46e3-9cfb-1489a575966f-kube-api-access-lx7sr\") pod \"llmisvc-controller-manager-857fbd48cc-6tbhz\" (UID: \"e83bff5e-9e58-46e3-9cfb-1489a575966f\") " pod="kserve/llmisvc-controller-manager-857fbd48cc-6tbhz" Apr 23 19:07:12.074252 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:12.074224 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-857fbd48cc-6tbhz" Apr 23 19:07:12.198017 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:12.197963 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-857fbd48cc-6tbhz"] Apr 23 19:07:12.201518 ip-10-0-143-218 kubenswrapper[2581]: W0423 19:07:12.201482 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode83bff5e_9e58_46e3_9cfb_1489a575966f.slice/crio-ba5eec72d3559161aeed261ae2735bf2c52e4b357f5cd8ea3e8a1417cfd4149f WatchSource:0}: Error finding container ba5eec72d3559161aeed261ae2735bf2c52e4b357f5cd8ea3e8a1417cfd4149f: Status 404 returned error can't find the container with id ba5eec72d3559161aeed261ae2735bf2c52e4b357f5cd8ea3e8a1417cfd4149f Apr 23 19:07:12.203044 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:12.203027 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 19:07:12.364775 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:12.364683 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-857fbd48cc-6tbhz" event={"ID":"e83bff5e-9e58-46e3-9cfb-1489a575966f","Type":"ContainerStarted","Data":"ba5eec72d3559161aeed261ae2735bf2c52e4b357f5cd8ea3e8a1417cfd4149f"} Apr 23 19:07:13.369530 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:13.369495 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-857fbd48cc-6tbhz" event={"ID":"e83bff5e-9e58-46e3-9cfb-1489a575966f","Type":"ContainerStarted","Data":"48de4b73863bdc0e979895ef818e3389358ee4a4b85f20996a59e077882e9aab"} Apr 23 19:07:13.369903 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:13.369609 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-857fbd48cc-6tbhz" Apr 23 19:07:13.389105 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:13.389049 2581 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-857fbd48cc-6tbhz" podStartSLOduration=1.914142121 podStartE2EDuration="2.389034369s" podCreationTimestamp="2026-04-23 19:07:11 +0000 UTC" firstStartedPulling="2026-04-23 19:07:12.20315213 +0000 UTC m=+5150.952593909" lastFinishedPulling="2026-04-23 19:07:12.678044368 +0000 UTC m=+5151.427486157" observedRunningTime="2026-04-23 19:07:13.388349335 +0000 UTC m=+5152.137791139" watchObservedRunningTime="2026-04-23 19:07:13.389034369 +0000 UTC m=+5152.138476170" Apr 23 19:07:44.375088 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:44.375057 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-857fbd48cc-6tbhz" Apr 23 19:07:44.427828 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:44.427796 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-56c5fd6bcc-x6vxb"] Apr 23 19:07:44.428091 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:44.428070 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/llmisvc-controller-manager-56c5fd6bcc-x6vxb" podUID="054b45bb-dfb8-45b7-a704-26b96dd9c2fe" containerName="manager" containerID="cri-o://add0226a0538bea08fbb1133f1093025a60ac2817ebaf4633f99c93482a259b7" gracePeriod=30 Apr 23 19:07:44.662686 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:44.662666 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-56c5fd6bcc-x6vxb" Apr 23 19:07:44.791818 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:44.791782 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/054b45bb-dfb8-45b7-a704-26b96dd9c2fe-cert\") pod \"054b45bb-dfb8-45b7-a704-26b96dd9c2fe\" (UID: \"054b45bb-dfb8-45b7-a704-26b96dd9c2fe\") " Apr 23 19:07:44.792037 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:44.791862 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pr9g\" (UniqueName: \"kubernetes.io/projected/054b45bb-dfb8-45b7-a704-26b96dd9c2fe-kube-api-access-5pr9g\") pod \"054b45bb-dfb8-45b7-a704-26b96dd9c2fe\" (UID: \"054b45bb-dfb8-45b7-a704-26b96dd9c2fe\") " Apr 23 19:07:44.793929 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:44.793899 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/054b45bb-dfb8-45b7-a704-26b96dd9c2fe-cert" (OuterVolumeSpecName: "cert") pod "054b45bb-dfb8-45b7-a704-26b96dd9c2fe" (UID: "054b45bb-dfb8-45b7-a704-26b96dd9c2fe"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 19:07:44.794055 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:44.793917 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/054b45bb-dfb8-45b7-a704-26b96dd9c2fe-kube-api-access-5pr9g" (OuterVolumeSpecName: "kube-api-access-5pr9g") pod "054b45bb-dfb8-45b7-a704-26b96dd9c2fe" (UID: "054b45bb-dfb8-45b7-a704-26b96dd9c2fe"). InnerVolumeSpecName "kube-api-access-5pr9g". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 19:07:44.893386 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:44.893350 2581 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/054b45bb-dfb8-45b7-a704-26b96dd9c2fe-cert\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:07:44.893386 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:44.893383 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5pr9g\" (UniqueName: \"kubernetes.io/projected/054b45bb-dfb8-45b7-a704-26b96dd9c2fe-kube-api-access-5pr9g\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:07:45.477970 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:45.477933 2581 generic.go:358] "Generic (PLEG): container finished" podID="054b45bb-dfb8-45b7-a704-26b96dd9c2fe" containerID="add0226a0538bea08fbb1133f1093025a60ac2817ebaf4633f99c93482a259b7" exitCode=0 Apr 23 19:07:45.478396 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:45.478017 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-56c5fd6bcc-x6vxb" Apr 23 19:07:45.478396 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:45.478017 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-56c5fd6bcc-x6vxb" event={"ID":"054b45bb-dfb8-45b7-a704-26b96dd9c2fe","Type":"ContainerDied","Data":"add0226a0538bea08fbb1133f1093025a60ac2817ebaf4633f99c93482a259b7"} Apr 23 19:07:45.478396 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:45.478062 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-56c5fd6bcc-x6vxb" event={"ID":"054b45bb-dfb8-45b7-a704-26b96dd9c2fe","Type":"ContainerDied","Data":"fae4fe523d6acc9e46096f5f59355d9b65176f223f8e476d4e69dd0c17d09007"} Apr 23 19:07:45.478396 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:45.478082 2581 scope.go:117] "RemoveContainer" containerID="add0226a0538bea08fbb1133f1093025a60ac2817ebaf4633f99c93482a259b7" Apr 23 19:07:45.486533 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:45.486518 2581 scope.go:117] "RemoveContainer" containerID="add0226a0538bea08fbb1133f1093025a60ac2817ebaf4633f99c93482a259b7" Apr 23 19:07:45.486780 ip-10-0-143-218 kubenswrapper[2581]: E0423 19:07:45.486762 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"add0226a0538bea08fbb1133f1093025a60ac2817ebaf4633f99c93482a259b7\": container with ID starting with add0226a0538bea08fbb1133f1093025a60ac2817ebaf4633f99c93482a259b7 not found: ID does not exist" containerID="add0226a0538bea08fbb1133f1093025a60ac2817ebaf4633f99c93482a259b7" Apr 23 19:07:45.486829 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:45.486790 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"add0226a0538bea08fbb1133f1093025a60ac2817ebaf4633f99c93482a259b7"} err="failed to get container status 
\"add0226a0538bea08fbb1133f1093025a60ac2817ebaf4633f99c93482a259b7\": rpc error: code = NotFound desc = could not find container \"add0226a0538bea08fbb1133f1093025a60ac2817ebaf4633f99c93482a259b7\": container with ID starting with add0226a0538bea08fbb1133f1093025a60ac2817ebaf4633f99c93482a259b7 not found: ID does not exist" Apr 23 19:07:45.500306 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:45.500283 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-56c5fd6bcc-x6vxb"] Apr 23 19:07:45.504035 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:45.504014 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/llmisvc-controller-manager-56c5fd6bcc-x6vxb"] Apr 23 19:07:45.791447 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:07:45.791367 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="054b45bb-dfb8-45b7-a704-26b96dd9c2fe" path="/var/lib/kubelet/pods/054b45bb-dfb8-45b7-a704-26b96dd9c2fe/volumes" Apr 23 19:09:52.659807 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:09:52.659719 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb"] Apr 23 19:09:52.660264 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:09:52.660191 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="054b45bb-dfb8-45b7-a704-26b96dd9c2fe" containerName="manager" Apr 23 19:09:52.660264 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:09:52.660207 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="054b45bb-dfb8-45b7-a704-26b96dd9c2fe" containerName="manager" Apr 23 19:09:52.660338 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:09:52.660307 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="054b45bb-dfb8-45b7-a704-26b96dd9c2fe" containerName="manager" Apr 23 19:09:52.663153 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:09:52.663133 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb" Apr 23 19:09:52.666380 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:09:52.666340 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 19:09:52.666480 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:09:52.666465 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-epp-sa-dockercfg-v6p42\"" Apr 23 19:09:52.667810 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:09:52.667788 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-78dl4\"" Apr 23 19:09:52.667925 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:09:52.667889 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 19:09:52.668044 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:09:52.668022 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 23 19:09:52.679537 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:09:52.679511 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb"] Apr 23 19:09:52.712774 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:09:52.712740 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d069536b-aaa6-418b-800f-657d37161d09-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb\" (UID: \"d069536b-aaa6-418b-800f-657d37161d09\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb" Apr 23 19:09:52.712774 ip-10-0-143-218 
kubenswrapper[2581]: I0423 19:09:52.712778 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d069536b-aaa6-418b-800f-657d37161d09-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb\" (UID: \"d069536b-aaa6-418b-800f-657d37161d09\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb" Apr 23 19:09:52.713016 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:09:52.712800 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kghn\" (UniqueName: \"kubernetes.io/projected/d069536b-aaa6-418b-800f-657d37161d09-kube-api-access-2kghn\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb\" (UID: \"d069536b-aaa6-418b-800f-657d37161d09\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb" Apr 23 19:09:52.713016 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:09:52.712874 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d069536b-aaa6-418b-800f-657d37161d09-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb\" (UID: \"d069536b-aaa6-418b-800f-657d37161d09\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb" Apr 23 19:09:52.713016 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:09:52.712934 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d069536b-aaa6-418b-800f-657d37161d09-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb\" (UID: \"d069536b-aaa6-418b-800f-657d37161d09\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb" Apr 23 19:09:52.713016 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:09:52.712972 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d069536b-aaa6-418b-800f-657d37161d09-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb\" (UID: \"d069536b-aaa6-418b-800f-657d37161d09\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb" Apr 23 19:09:52.813662 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:09:52.813626 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d069536b-aaa6-418b-800f-657d37161d09-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb\" (UID: \"d069536b-aaa6-418b-800f-657d37161d09\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb" Apr 23 19:09:52.813662 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:09:52.813662 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d069536b-aaa6-418b-800f-657d37161d09-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb\" (UID: \"d069536b-aaa6-418b-800f-657d37161d09\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb" Apr 23 19:09:52.813898 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:09:52.813687 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2kghn\" (UniqueName: \"kubernetes.io/projected/d069536b-aaa6-418b-800f-657d37161d09-kube-api-access-2kghn\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb\" (UID: \"d069536b-aaa6-418b-800f-657d37161d09\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb" Apr 23 19:09:52.813898 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:09:52.813711 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d069536b-aaa6-418b-800f-657d37161d09-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb\" (UID: \"d069536b-aaa6-418b-800f-657d37161d09\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb" Apr 23 19:09:52.813898 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:09:52.813760 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d069536b-aaa6-418b-800f-657d37161d09-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb\" (UID: \"d069536b-aaa6-418b-800f-657d37161d09\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb" Apr 23 19:09:52.813898 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:09:52.813793 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d069536b-aaa6-418b-800f-657d37161d09-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb\" (UID: \"d069536b-aaa6-418b-800f-657d37161d09\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb" Apr 23 19:09:52.814154 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:09:52.814133 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d069536b-aaa6-418b-800f-657d37161d09-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb\" (UID: \"d069536b-aaa6-418b-800f-657d37161d09\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb" Apr 23 19:09:52.814214 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:09:52.814167 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d069536b-aaa6-418b-800f-657d37161d09-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb\" (UID: \"d069536b-aaa6-418b-800f-657d37161d09\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb" Apr 23 19:09:52.814214 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:09:52.814192 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d069536b-aaa6-418b-800f-657d37161d09-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb\" (UID: \"d069536b-aaa6-418b-800f-657d37161d09\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb" Apr 23 19:09:52.814288 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:09:52.814224 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d069536b-aaa6-418b-800f-657d37161d09-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb\" (UID: \"d069536b-aaa6-418b-800f-657d37161d09\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb" Apr 23 19:09:52.816396 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:09:52.816374 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d069536b-aaa6-418b-800f-657d37161d09-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb\" (UID: \"d069536b-aaa6-418b-800f-657d37161d09\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb" Apr 23 19:09:52.824070 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:09:52.824042 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kghn\" (UniqueName: \"kubernetes.io/projected/d069536b-aaa6-418b-800f-657d37161d09-kube-api-access-2kghn\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb\" (UID: \"d069536b-aaa6-418b-800f-657d37161d09\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb" Apr 23 19:09:52.972912 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:09:52.972827 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb" Apr 23 19:09:53.102213 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:09:53.102187 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb"] Apr 23 19:09:53.104260 ip-10-0-143-218 kubenswrapper[2581]: W0423 19:09:53.104222 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd069536b_aaa6_418b_800f_657d37161d09.slice/crio-5958168d2bd23b9476db44b6301ff706a8ec89e28faf6df0c83eebe47354740b WatchSource:0}: Error finding container 5958168d2bd23b9476db44b6301ff706a8ec89e28faf6df0c83eebe47354740b: Status 404 returned error can't find the container with id 5958168d2bd23b9476db44b6301ff706a8ec89e28faf6df0c83eebe47354740b Apr 23 19:09:53.880882 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:09:53.880843 2581 generic.go:358] "Generic (PLEG): container finished" podID="d069536b-aaa6-418b-800f-657d37161d09" containerID="8342534459fb88db1c2c3a50cf21268e9afc7aecdfa2cdf37ce36e1a9f0db5fa" exitCode=0 Apr 23 19:09:53.881382 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:09:53.880881 2581 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb" event={"ID":"d069536b-aaa6-418b-800f-657d37161d09","Type":"ContainerDied","Data":"8342534459fb88db1c2c3a50cf21268e9afc7aecdfa2cdf37ce36e1a9f0db5fa"} Apr 23 19:09:53.881382 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:09:53.880918 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb" event={"ID":"d069536b-aaa6-418b-800f-657d37161d09","Type":"ContainerStarted","Data":"5958168d2bd23b9476db44b6301ff706a8ec89e28faf6df0c83eebe47354740b"} Apr 23 19:09:54.887407 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:09:54.887362 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb" event={"ID":"d069536b-aaa6-418b-800f-657d37161d09","Type":"ContainerStarted","Data":"eac1f1f2994057533c849bfb328c78d23300dc3c59c7bed6b70630cdf9297f62"} Apr 23 19:09:54.887859 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:09:54.887415 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb" event={"ID":"d069536b-aaa6-418b-800f-657d37161d09","Type":"ContainerStarted","Data":"0d8a613dfe55f0cb08b2d8aef097369e6e3e47e9dbf574ece262cbc3dbe6f104"} Apr 23 19:09:54.888602 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:09:54.888572 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb" Apr 23 19:09:54.917780 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:09:54.917719 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb" podStartSLOduration=2.9177012429999998 podStartE2EDuration="2.917701243s" 
podCreationTimestamp="2026-04-23 19:09:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 19:09:54.913528718 +0000 UTC m=+5313.662970520" watchObservedRunningTime="2026-04-23 19:09:54.917701243 +0000 UTC m=+5313.667143045" Apr 23 19:10:02.973640 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:10:02.973606 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb" Apr 23 19:10:02.974127 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:10:02.973652 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb" Apr 23 19:10:02.976594 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:10:02.976569 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb" Apr 23 19:10:03.923118 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:10:03.923083 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb" Apr 23 19:10:24.927020 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:10:24.926966 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb" Apr 23 19:13:24.518619 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:24.518578 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb"] Apr 23 19:13:24.519110 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:24.519052 2581 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb" podUID="d069536b-aaa6-418b-800f-657d37161d09" containerName="main" containerID="cri-o://0d8a613dfe55f0cb08b2d8aef097369e6e3e47e9dbf574ece262cbc3dbe6f104" gracePeriod=30 Apr 23 19:13:24.519110 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:24.519061 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb" podUID="d069536b-aaa6-418b-800f-657d37161d09" containerName="tokenizer" containerID="cri-o://eac1f1f2994057533c849bfb328c78d23300dc3c59c7bed6b70630cdf9297f62" gracePeriod=30 Apr 23 19:13:24.925552 ip-10-0-143-218 kubenswrapper[2581]: W0423 19:13:24.925516 2581 logging.go:55] [core] [Channel #299 SubChannel #300]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.36:9003", ServerName: "10.132.0.36:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.36:9003: connect: connection refused" Apr 23 19:13:25.566805 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:25.566772 2581 generic.go:358] "Generic (PLEG): container finished" podID="d069536b-aaa6-418b-800f-657d37161d09" containerID="0d8a613dfe55f0cb08b2d8aef097369e6e3e47e9dbf574ece262cbc3dbe6f104" exitCode=0 Apr 23 19:13:25.567152 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:25.566817 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb" event={"ID":"d069536b-aaa6-418b-800f-657d37161d09","Type":"ContainerDied","Data":"0d8a613dfe55f0cb08b2d8aef097369e6e3e47e9dbf574ece262cbc3dbe6f104"} Apr 23 19:13:25.767716 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:25.767696 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb" Apr 23 19:13:25.811699 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:25.811622 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d069536b-aaa6-418b-800f-657d37161d09-kserve-provision-location\") pod \"d069536b-aaa6-418b-800f-657d37161d09\" (UID: \"d069536b-aaa6-418b-800f-657d37161d09\") " Apr 23 19:13:25.811839 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:25.811721 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kghn\" (UniqueName: \"kubernetes.io/projected/d069536b-aaa6-418b-800f-657d37161d09-kube-api-access-2kghn\") pod \"d069536b-aaa6-418b-800f-657d37161d09\" (UID: \"d069536b-aaa6-418b-800f-657d37161d09\") " Apr 23 19:13:25.811839 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:25.811751 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d069536b-aaa6-418b-800f-657d37161d09-tokenizer-tmp\") pod \"d069536b-aaa6-418b-800f-657d37161d09\" (UID: \"d069536b-aaa6-418b-800f-657d37161d09\") " Apr 23 19:13:25.811839 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:25.811805 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d069536b-aaa6-418b-800f-657d37161d09-tls-certs\") pod \"d069536b-aaa6-418b-800f-657d37161d09\" (UID: \"d069536b-aaa6-418b-800f-657d37161d09\") " Apr 23 19:13:25.811839 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:25.811832 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d069536b-aaa6-418b-800f-657d37161d09-tokenizer-cache\") pod \"d069536b-aaa6-418b-800f-657d37161d09\" (UID: \"d069536b-aaa6-418b-800f-657d37161d09\") 
" Apr 23 19:13:25.812091 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:25.811865 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d069536b-aaa6-418b-800f-657d37161d09-tokenizer-uds\") pod \"d069536b-aaa6-418b-800f-657d37161d09\" (UID: \"d069536b-aaa6-418b-800f-657d37161d09\") " Apr 23 19:13:25.812213 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:25.812181 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d069536b-aaa6-418b-800f-657d37161d09-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "d069536b-aaa6-418b-800f-657d37161d09" (UID: "d069536b-aaa6-418b-800f-657d37161d09"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:13:25.812213 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:25.812191 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d069536b-aaa6-418b-800f-657d37161d09-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "d069536b-aaa6-418b-800f-657d37161d09" (UID: "d069536b-aaa6-418b-800f-657d37161d09"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:13:25.812367 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:25.812344 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d069536b-aaa6-418b-800f-657d37161d09-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "d069536b-aaa6-418b-800f-657d37161d09" (UID: "d069536b-aaa6-418b-800f-657d37161d09"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:13:25.812593 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:25.812451 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d069536b-aaa6-418b-800f-657d37161d09-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d069536b-aaa6-418b-800f-657d37161d09" (UID: "d069536b-aaa6-418b-800f-657d37161d09"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:13:25.814063 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:25.814040 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d069536b-aaa6-418b-800f-657d37161d09-kube-api-access-2kghn" (OuterVolumeSpecName: "kube-api-access-2kghn") pod "d069536b-aaa6-418b-800f-657d37161d09" (UID: "d069536b-aaa6-418b-800f-657d37161d09"). InnerVolumeSpecName "kube-api-access-2kghn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 19:13:25.814180 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:25.814154 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d069536b-aaa6-418b-800f-657d37161d09-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d069536b-aaa6-418b-800f-657d37161d09" (UID: "d069536b-aaa6-418b-800f-657d37161d09"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 19:13:25.913118 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:25.913082 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d069536b-aaa6-418b-800f-657d37161d09-kserve-provision-location\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:13:25.913118 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:25.913118 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2kghn\" (UniqueName: \"kubernetes.io/projected/d069536b-aaa6-418b-800f-657d37161d09-kube-api-access-2kghn\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:13:25.913316 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:25.913134 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d069536b-aaa6-418b-800f-657d37161d09-tokenizer-tmp\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:13:25.913316 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:25.913148 2581 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d069536b-aaa6-418b-800f-657d37161d09-tls-certs\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:13:25.913316 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:25.913161 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d069536b-aaa6-418b-800f-657d37161d09-tokenizer-cache\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:13:25.913316 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:25.913175 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d069536b-aaa6-418b-800f-657d37161d09-tokenizer-uds\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:13:25.926036 ip-10-0-143-218 
kubenswrapper[2581]: I0423 19:13:25.925997 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb" podUID="d069536b-aaa6-418b-800f-657d37161d09" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.36:9003\" within 1s: context deadline exceeded" Apr 23 19:13:25.926213 ip-10-0-143-218 kubenswrapper[2581]: W0423 19:13:25.926085 2581 logging.go:55] [core] [Channel #299 SubChannel #300]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.36:9003", ServerName: "10.132.0.36:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.36:9003: operation was canceled" Apr 23 19:13:26.572164 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:26.572127 2581 generic.go:358] "Generic (PLEG): container finished" podID="d069536b-aaa6-418b-800f-657d37161d09" containerID="eac1f1f2994057533c849bfb328c78d23300dc3c59c7bed6b70630cdf9297f62" exitCode=0 Apr 23 19:13:26.572602 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:26.572176 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb" event={"ID":"d069536b-aaa6-418b-800f-657d37161d09","Type":"ContainerDied","Data":"eac1f1f2994057533c849bfb328c78d23300dc3c59c7bed6b70630cdf9297f62"} Apr 23 19:13:26.572602 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:26.572209 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb" event={"ID":"d069536b-aaa6-418b-800f-657d37161d09","Type":"ContainerDied","Data":"5958168d2bd23b9476db44b6301ff706a8ec89e28faf6df0c83eebe47354740b"} Apr 23 19:13:26.572602 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:26.572226 2581 scope.go:117] "RemoveContainer" containerID="eac1f1f2994057533c849bfb328c78d23300dc3c59c7bed6b70630cdf9297f62" Apr 23 19:13:26.572602 
ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:26.572261 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb" Apr 23 19:13:26.580700 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:26.580631 2581 scope.go:117] "RemoveContainer" containerID="0d8a613dfe55f0cb08b2d8aef097369e6e3e47e9dbf574ece262cbc3dbe6f104" Apr 23 19:13:26.588053 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:26.588031 2581 scope.go:117] "RemoveContainer" containerID="8342534459fb88db1c2c3a50cf21268e9afc7aecdfa2cdf37ce36e1a9f0db5fa" Apr 23 19:13:26.595214 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:26.595190 2581 scope.go:117] "RemoveContainer" containerID="eac1f1f2994057533c849bfb328c78d23300dc3c59c7bed6b70630cdf9297f62" Apr 23 19:13:26.595482 ip-10-0-143-218 kubenswrapper[2581]: E0423 19:13:26.595461 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eac1f1f2994057533c849bfb328c78d23300dc3c59c7bed6b70630cdf9297f62\": container with ID starting with eac1f1f2994057533c849bfb328c78d23300dc3c59c7bed6b70630cdf9297f62 not found: ID does not exist" containerID="eac1f1f2994057533c849bfb328c78d23300dc3c59c7bed6b70630cdf9297f62" Apr 23 19:13:26.595609 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:26.595492 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eac1f1f2994057533c849bfb328c78d23300dc3c59c7bed6b70630cdf9297f62"} err="failed to get container status \"eac1f1f2994057533c849bfb328c78d23300dc3c59c7bed6b70630cdf9297f62\": rpc error: code = NotFound desc = could not find container \"eac1f1f2994057533c849bfb328c78d23300dc3c59c7bed6b70630cdf9297f62\": container with ID starting with eac1f1f2994057533c849bfb328c78d23300dc3c59c7bed6b70630cdf9297f62 not found: ID does not exist" Apr 23 19:13:26.595609 ip-10-0-143-218 kubenswrapper[2581]: I0423 
19:13:26.595520 2581 scope.go:117] "RemoveContainer" containerID="0d8a613dfe55f0cb08b2d8aef097369e6e3e47e9dbf574ece262cbc3dbe6f104" Apr 23 19:13:26.595801 ip-10-0-143-218 kubenswrapper[2581]: E0423 19:13:26.595762 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d8a613dfe55f0cb08b2d8aef097369e6e3e47e9dbf574ece262cbc3dbe6f104\": container with ID starting with 0d8a613dfe55f0cb08b2d8aef097369e6e3e47e9dbf574ece262cbc3dbe6f104 not found: ID does not exist" containerID="0d8a613dfe55f0cb08b2d8aef097369e6e3e47e9dbf574ece262cbc3dbe6f104" Apr 23 19:13:26.595842 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:26.595811 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d8a613dfe55f0cb08b2d8aef097369e6e3e47e9dbf574ece262cbc3dbe6f104"} err="failed to get container status \"0d8a613dfe55f0cb08b2d8aef097369e6e3e47e9dbf574ece262cbc3dbe6f104\": rpc error: code = NotFound desc = could not find container \"0d8a613dfe55f0cb08b2d8aef097369e6e3e47e9dbf574ece262cbc3dbe6f104\": container with ID starting with 0d8a613dfe55f0cb08b2d8aef097369e6e3e47e9dbf574ece262cbc3dbe6f104 not found: ID does not exist" Apr 23 19:13:26.595842 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:26.595834 2581 scope.go:117] "RemoveContainer" containerID="8342534459fb88db1c2c3a50cf21268e9afc7aecdfa2cdf37ce36e1a9f0db5fa" Apr 23 19:13:26.596126 ip-10-0-143-218 kubenswrapper[2581]: E0423 19:13:26.596100 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8342534459fb88db1c2c3a50cf21268e9afc7aecdfa2cdf37ce36e1a9f0db5fa\": container with ID starting with 8342534459fb88db1c2c3a50cf21268e9afc7aecdfa2cdf37ce36e1a9f0db5fa not found: ID does not exist" containerID="8342534459fb88db1c2c3a50cf21268e9afc7aecdfa2cdf37ce36e1a9f0db5fa" Apr 23 19:13:26.596233 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:26.596132 2581 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8342534459fb88db1c2c3a50cf21268e9afc7aecdfa2cdf37ce36e1a9f0db5fa"} err="failed to get container status \"8342534459fb88db1c2c3a50cf21268e9afc7aecdfa2cdf37ce36e1a9f0db5fa\": rpc error: code = NotFound desc = could not find container \"8342534459fb88db1c2c3a50cf21268e9afc7aecdfa2cdf37ce36e1a9f0db5fa\": container with ID starting with 8342534459fb88db1c2c3a50cf21268e9afc7aecdfa2cdf37ce36e1a9f0db5fa not found: ID does not exist" Apr 23 19:13:26.596301 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:26.596271 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb"] Apr 23 19:13:26.601606 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:26.601584 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-85998xvffb"] Apr 23 19:13:27.791382 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:27.791355 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d069536b-aaa6-418b-800f-657d37161d09" path="/var/lib/kubelet/pods/d069536b-aaa6-418b-800f-657d37161d09/volumes" Apr 23 19:13:38.531393 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:38.531355 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq"] Apr 23 19:13:38.531883 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:38.531676 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d069536b-aaa6-418b-800f-657d37161d09" containerName="main" Apr 23 19:13:38.531883 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:38.531687 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="d069536b-aaa6-418b-800f-657d37161d09" containerName="main" Apr 23 19:13:38.531883 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:38.531699 2581 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d069536b-aaa6-418b-800f-657d37161d09" containerName="storage-initializer" Apr 23 19:13:38.531883 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:38.531715 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="d069536b-aaa6-418b-800f-657d37161d09" containerName="storage-initializer" Apr 23 19:13:38.531883 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:38.531733 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d069536b-aaa6-418b-800f-657d37161d09" containerName="tokenizer" Apr 23 19:13:38.531883 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:38.531749 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="d069536b-aaa6-418b-800f-657d37161d09" containerName="tokenizer" Apr 23 19:13:38.531883 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:38.531861 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="d069536b-aaa6-418b-800f-657d37161d09" containerName="tokenizer" Apr 23 19:13:38.531883 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:38.531881 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="d069536b-aaa6-418b-800f-657d37161d09" containerName="main" Apr 23 19:13:38.536686 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:38.536666 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq" Apr 23 19:13:38.539890 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:38.539858 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 23 19:13:38.539890 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:38.539879 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-epp-sa-dockercfg-plkvc\"" Apr 23 19:13:38.541033 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:38.541017 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 19:13:38.541185 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:38.541167 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-78dl4\"" Apr 23 19:13:38.541226 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:38.541193 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 19:13:38.550187 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:38.550164 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq"] Apr 23 19:13:38.624418 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:38.624386 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d5104118-098a-47f1-93a7-b2465dfdf1ec-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq\" (UID: \"d5104118-098a-47f1-93a7-b2465dfdf1ec\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq" Apr 23 19:13:38.624622 ip-10-0-143-218 
kubenswrapper[2581]: I0423 19:13:38.624426 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d5104118-098a-47f1-93a7-b2465dfdf1ec-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq\" (UID: \"d5104118-098a-47f1-93a7-b2465dfdf1ec\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq" Apr 23 19:13:38.624622 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:38.624455 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d5104118-098a-47f1-93a7-b2465dfdf1ec-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq\" (UID: \"d5104118-098a-47f1-93a7-b2465dfdf1ec\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq" Apr 23 19:13:38.624622 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:38.624556 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nmxm\" (UniqueName: \"kubernetes.io/projected/d5104118-098a-47f1-93a7-b2465dfdf1ec-kube-api-access-7nmxm\") pod \"router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq\" (UID: \"d5104118-098a-47f1-93a7-b2465dfdf1ec\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq" Apr 23 19:13:38.624622 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:38.624601 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d5104118-098a-47f1-93a7-b2465dfdf1ec-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq\" (UID: \"d5104118-098a-47f1-93a7-b2465dfdf1ec\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq" Apr 23 
19:13:38.624813 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:38.624672 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d5104118-098a-47f1-93a7-b2465dfdf1ec-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq\" (UID: \"d5104118-098a-47f1-93a7-b2465dfdf1ec\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq" Apr 23 19:13:38.726144 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:38.726098 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d5104118-098a-47f1-93a7-b2465dfdf1ec-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq\" (UID: \"d5104118-098a-47f1-93a7-b2465dfdf1ec\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq" Apr 23 19:13:38.726144 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:38.726151 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d5104118-098a-47f1-93a7-b2465dfdf1ec-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq\" (UID: \"d5104118-098a-47f1-93a7-b2465dfdf1ec\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq" Apr 23 19:13:38.726398 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:38.726172 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d5104118-098a-47f1-93a7-b2465dfdf1ec-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq\" (UID: \"d5104118-098a-47f1-93a7-b2465dfdf1ec\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq" Apr 23 19:13:38.726398 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:38.726200 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d5104118-098a-47f1-93a7-b2465dfdf1ec-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq\" (UID: \"d5104118-098a-47f1-93a7-b2465dfdf1ec\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq" Apr 23 19:13:38.726398 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:38.726234 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7nmxm\" (UniqueName: \"kubernetes.io/projected/d5104118-098a-47f1-93a7-b2465dfdf1ec-kube-api-access-7nmxm\") pod \"router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq\" (UID: \"d5104118-098a-47f1-93a7-b2465dfdf1ec\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq" Apr 23 19:13:38.726398 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:38.726252 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d5104118-098a-47f1-93a7-b2465dfdf1ec-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq\" (UID: \"d5104118-098a-47f1-93a7-b2465dfdf1ec\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq" Apr 23 19:13:38.726692 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:38.726554 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d5104118-098a-47f1-93a7-b2465dfdf1ec-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq\" (UID: \"d5104118-098a-47f1-93a7-b2465dfdf1ec\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq" Apr 23 19:13:38.726692 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:38.726583 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d5104118-098a-47f1-93a7-b2465dfdf1ec-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq\" (UID: \"d5104118-098a-47f1-93a7-b2465dfdf1ec\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq" Apr 23 19:13:38.726692 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:38.726623 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d5104118-098a-47f1-93a7-b2465dfdf1ec-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq\" (UID: \"d5104118-098a-47f1-93a7-b2465dfdf1ec\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq" Apr 23 19:13:38.726692 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:38.726652 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d5104118-098a-47f1-93a7-b2465dfdf1ec-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq\" (UID: \"d5104118-098a-47f1-93a7-b2465dfdf1ec\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq" Apr 23 19:13:38.729016 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:38.728976 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d5104118-098a-47f1-93a7-b2465dfdf1ec-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq\" (UID: \"d5104118-098a-47f1-93a7-b2465dfdf1ec\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq" Apr 23 19:13:38.738493 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:38.738465 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nmxm\" (UniqueName: \"kubernetes.io/projected/d5104118-098a-47f1-93a7-b2465dfdf1ec-kube-api-access-7nmxm\") pod \"router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq\" (UID: \"d5104118-098a-47f1-93a7-b2465dfdf1ec\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq" Apr 23 19:13:38.846228 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:38.846145 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq" Apr 23 19:13:38.998180 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:38.998153 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq"] Apr 23 19:13:38.998865 ip-10-0-143-218 kubenswrapper[2581]: W0423 19:13:38.998842 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5104118_098a_47f1_93a7_b2465dfdf1ec.slice/crio-266a47c1f78882674a8cdd606b83a17398c736a4d0c8d76d25c9693f30f2a0a9 WatchSource:0}: Error finding container 266a47c1f78882674a8cdd606b83a17398c736a4d0c8d76d25c9693f30f2a0a9: Status 404 returned error can't find the container with id 266a47c1f78882674a8cdd606b83a17398c736a4d0c8d76d25c9693f30f2a0a9 Apr 23 19:13:39.000817 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:39.000803 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 19:13:39.618827 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:39.618795 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq" event={"ID":"d5104118-098a-47f1-93a7-b2465dfdf1ec","Type":"ContainerStarted","Data":"3ad692a9204cab8893507775064fe6168d6b4f55bcfad96e1adae2647d8b0546"} Apr 23 19:13:39.618827 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:39.618835 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq" event={"ID":"d5104118-098a-47f1-93a7-b2465dfdf1ec","Type":"ContainerStarted","Data":"266a47c1f78882674a8cdd606b83a17398c736a4d0c8d76d25c9693f30f2a0a9"} Apr 23 19:13:40.623334 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:40.623289 2581 generic.go:358] "Generic (PLEG): container finished" podID="d5104118-098a-47f1-93a7-b2465dfdf1ec" containerID="3ad692a9204cab8893507775064fe6168d6b4f55bcfad96e1adae2647d8b0546" exitCode=0 Apr 23 19:13:40.623728 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:40.623368 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq" event={"ID":"d5104118-098a-47f1-93a7-b2465dfdf1ec","Type":"ContainerDied","Data":"3ad692a9204cab8893507775064fe6168d6b4f55bcfad96e1adae2647d8b0546"} Apr 23 19:13:41.628700 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:41.628662 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq" event={"ID":"d5104118-098a-47f1-93a7-b2465dfdf1ec","Type":"ContainerStarted","Data":"82ed6c816bc1e526aabe1e965fa00a8af268745b50d02010ffb509d1f92ab53f"} Apr 23 19:13:41.628700 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:41.628702 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq" 
event={"ID":"d5104118-098a-47f1-93a7-b2465dfdf1ec","Type":"ContainerStarted","Data":"a7db9d66385d30f18a1438c1a3b32c6e0a92a8fcc4e2035a2e7b84f1a3788902"} Apr 23 19:13:41.629156 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:41.628799 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq" Apr 23 19:13:41.659807 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:41.659752 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq" podStartSLOduration=3.659733787 podStartE2EDuration="3.659733787s" podCreationTimestamp="2026-04-23 19:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 19:13:41.657635422 +0000 UTC m=+5540.407077222" watchObservedRunningTime="2026-04-23 19:13:41.659733787 +0000 UTC m=+5540.409175588" Apr 23 19:13:48.846293 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:48.846253 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq" Apr 23 19:13:48.846293 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:48.846296 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq" Apr 23 19:13:48.849008 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:48.848964 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq" Apr 23 19:13:49.659887 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:13:49.659849 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq" Apr 23 19:14:10.664781 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:14:10.664701 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq" Apr 23 19:16:20.193347 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:20.193313 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq"] Apr 23 19:16:20.193789 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:20.193721 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq" podUID="d5104118-098a-47f1-93a7-b2465dfdf1ec" containerName="main" containerID="cri-o://a7db9d66385d30f18a1438c1a3b32c6e0a92a8fcc4e2035a2e7b84f1a3788902" gracePeriod=30 Apr 23 19:16:20.193858 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:20.193790 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq" podUID="d5104118-098a-47f1-93a7-b2465dfdf1ec" containerName="tokenizer" containerID="cri-o://82ed6c816bc1e526aabe1e965fa00a8af268745b50d02010ffb509d1f92ab53f" gracePeriod=30 Apr 23 19:16:20.663920 ip-10-0-143-218 kubenswrapper[2581]: W0423 19:16:20.663891 2581 logging.go:55] [core] [Channel #388 SubChannel #389]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.37:9003", ServerName: "10.132.0.37:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.37:9003: connect: connection refused" Apr 23 19:16:21.144882 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:21.144853 2581 generic.go:358] "Generic (PLEG): container finished" podID="d5104118-098a-47f1-93a7-b2465dfdf1ec" containerID="a7db9d66385d30f18a1438c1a3b32c6e0a92a8fcc4e2035a2e7b84f1a3788902" exitCode=0 Apr 23 19:16:21.145057 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:21.144922 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq" event={"ID":"d5104118-098a-47f1-93a7-b2465dfdf1ec","Type":"ContainerDied","Data":"a7db9d66385d30f18a1438c1a3b32c6e0a92a8fcc4e2035a2e7b84f1a3788902"} Apr 23 19:16:21.433766 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:21.433745 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq" Apr 23 19:16:21.582427 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:21.582395 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d5104118-098a-47f1-93a7-b2465dfdf1ec-kserve-provision-location\") pod \"d5104118-098a-47f1-93a7-b2465dfdf1ec\" (UID: \"d5104118-098a-47f1-93a7-b2465dfdf1ec\") " Apr 23 19:16:21.582619 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:21.582465 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d5104118-098a-47f1-93a7-b2465dfdf1ec-tls-certs\") pod \"d5104118-098a-47f1-93a7-b2465dfdf1ec\" (UID: \"d5104118-098a-47f1-93a7-b2465dfdf1ec\") " Apr 23 19:16:21.582619 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:21.582494 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/d5104118-098a-47f1-93a7-b2465dfdf1ec-tokenizer-cache\") pod \"d5104118-098a-47f1-93a7-b2465dfdf1ec\" (UID: \"d5104118-098a-47f1-93a7-b2465dfdf1ec\") " Apr 23 19:16:21.582619 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:21.582510 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nmxm\" (UniqueName: \"kubernetes.io/projected/d5104118-098a-47f1-93a7-b2465dfdf1ec-kube-api-access-7nmxm\") pod \"d5104118-098a-47f1-93a7-b2465dfdf1ec\" (UID: \"d5104118-098a-47f1-93a7-b2465dfdf1ec\") " Apr 23 19:16:21.582619 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:21.582528 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d5104118-098a-47f1-93a7-b2465dfdf1ec-tokenizer-tmp\") pod \"d5104118-098a-47f1-93a7-b2465dfdf1ec\" (UID: \"d5104118-098a-47f1-93a7-b2465dfdf1ec\") " Apr 23 19:16:21.582619 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:21.582578 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d5104118-098a-47f1-93a7-b2465dfdf1ec-tokenizer-uds\") pod \"d5104118-098a-47f1-93a7-b2465dfdf1ec\" (UID: \"d5104118-098a-47f1-93a7-b2465dfdf1ec\") " Apr 23 19:16:21.582866 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:21.582753 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5104118-098a-47f1-93a7-b2465dfdf1ec-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "d5104118-098a-47f1-93a7-b2465dfdf1ec" (UID: "d5104118-098a-47f1-93a7-b2465dfdf1ec"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:16:21.583051 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:21.583011 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5104118-098a-47f1-93a7-b2465dfdf1ec-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "d5104118-098a-47f1-93a7-b2465dfdf1ec" (UID: "d5104118-098a-47f1-93a7-b2465dfdf1ec"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:16:21.583145 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:21.583049 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5104118-098a-47f1-93a7-b2465dfdf1ec-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "d5104118-098a-47f1-93a7-b2465dfdf1ec" (UID: "d5104118-098a-47f1-93a7-b2465dfdf1ec"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:16:21.583406 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:21.583350 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5104118-098a-47f1-93a7-b2465dfdf1ec-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d5104118-098a-47f1-93a7-b2465dfdf1ec" (UID: "d5104118-098a-47f1-93a7-b2465dfdf1ec"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:16:21.584881 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:21.584859 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5104118-098a-47f1-93a7-b2465dfdf1ec-kube-api-access-7nmxm" (OuterVolumeSpecName: "kube-api-access-7nmxm") pod "d5104118-098a-47f1-93a7-b2465dfdf1ec" (UID: "d5104118-098a-47f1-93a7-b2465dfdf1ec"). InnerVolumeSpecName "kube-api-access-7nmxm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 19:16:21.584881 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:21.584860 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5104118-098a-47f1-93a7-b2465dfdf1ec-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d5104118-098a-47f1-93a7-b2465dfdf1ec" (UID: "d5104118-098a-47f1-93a7-b2465dfdf1ec"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 19:16:21.663750 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:21.663707 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq" podUID="d5104118-098a-47f1-93a7-b2465dfdf1ec" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.37:9003\" within 1s: context deadline exceeded" Apr 23 19:16:21.683764 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:21.683655 2581 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d5104118-098a-47f1-93a7-b2465dfdf1ec-tls-certs\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:16:21.683764 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:21.683701 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d5104118-098a-47f1-93a7-b2465dfdf1ec-tokenizer-cache\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:16:21.683764 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:21.683715 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7nmxm\" (UniqueName: \"kubernetes.io/projected/d5104118-098a-47f1-93a7-b2465dfdf1ec-kube-api-access-7nmxm\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:16:21.683764 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:21.683727 2581 reconciler_common.go:299] "Volume detached for volume 
\"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d5104118-098a-47f1-93a7-b2465dfdf1ec-tokenizer-tmp\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:16:21.683764 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:21.683739 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d5104118-098a-47f1-93a7-b2465dfdf1ec-tokenizer-uds\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:16:21.683764 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:21.683753 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d5104118-098a-47f1-93a7-b2465dfdf1ec-kserve-provision-location\") on node \"ip-10-0-143-218.ec2.internal\" DevicePath \"\"" Apr 23 19:16:22.151466 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:22.151435 2581 generic.go:358] "Generic (PLEG): container finished" podID="d5104118-098a-47f1-93a7-b2465dfdf1ec" containerID="82ed6c816bc1e526aabe1e965fa00a8af268745b50d02010ffb509d1f92ab53f" exitCode=0 Apr 23 19:16:22.151658 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:22.151533 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq" event={"ID":"d5104118-098a-47f1-93a7-b2465dfdf1ec","Type":"ContainerDied","Data":"82ed6c816bc1e526aabe1e965fa00a8af268745b50d02010ffb509d1f92ab53f"} Apr 23 19:16:22.151658 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:22.151566 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq" event={"ID":"d5104118-098a-47f1-93a7-b2465dfdf1ec","Type":"ContainerDied","Data":"266a47c1f78882674a8cdd606b83a17398c736a4d0c8d76d25c9693f30f2a0a9"} Apr 23 19:16:22.151658 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:22.151581 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq" Apr 23 19:16:22.151836 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:22.151588 2581 scope.go:117] "RemoveContainer" containerID="82ed6c816bc1e526aabe1e965fa00a8af268745b50d02010ffb509d1f92ab53f" Apr 23 19:16:22.166045 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:22.166017 2581 scope.go:117] "RemoveContainer" containerID="a7db9d66385d30f18a1438c1a3b32c6e0a92a8fcc4e2035a2e7b84f1a3788902" Apr 23 19:16:22.173385 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:22.173366 2581 scope.go:117] "RemoveContainer" containerID="3ad692a9204cab8893507775064fe6168d6b4f55bcfad96e1adae2647d8b0546" Apr 23 19:16:22.180382 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:22.180360 2581 scope.go:117] "RemoveContainer" containerID="82ed6c816bc1e526aabe1e965fa00a8af268745b50d02010ffb509d1f92ab53f" Apr 23 19:16:22.180716 ip-10-0-143-218 kubenswrapper[2581]: E0423 19:16:22.180693 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82ed6c816bc1e526aabe1e965fa00a8af268745b50d02010ffb509d1f92ab53f\": container with ID starting with 82ed6c816bc1e526aabe1e965fa00a8af268745b50d02010ffb509d1f92ab53f not found: ID does not exist" containerID="82ed6c816bc1e526aabe1e965fa00a8af268745b50d02010ffb509d1f92ab53f" Apr 23 19:16:22.180814 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:22.180724 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82ed6c816bc1e526aabe1e965fa00a8af268745b50d02010ffb509d1f92ab53f"} err="failed to get container status \"82ed6c816bc1e526aabe1e965fa00a8af268745b50d02010ffb509d1f92ab53f\": rpc error: code = NotFound desc = could not find container \"82ed6c816bc1e526aabe1e965fa00a8af268745b50d02010ffb509d1f92ab53f\": container with ID starting with 82ed6c816bc1e526aabe1e965fa00a8af268745b50d02010ffb509d1f92ab53f not found: ID does not 
exist" Apr 23 19:16:22.180814 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:22.180749 2581 scope.go:117] "RemoveContainer" containerID="a7db9d66385d30f18a1438c1a3b32c6e0a92a8fcc4e2035a2e7b84f1a3788902" Apr 23 19:16:22.181048 ip-10-0-143-218 kubenswrapper[2581]: E0423 19:16:22.181030 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7db9d66385d30f18a1438c1a3b32c6e0a92a8fcc4e2035a2e7b84f1a3788902\": container with ID starting with a7db9d66385d30f18a1438c1a3b32c6e0a92a8fcc4e2035a2e7b84f1a3788902 not found: ID does not exist" containerID="a7db9d66385d30f18a1438c1a3b32c6e0a92a8fcc4e2035a2e7b84f1a3788902" Apr 23 19:16:22.181122 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:22.181054 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7db9d66385d30f18a1438c1a3b32c6e0a92a8fcc4e2035a2e7b84f1a3788902"} err="failed to get container status \"a7db9d66385d30f18a1438c1a3b32c6e0a92a8fcc4e2035a2e7b84f1a3788902\": rpc error: code = NotFound desc = could not find container \"a7db9d66385d30f18a1438c1a3b32c6e0a92a8fcc4e2035a2e7b84f1a3788902\": container with ID starting with a7db9d66385d30f18a1438c1a3b32c6e0a92a8fcc4e2035a2e7b84f1a3788902 not found: ID does not exist" Apr 23 19:16:22.181122 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:22.181069 2581 scope.go:117] "RemoveContainer" containerID="3ad692a9204cab8893507775064fe6168d6b4f55bcfad96e1adae2647d8b0546" Apr 23 19:16:22.181351 ip-10-0-143-218 kubenswrapper[2581]: E0423 19:16:22.181318 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ad692a9204cab8893507775064fe6168d6b4f55bcfad96e1adae2647d8b0546\": container with ID starting with 3ad692a9204cab8893507775064fe6168d6b4f55bcfad96e1adae2647d8b0546 not found: ID does not exist" containerID="3ad692a9204cab8893507775064fe6168d6b4f55bcfad96e1adae2647d8b0546" Apr 23 
19:16:22.181448 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:22.181377 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ad692a9204cab8893507775064fe6168d6b4f55bcfad96e1adae2647d8b0546"} err="failed to get container status \"3ad692a9204cab8893507775064fe6168d6b4f55bcfad96e1adae2647d8b0546\": rpc error: code = NotFound desc = could not find container \"3ad692a9204cab8893507775064fe6168d6b4f55bcfad96e1adae2647d8b0546\": container with ID starting with 3ad692a9204cab8893507775064fe6168d6b4f55bcfad96e1adae2647d8b0546 not found: ID does not exist"
Apr 23 19:16:22.182126 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:22.182106 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq"]
Apr 23 19:16:22.187815 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:22.187791 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-668db9fc8bmjbq"]
Apr 23 19:16:23.790397 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:23.790362 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5104118-098a-47f1-93a7-b2465dfdf1ec" path="/var/lib/kubelet/pods/d5104118-098a-47f1-93a7-b2465dfdf1ec/volumes"
Apr 23 19:16:52.155031 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:52.154978 2581 ???:1] "http2: server: error reading preface from client 10.0.137.68:50224: read tcp 10.0.143.218:10250->10.0.137.68:50224: read: connection reset by peer"
Apr 23 19:16:52.174638 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:52.174607 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-s54pc_9c520546-a736-40a3-8afe-578967729121/manager/0.log"
Apr 23 19:16:54.589846 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:54.589809 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-j4sk6/must-gather-gr2p9"]
Apr 23 19:16:54.590234 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:54.590219 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d5104118-098a-47f1-93a7-b2465dfdf1ec" containerName="main"
Apr 23 19:16:54.590234 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:54.590232 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5104118-098a-47f1-93a7-b2465dfdf1ec" containerName="main"
Apr 23 19:16:54.590321 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:54.590242 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d5104118-098a-47f1-93a7-b2465dfdf1ec" containerName="storage-initializer"
Apr 23 19:16:54.590321 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:54.590248 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5104118-098a-47f1-93a7-b2465dfdf1ec" containerName="storage-initializer"
Apr 23 19:16:54.590321 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:54.590259 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d5104118-098a-47f1-93a7-b2465dfdf1ec" containerName="tokenizer"
Apr 23 19:16:54.590321 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:54.590265 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5104118-098a-47f1-93a7-b2465dfdf1ec" containerName="tokenizer"
Apr 23 19:16:54.590462 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:54.590330 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="d5104118-098a-47f1-93a7-b2465dfdf1ec" containerName="main"
Apr 23 19:16:54.590462 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:54.590341 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="d5104118-098a-47f1-93a7-b2465dfdf1ec" containerName="tokenizer"
Apr 23 19:16:54.593382 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:54.593367 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-j4sk6/must-gather-gr2p9"
Apr 23 19:16:54.596208 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:54.596190 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-j4sk6\"/\"openshift-service-ca.crt\""
Apr 23 19:16:54.598023 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:54.598004 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-j4sk6\"/\"default-dockercfg-tvs57\""
Apr 23 19:16:54.598082 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:54.598008 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-j4sk6\"/\"kube-root-ca.crt\""
Apr 23 19:16:54.606575 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:54.606550 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-j4sk6/must-gather-gr2p9"]
Apr 23 19:16:54.667157 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:54.667128 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2vrn\" (UniqueName: \"kubernetes.io/projected/cc17be2a-f5a9-43a3-a14f-ec40840d2ef6-kube-api-access-b2vrn\") pod \"must-gather-gr2p9\" (UID: \"cc17be2a-f5a9-43a3-a14f-ec40840d2ef6\") " pod="openshift-must-gather-j4sk6/must-gather-gr2p9"
Apr 23 19:16:54.667286 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:54.667175 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cc17be2a-f5a9-43a3-a14f-ec40840d2ef6-must-gather-output\") pod \"must-gather-gr2p9\" (UID: \"cc17be2a-f5a9-43a3-a14f-ec40840d2ef6\") " pod="openshift-must-gather-j4sk6/must-gather-gr2p9"
Apr 23 19:16:54.767804 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:54.767775 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2vrn\" (UniqueName: \"kubernetes.io/projected/cc17be2a-f5a9-43a3-a14f-ec40840d2ef6-kube-api-access-b2vrn\") pod \"must-gather-gr2p9\" (UID: \"cc17be2a-f5a9-43a3-a14f-ec40840d2ef6\") " pod="openshift-must-gather-j4sk6/must-gather-gr2p9"
Apr 23 19:16:54.767936 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:54.767822 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cc17be2a-f5a9-43a3-a14f-ec40840d2ef6-must-gather-output\") pod \"must-gather-gr2p9\" (UID: \"cc17be2a-f5a9-43a3-a14f-ec40840d2ef6\") " pod="openshift-must-gather-j4sk6/must-gather-gr2p9"
Apr 23 19:16:54.768168 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:54.768153 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cc17be2a-f5a9-43a3-a14f-ec40840d2ef6-must-gather-output\") pod \"must-gather-gr2p9\" (UID: \"cc17be2a-f5a9-43a3-a14f-ec40840d2ef6\") " pod="openshift-must-gather-j4sk6/must-gather-gr2p9"
Apr 23 19:16:54.780509 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:54.780481 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2vrn\" (UniqueName: \"kubernetes.io/projected/cc17be2a-f5a9-43a3-a14f-ec40840d2ef6-kube-api-access-b2vrn\") pod \"must-gather-gr2p9\" (UID: \"cc17be2a-f5a9-43a3-a14f-ec40840d2ef6\") " pod="openshift-must-gather-j4sk6/must-gather-gr2p9"
Apr 23 19:16:54.902307 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:54.902278 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-j4sk6/must-gather-gr2p9"
Apr 23 19:16:55.053385 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:55.053358 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-j4sk6/must-gather-gr2p9"]
Apr 23 19:16:55.055118 ip-10-0-143-218 kubenswrapper[2581]: W0423 19:16:55.055091 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc17be2a_f5a9_43a3_a14f_ec40840d2ef6.slice/crio-58ba32b07b91126fa5297e0ea4bf238ab4b9323d3268c0674127cc881e512030 WatchSource:0}: Error finding container 58ba32b07b91126fa5297e0ea4bf238ab4b9323d3268c0674127cc881e512030: Status 404 returned error can't find the container with id 58ba32b07b91126fa5297e0ea4bf238ab4b9323d3268c0674127cc881e512030
Apr 23 19:16:55.256556 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:55.256477 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j4sk6/must-gather-gr2p9" event={"ID":"cc17be2a-f5a9-43a3-a14f-ec40840d2ef6","Type":"ContainerStarted","Data":"58ba32b07b91126fa5297e0ea4bf238ab4b9323d3268c0674127cc881e512030"}
Apr 23 19:16:56.261148 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:56.261098 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j4sk6/must-gather-gr2p9" event={"ID":"cc17be2a-f5a9-43a3-a14f-ec40840d2ef6","Type":"ContainerStarted","Data":"95357c16dfda8bc7d40008e0c02e52e3ee2f685f40b18d4a72c91d5b622efde5"}
Apr 23 19:16:56.261148 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:56.261143 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j4sk6/must-gather-gr2p9" event={"ID":"cc17be2a-f5a9-43a3-a14f-ec40840d2ef6","Type":"ContainerStarted","Data":"718e5edaf4cc2ff937e6bcd8ba60e1980bf99e9ae987c062c7e9c2310bbe7538"}
Apr 23 19:16:56.288097 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:56.288041 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-j4sk6/must-gather-gr2p9" podStartSLOduration=1.3766585949999999 podStartE2EDuration="2.288024173s" podCreationTimestamp="2026-04-23 19:16:54 +0000 UTC" firstStartedPulling="2026-04-23 19:16:55.056817344 +0000 UTC m=+5733.806259123" lastFinishedPulling="2026-04-23 19:16:55.968182922 +0000 UTC m=+5734.717624701" observedRunningTime="2026-04-23 19:16:56.286083106 +0000 UTC m=+5735.035524908" watchObservedRunningTime="2026-04-23 19:16:56.288024173 +0000 UTC m=+5735.037465973"
Apr 23 19:16:57.748353 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:57.748327 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-p5w64_89eab3e6-08c4-458c-99c5-72cda238f366/global-pull-secret-syncer/0.log"
Apr 23 19:16:57.916821 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:57.916793 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-w6l25_fa6065cc-b8a5-43b6-b34c-abd80570d3dc/konnectivity-agent/0.log"
Apr 23 19:16:58.006696 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:16:58.006602 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-143-218.ec2.internal_0686775b0acf60aa77b819482efe0489/haproxy/0.log"
Apr 23 19:17:01.767410 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:01.767382 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-s54pc_9c520546-a736-40a3-8afe-578967729121/manager/0.log"
Apr 23 19:17:03.329859 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:03.329782 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-j7zhr_09b5f2cb-9870-4567-9f10-2c51e6e0ae4a/monitoring-plugin/0.log"
Apr 23 19:17:03.457850 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:03.457787 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-d67bp_21c80999-7322-4870-ae8a-5f6957e51999/node-exporter/0.log"
Apr 23 19:17:03.482391 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:03.482354 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-d67bp_21c80999-7322-4870-ae8a-5f6957e51999/kube-rbac-proxy/0.log"
Apr 23 19:17:03.508999 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:03.508945 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-d67bp_21c80999-7322-4870-ae8a-5f6957e51999/init-textfile/0.log"
Apr 23 19:17:03.618540 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:03.618434 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-cms96_2d55cc26-3696-48b3-88f6-736b38316363/kube-rbac-proxy-main/0.log"
Apr 23 19:17:03.643685 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:03.643632 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-cms96_2d55cc26-3696-48b3-88f6-736b38316363/kube-rbac-proxy-self/0.log"
Apr 23 19:17:03.668232 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:03.668166 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-cms96_2d55cc26-3696-48b3-88f6-736b38316363/openshift-state-metrics/0.log"
Apr 23 19:17:03.721502 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:03.721441 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_753ac946-1a00-4695-be26-362b5112d541/prometheus/0.log"
Apr 23 19:17:03.741373 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:03.741348 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_753ac946-1a00-4695-be26-362b5112d541/config-reloader/0.log"
Apr 23 19:17:03.774460 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:03.774397 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_753ac946-1a00-4695-be26-362b5112d541/thanos-sidecar/0.log"
Apr 23 19:17:03.798060 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:03.798001 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_753ac946-1a00-4695-be26-362b5112d541/kube-rbac-proxy-web/0.log"
Apr 23 19:17:03.828640 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:03.828614 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_753ac946-1a00-4695-be26-362b5112d541/kube-rbac-proxy/0.log"
Apr 23 19:17:03.871366 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:03.871296 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_753ac946-1a00-4695-be26-362b5112d541/kube-rbac-proxy-thanos/0.log"
Apr 23 19:17:03.895156 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:03.895122 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_753ac946-1a00-4695-be26-362b5112d541/init-config-reloader/0.log"
Apr 23 19:17:03.933639 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:03.932972 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-9dbkc_32480919-8b56-4ecd-9904-1e7674fb05bd/prometheus-operator/0.log"
Apr 23 19:17:03.956335 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:03.956260 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-9dbkc_32480919-8b56-4ecd-9904-1e7674fb05bd/kube-rbac-proxy/0.log"
Apr 23 19:17:04.047841 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:04.047813 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7df7f7d9c5-97595_35919eee-76f1-461d-b532-97c823ac5955/telemeter-client/0.log"
Apr 23 19:17:04.074551 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:04.074520 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7df7f7d9c5-97595_35919eee-76f1-461d-b532-97c823ac5955/reload/0.log"
Apr 23 19:17:04.111171 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:04.111131 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7df7f7d9c5-97595_35919eee-76f1-461d-b532-97c823ac5955/kube-rbac-proxy/0.log"
Apr 23 19:17:05.525872 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:05.525846 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-mndcn_4c1413a9-cb8c-4052-b6e8-03a3e48b07e4/networking-console-plugin/0.log"
Apr 23 19:17:06.196571 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:06.196541 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-j4sk6/perf-node-gather-daemonset-swh2t"]
Apr 23 19:17:06.200798 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:06.200774 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-swh2t"
Apr 23 19:17:06.214518 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:06.214485 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-j4sk6/perf-node-gather-daemonset-swh2t"]
Apr 23 19:17:06.284241 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:06.284193 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9dc42fd6-5fa6-4ade-bc80-45827f7cba6d-proc\") pod \"perf-node-gather-daemonset-swh2t\" (UID: \"9dc42fd6-5fa6-4ade-bc80-45827f7cba6d\") " pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-swh2t"
Apr 23 19:17:06.284241 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:06.284243 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9dc42fd6-5fa6-4ade-bc80-45827f7cba6d-sys\") pod \"perf-node-gather-daemonset-swh2t\" (UID: \"9dc42fd6-5fa6-4ade-bc80-45827f7cba6d\") " pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-swh2t"
Apr 23 19:17:06.284491 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:06.284279 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wphpw\" (UniqueName: \"kubernetes.io/projected/9dc42fd6-5fa6-4ade-bc80-45827f7cba6d-kube-api-access-wphpw\") pod \"perf-node-gather-daemonset-swh2t\" (UID: \"9dc42fd6-5fa6-4ade-bc80-45827f7cba6d\") " pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-swh2t"
Apr 23 19:17:06.284491 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:06.284378 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9dc42fd6-5fa6-4ade-bc80-45827f7cba6d-lib-modules\") pod \"perf-node-gather-daemonset-swh2t\" (UID: \"9dc42fd6-5fa6-4ade-bc80-45827f7cba6d\") " pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-swh2t"
Apr 23 19:17:06.284491 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:06.284451 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9dc42fd6-5fa6-4ade-bc80-45827f7cba6d-podres\") pod \"perf-node-gather-daemonset-swh2t\" (UID: \"9dc42fd6-5fa6-4ade-bc80-45827f7cba6d\") " pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-swh2t"
Apr 23 19:17:06.385771 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:06.385737 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9dc42fd6-5fa6-4ade-bc80-45827f7cba6d-lib-modules\") pod \"perf-node-gather-daemonset-swh2t\" (UID: \"9dc42fd6-5fa6-4ade-bc80-45827f7cba6d\") " pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-swh2t"
Apr 23 19:17:06.385937 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:06.385799 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9dc42fd6-5fa6-4ade-bc80-45827f7cba6d-podres\") pod \"perf-node-gather-daemonset-swh2t\" (UID: \"9dc42fd6-5fa6-4ade-bc80-45827f7cba6d\") " pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-swh2t"
Apr 23 19:17:06.385937 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:06.385832 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9dc42fd6-5fa6-4ade-bc80-45827f7cba6d-proc\") pod \"perf-node-gather-daemonset-swh2t\" (UID: \"9dc42fd6-5fa6-4ade-bc80-45827f7cba6d\") " pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-swh2t"
Apr 23 19:17:06.385937 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:06.385849 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9dc42fd6-5fa6-4ade-bc80-45827f7cba6d-sys\") pod \"perf-node-gather-daemonset-swh2t\" (UID: \"9dc42fd6-5fa6-4ade-bc80-45827f7cba6d\") " pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-swh2t"
Apr 23 19:17:06.385937 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:06.385882 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wphpw\" (UniqueName: \"kubernetes.io/projected/9dc42fd6-5fa6-4ade-bc80-45827f7cba6d-kube-api-access-wphpw\") pod \"perf-node-gather-daemonset-swh2t\" (UID: \"9dc42fd6-5fa6-4ade-bc80-45827f7cba6d\") " pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-swh2t"
Apr 23 19:17:06.385937 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:06.385916 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9dc42fd6-5fa6-4ade-bc80-45827f7cba6d-lib-modules\") pod \"perf-node-gather-daemonset-swh2t\" (UID: \"9dc42fd6-5fa6-4ade-bc80-45827f7cba6d\") " pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-swh2t"
Apr 23 19:17:06.386148 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:06.385936 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9dc42fd6-5fa6-4ade-bc80-45827f7cba6d-sys\") pod \"perf-node-gather-daemonset-swh2t\" (UID: \"9dc42fd6-5fa6-4ade-bc80-45827f7cba6d\") " pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-swh2t"
Apr 23 19:17:06.386148 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:06.385937 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9dc42fd6-5fa6-4ade-bc80-45827f7cba6d-proc\") pod \"perf-node-gather-daemonset-swh2t\" (UID: \"9dc42fd6-5fa6-4ade-bc80-45827f7cba6d\") " pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-swh2t"
Apr 23 19:17:06.386148 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:06.385961 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9dc42fd6-5fa6-4ade-bc80-45827f7cba6d-podres\") pod \"perf-node-gather-daemonset-swh2t\" (UID: \"9dc42fd6-5fa6-4ade-bc80-45827f7cba6d\") " pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-swh2t"
Apr 23 19:17:06.396114 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:06.396085 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wphpw\" (UniqueName: \"kubernetes.io/projected/9dc42fd6-5fa6-4ade-bc80-45827f7cba6d-kube-api-access-wphpw\") pod \"perf-node-gather-daemonset-swh2t\" (UID: \"9dc42fd6-5fa6-4ade-bc80-45827f7cba6d\") " pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-swh2t"
Apr 23 19:17:06.511189 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:06.511110 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-swh2t"
Apr 23 19:17:06.651775 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:06.651737 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-j4sk6/perf-node-gather-daemonset-swh2t"]
Apr 23 19:17:07.327420 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:07.327376 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-swh2t" event={"ID":"9dc42fd6-5fa6-4ade-bc80-45827f7cba6d","Type":"ContainerStarted","Data":"a2c5dcc98622dfb2822504845e59417460bd5cad8a34cccdd85f2009a7d59b36"}
Apr 23 19:17:07.327420 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:07.327427 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-swh2t" event={"ID":"9dc42fd6-5fa6-4ade-bc80-45827f7cba6d","Type":"ContainerStarted","Data":"525996b6cf841beceef2499ca6eee1fad946f3199874165a308312da6d4bf9bb"}
Apr 23 19:17:07.328228 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:07.328201 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-swh2t"
Apr 23 19:17:07.351461 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:07.351404 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-swh2t" podStartSLOduration=1.351385824 podStartE2EDuration="1.351385824s" podCreationTimestamp="2026-04-23 19:17:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 19:17:07.348774936 +0000 UTC m=+5746.098216738" watchObservedRunningTime="2026-04-23 19:17:07.351385824 +0000 UTC m=+5746.100827626"
Apr 23 19:17:07.950858 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:07.950811 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lg22b_3160553d-dd23-411d-afee-16ff4b1a7ca7/dns/0.log"
Apr 23 19:17:07.974355 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:07.974324 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lg22b_3160553d-dd23-411d-afee-16ff4b1a7ca7/kube-rbac-proxy/0.log"
Apr 23 19:17:08.025004 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:08.024952 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-mtx92_8e736d0c-df1a-43ad-8265-6b52f1993b03/dns-node-resolver/0.log"
Apr 23 19:17:08.668974 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:08.668945 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-6lc2b_600448d4-6de5-4739-b9f6-281d9613622c/node-ca/0.log"
Apr 23 19:17:10.250309 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:10.250279 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-jw7bs_e21a3d10-f4dc-4d03-8e8e-3718d1cb9ef9/serve-healthcheck-canary/0.log"
Apr 23 19:17:10.734193 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:10.734165 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-6gf74_58c83603-7870-44e5-ab1f-6751e80f8562/insights-operator/1.log"
Apr 23 19:17:10.734776 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:10.734751 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-6gf74_58c83603-7870-44e5-ab1f-6751e80f8562/insights-operator/0.log"
Apr 23 19:17:10.758107 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:10.758085 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-cl2hj_c5a8d4ca-dd18-4b64-918c-68c8b1037169/kube-rbac-proxy/0.log"
Apr 23 19:17:10.782499 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:10.782471 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-cl2hj_c5a8d4ca-dd18-4b64-918c-68c8b1037169/exporter/0.log"
Apr 23 19:17:10.806874 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:10.806848 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-cl2hj_c5a8d4ca-dd18-4b64-918c-68c8b1037169/extractor/0.log"
Apr 23 19:17:13.600358 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:13.600329 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-7bc7774745-77s29_942a95ab-2e00-4c6a-b0df-4c42d578f689/manager/0.log"
Apr 23 19:17:14.196974 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:14.196949 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-6fc5d867c5-mgfdv_c73bb32b-bb03-4a22-9ddd-60b15154a275/manager/0.log"
Apr 23 19:17:14.277085 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:14.277044 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-857fbd48cc-6tbhz_e83bff5e-9e58-46e3-9cfb-1489a575966f/manager/0.log"
Apr 23 19:17:14.345922 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:14.345886 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-swh2t"
Apr 23 19:17:14.532178 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:14.532094 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-kptkb_40ecb4f8-7561-4652-beab-d1fc66645999/manager/0.log"
Apr 23 19:17:21.069290 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:21.069262 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9c58k_28f30ab5-30f5-42ed-881e-4af2ffff85dc/kube-multus/0.log"
Apr 23 19:17:21.521223 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:21.521194 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xcnjl_cc5f4800-7f3a-482a-8059-189a6072df5a/kube-multus-additional-cni-plugins/0.log"
Apr 23 19:17:21.548885 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:21.548859 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xcnjl_cc5f4800-7f3a-482a-8059-189a6072df5a/egress-router-binary-copy/0.log"
Apr 23 19:17:21.575350 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:21.575321 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xcnjl_cc5f4800-7f3a-482a-8059-189a6072df5a/cni-plugins/0.log"
Apr 23 19:17:21.599775 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:21.599747 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xcnjl_cc5f4800-7f3a-482a-8059-189a6072df5a/bond-cni-plugin/0.log"
Apr 23 19:17:21.624217 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:21.624189 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xcnjl_cc5f4800-7f3a-482a-8059-189a6072df5a/routeoverride-cni/0.log"
Apr 23 19:17:21.648577 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:21.648552 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xcnjl_cc5f4800-7f3a-482a-8059-189a6072df5a/whereabouts-cni-bincopy/0.log"
Apr 23 19:17:21.672495 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:21.672447 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xcnjl_cc5f4800-7f3a-482a-8059-189a6072df5a/whereabouts-cni/0.log"
Apr 23 19:17:21.830531 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:21.830456 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-dtrg9_34a08e1b-c531-4efa-80dc-60379820b460/network-metrics-daemon/0.log"
Apr 23 19:17:21.852698 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:21.852674 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-dtrg9_34a08e1b-c531-4efa-80dc-60379820b460/kube-rbac-proxy/0.log"
Apr 23 19:17:22.735731 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:22.735701 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b27s8_7a841a79-6082-4722-b8d8-9b6626187a1f/ovn-controller/0.log"
Apr 23 19:17:22.781873 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:22.781818 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b27s8_7a841a79-6082-4722-b8d8-9b6626187a1f/ovn-acl-logging/0.log"
Apr 23 19:17:22.805001 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:22.804960 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b27s8_7a841a79-6082-4722-b8d8-9b6626187a1f/kube-rbac-proxy-node/0.log"
Apr 23 19:17:22.830824 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:22.830796 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b27s8_7a841a79-6082-4722-b8d8-9b6626187a1f/kube-rbac-proxy-ovn-metrics/0.log"
Apr 23 19:17:22.853265 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:22.853209 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b27s8_7a841a79-6082-4722-b8d8-9b6626187a1f/northd/0.log"
Apr 23 19:17:22.876797 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:22.876760 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b27s8_7a841a79-6082-4722-b8d8-9b6626187a1f/nbdb/0.log"
Apr 23 19:17:22.900817 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:22.900781 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b27s8_7a841a79-6082-4722-b8d8-9b6626187a1f/sbdb/0.log"
Apr 23 19:17:23.022211 ip-10-0-143-218 kubenswrapper[2581]: I0423 19:17:23.022095 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b27s8_7a841a79-6082-4722-b8d8-9b6626187a1f/ovnkube-controller/0.log"