Apr 16 14:50:07.824247 ip-10-0-140-83 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 14:50:07.824259 ip-10-0-140-83 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 14:50:07.824266 ip-10-0-140-83 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 14:50:07.824494 ip-10-0-140-83 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 14:50:18.045195 ip-10-0-140-83 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 14:50:18.045213 ip-10-0-140-83 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot a634bd0cff52488296ea31fa69e803df --
Apr 16 14:52:41.110180 ip-10-0-140-83 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 14:52:41.594054 ip-10-0-140-83 kubenswrapper[2568]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 14:52:41.594054 ip-10-0-140-83 kubenswrapper[2568]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 14:52:41.594054 ip-10-0-140-83 kubenswrapper[2568]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 14:52:41.594054 ip-10-0-140-83 kubenswrapper[2568]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 14:52:41.594054 ip-10-0-140-83 kubenswrapper[2568]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 14:52:41.597686 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.597623 2568 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 14:52:41.601575 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601561 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:52:41.601575 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601575 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:52:41.601642 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601581 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:52:41.601642 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601584 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:52:41.601642 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601587 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:52:41.601642 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601589 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:52:41.601642 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601592 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:52:41.601642 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601595 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:52:41.601642 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601597 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:52:41.601642 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601603 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:52:41.601642 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601606 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:52:41.601642 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601609 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:52:41.601642 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601611 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:52:41.601642 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601614 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:52:41.601642 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601616 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:52:41.601642 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601619 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:52:41.601642 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601622 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:52:41.601642 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601624 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:52:41.601642 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601627 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:52:41.601642 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601630 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:52:41.601642 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601632 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:52:41.601642 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601635 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:52:41.602142 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601637 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:52:41.602142 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601640 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:52:41.602142 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601642 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:52:41.602142 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601645 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:52:41.602142 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601648 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:52:41.602142 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601650 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:52:41.602142 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601655 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:52:41.602142 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601659 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:52:41.602142 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601663 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:52:41.602142 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601667 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:52:41.602142 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601670 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:52:41.602142 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601674 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:52:41.602142 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601677 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:52:41.602142 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601679 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:52:41.602142 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601683 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:52:41.602142 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601685 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:52:41.602142 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601688 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:52:41.602142 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601690 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:52:41.602142 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601693 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:52:41.602589 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601695 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:52:41.602589 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601697 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:52:41.602589 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601700 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:52:41.602589 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601702 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:52:41.602589 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601705 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:52:41.602589 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601707 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:52:41.602589 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601710 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:52:41.602589 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601712 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:52:41.602589 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601715 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:52:41.602589 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601717 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:52:41.602589 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601720 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:52:41.602589 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601732 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:52:41.602589 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601736 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:52:41.602589 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601738 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:52:41.602589 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601742 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:52:41.602589 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601745 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:52:41.602589 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601747 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:52:41.602589 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601750 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:52:41.602589 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601753 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:52:41.602589 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601756 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:52:41.603123 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601758 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:52:41.603123 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601761 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:52:41.603123 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601763 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:52:41.603123 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601766 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:52:41.603123 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601768 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:52:41.603123 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601771 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:52:41.603123 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601774 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:52:41.603123 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601777 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:52:41.603123 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601779 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:52:41.603123 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601781 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:52:41.603123 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601784 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:52:41.603123 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601786 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:52:41.603123 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601789 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:52:41.603123 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601791 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:52:41.603123 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601794 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:52:41.603123 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601796 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:52:41.603123 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601798 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:52:41.603123 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601801 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:52:41.603123 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601804 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:52:41.603123 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601807 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:52:41.603605 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601810 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:52:41.603605 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601813 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:52:41.603605 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601815 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:52:41.603605 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601818 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:52:41.603605 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.601820 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:52:41.603605 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602187 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:52:41.603605 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602192 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:52:41.603605 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602195 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:52:41.603605 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602198 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:52:41.603605 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602200 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:52:41.603605 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602204 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:52:41.603605 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602206 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:52:41.603605 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602209 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:52:41.603605 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602211 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:52:41.603605 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602214 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:52:41.603605 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602217 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:52:41.603605 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602219 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:52:41.603605 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602222 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:52:41.603605 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602225 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:52:41.603605 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602227 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:52:41.604104 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602230 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:52:41.604104 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602234 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:52:41.604104 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602238 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:52:41.604104 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602240 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:52:41.604104 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602243 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:52:41.604104 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602245 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:52:41.604104 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602248 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:52:41.604104 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602251 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:52:41.604104 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602253 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:52:41.604104 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602255 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:52:41.604104 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602258 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:52:41.604104 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602261 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:52:41.604104 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602263 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:52:41.604104 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602266 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:52:41.604104 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602268 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:52:41.604104 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602271 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:52:41.604104 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602274 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:52:41.604104 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602277 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:52:41.604104 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602280 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:52:41.604575 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602283 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:52:41.604575 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602285 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:52:41.604575 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602288 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:52:41.604575 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602291 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:52:41.604575 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602294 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:52:41.604575 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602297 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:52:41.604575 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602300 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:52:41.604575 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602302 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:52:41.604575 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602305 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:52:41.604575 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602307 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:52:41.604575 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602310 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:52:41.604575 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602313 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:52:41.604575 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602315 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:52:41.604575 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602318 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:52:41.604575 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602321 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:52:41.604575 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602323 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:52:41.604575 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602326 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:52:41.604575 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602328 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:52:41.604575 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602331 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:52:41.604575 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602334 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:52:41.605081 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602336 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:52:41.605081 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602339 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:52:41.605081 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602341 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:52:41.605081 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602344 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:52:41.605081 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602346 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:52:41.605081 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602349 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:52:41.605081 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602351 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:52:41.605081 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602353 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:52:41.605081 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602356 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:52:41.605081 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602358 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:52:41.605081 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602361 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:52:41.605081 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602364 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:52:41.605081 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602367 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:52:41.605081 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602369 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:52:41.605081 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602372 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:52:41.605081 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602374 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:52:41.605081 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602377 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:52:41.605081 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602380 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:52:41.605081 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602382 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:52:41.605081 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602385 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:52:41.605585 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602387 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:52:41.605585 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602390 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:52:41.605585 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602393 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:52:41.605585 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602396 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:52:41.605585 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602399 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:52:41.605585 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602401 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:52:41.605585 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602404 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:52:41.605585 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602406 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:52:41.605585 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602409 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:52:41.605585 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602412 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:52:41.605585 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602414 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:52:41.605585 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.602417 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:52:41.605585 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604435 2568 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 14:52:41.605585 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604448 2568 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 14:52:41.605585 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604454 2568 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 14:52:41.605585 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604459 2568 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 14:52:41.605585 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604463 2568 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 14:52:41.605585 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604466 2568 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 14:52:41.605585 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604470 2568 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 14:52:41.605585 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604474 2568 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 14:52:41.605585 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604478 2568 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 14:52:41.606130 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604481 2568 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 14:52:41.606130 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604484 2568 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 14:52:41.606130 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604489 2568 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 14:52:41.606130 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604492 2568 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 14:52:41.606130 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604495 2568 flags.go:64] FLAG: --cgroup-root=""
Apr 16 14:52:41.606130 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604498 2568 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 14:52:41.606130 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604501 2568 flags.go:64] FLAG: --client-ca-file=""
Apr 16 14:52:41.606130 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604504 2568 flags.go:64] FLAG: --cloud-config=""
Apr 16 14:52:41.606130 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604507 2568 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 14:52:41.606130 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604509 2568 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 14:52:41.606130 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604513 2568 flags.go:64] FLAG: --cluster-domain=""
Apr 16 14:52:41.606130 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604516 2568 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 14:52:41.606130 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604519 2568 flags.go:64] FLAG: --config-dir=""
Apr 16 14:52:41.606130 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604522 2568 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 14:52:41.606130 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604525 2568 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 14:52:41.606130 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604528 2568 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 14:52:41.606130 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604532 2568 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 14:52:41.606130 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604535 2568 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 14:52:41.606130 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604538 2568 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 14:52:41.606130 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604541 2568 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 14:52:41.606130 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604544 2568 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 14:52:41.606130 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604547 2568 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 14:52:41.606130 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604550 2568 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 14:52:41.606130 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604553 2568 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 14:52:41.606130 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604557 2568 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 14:52:41.606820 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604560 2568 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 14:52:41.606820 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604563 2568 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 14:52:41.606820 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604565 2568 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 14:52:41.606820 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604568 2568 flags.go:64] FLAG: --enable-server="true"
Apr 16 14:52:41.606820 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604571 2568 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 14:52:41.606820 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604575 2568 flags.go:64] FLAG: --event-burst="100"
Apr 16 14:52:41.606820 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604579 2568 flags.go:64] FLAG: --event-qps="50"
Apr 16 14:52:41.606820 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604582 2568 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 14:52:41.606820 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604585 2568 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 14:52:41.606820 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604589 2568 flags.go:64] FLAG: --eviction-hard=""
Apr 16 14:52:41.606820 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604593 2568 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 14:52:41.606820 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604596 2568 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 14:52:41.606820 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604599 2568 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 14:52:41.606820 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604602 2568
flags.go:64] FLAG: --eviction-soft="" Apr 16 14:52:41.606820 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604605 2568 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 16 14:52:41.606820 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604607 2568 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 16 14:52:41.606820 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604611 2568 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 14:52:41.606820 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604614 2568 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 14:52:41.606820 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604617 2568 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 14:52:41.606820 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604619 2568 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 14:52:41.606820 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604622 2568 flags.go:64] FLAG: --feature-gates="" Apr 16 14:52:41.606820 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604626 2568 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 14:52:41.606820 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604629 2568 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 14:52:41.606820 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604631 2568 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 14:52:41.606820 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604635 2568 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 14:52:41.607466 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604639 2568 flags.go:64] FLAG: --healthz-port="10248" Apr 16 14:52:41.607466 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604642 2568 flags.go:64] FLAG: --help="false" Apr 16 14:52:41.607466 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604645 2568 flags.go:64] FLAG: --hostname-override="ip-10-0-140-83.ec2.internal" Apr 16 14:52:41.607466 ip-10-0-140-83 
kubenswrapper[2568]: I0416 14:52:41.604648 2568 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 14:52:41.607466 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604651 2568 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 14:52:41.607466 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604654 2568 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 14:52:41.607466 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604657 2568 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 14:52:41.607466 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604660 2568 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 14:52:41.607466 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604663 2568 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 14:52:41.607466 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604666 2568 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 14:52:41.607466 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604668 2568 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 14:52:41.607466 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604671 2568 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 14:52:41.607466 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604674 2568 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 14:52:41.607466 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604677 2568 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 14:52:41.607466 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604680 2568 flags.go:64] FLAG: --kube-reserved="" Apr 16 14:52:41.607466 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604684 2568 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 14:52:41.607466 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604687 2568 flags.go:64] FLAG: 
--kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 14:52:41.607466 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604691 2568 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 14:52:41.607466 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604694 2568 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 14:52:41.607466 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604696 2568 flags.go:64] FLAG: --lock-file="" Apr 16 14:52:41.607466 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604699 2568 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 14:52:41.607466 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604702 2568 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 14:52:41.607466 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604705 2568 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 14:52:41.607466 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604711 2568 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 14:52:41.608100 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604713 2568 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 14:52:41.608100 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604716 2568 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 14:52:41.608100 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604719 2568 flags.go:64] FLAG: --logging-format="text" Apr 16 14:52:41.608100 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604722 2568 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 14:52:41.608100 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604725 2568 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 14:52:41.608100 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604728 2568 flags.go:64] FLAG: --manifest-url="" Apr 16 14:52:41.608100 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604730 2568 flags.go:64] FLAG: --manifest-url-header="" Apr 16 14:52:41.608100 ip-10-0-140-83 kubenswrapper[2568]: 
I0416 14:52:41.604735 2568 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 14:52:41.608100 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604738 2568 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 14:52:41.608100 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604742 2568 flags.go:64] FLAG: --max-pods="110" Apr 16 14:52:41.608100 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604745 2568 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 14:52:41.608100 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604748 2568 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 14:52:41.608100 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604751 2568 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 14:52:41.608100 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604754 2568 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 14:52:41.608100 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604757 2568 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 14:52:41.608100 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604759 2568 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 14:52:41.608100 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604762 2568 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 14:52:41.608100 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604769 2568 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 14:52:41.608100 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604772 2568 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 14:52:41.608100 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604775 2568 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 14:52:41.608100 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604778 2568 flags.go:64] FLAG: --pod-cidr="" Apr 16 14:52:41.608100 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604781 2568 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 14:52:41.608100 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604786 2568 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 14:52:41.608651 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604789 2568 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 14:52:41.608651 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604793 2568 flags.go:64] FLAG: --pods-per-core="0" Apr 16 14:52:41.608651 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604796 2568 flags.go:64] FLAG: --port="10250" Apr 16 14:52:41.608651 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604798 2568 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 14:52:41.608651 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604801 2568 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-03d4e623a73f43267" Apr 16 14:52:41.608651 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604804 2568 flags.go:64] FLAG: --qos-reserved="" Apr 16 14:52:41.608651 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604807 2568 flags.go:64] FLAG: --read-only-port="10255" Apr 16 14:52:41.608651 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604810 2568 flags.go:64] FLAG: --register-node="true" Apr 16 14:52:41.608651 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604813 2568 flags.go:64] FLAG: --register-schedulable="true" Apr 16 14:52:41.608651 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604816 2568 flags.go:64] FLAG: --register-with-taints="" Apr 16 14:52:41.608651 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604819 2568 flags.go:64] FLAG: --registry-burst="10" Apr 16 14:52:41.608651 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604822 2568 flags.go:64] FLAG: --registry-qps="5" Apr 16 14:52:41.608651 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604825 2568 flags.go:64] FLAG: --reserved-cpus="" Apr 16 14:52:41.608651 
ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604828 2568 flags.go:64] FLAG: --reserved-memory="" Apr 16 14:52:41.608651 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604831 2568 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 14:52:41.608651 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604834 2568 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 14:52:41.608651 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604837 2568 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 14:52:41.608651 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604840 2568 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 14:52:41.608651 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604843 2568 flags.go:64] FLAG: --runonce="false" Apr 16 14:52:41.608651 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604846 2568 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 14:52:41.608651 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604849 2568 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 14:52:41.608651 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604852 2568 flags.go:64] FLAG: --seccomp-default="false" Apr 16 14:52:41.608651 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604855 2568 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 14:52:41.608651 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604857 2568 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 14:52:41.608651 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604860 2568 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 14:52:41.608651 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604863 2568 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 14:52:41.609299 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604866 2568 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 14:52:41.609299 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604869 2568 flags.go:64] 
FLAG: --storage-driver-secure="false" Apr 16 14:52:41.609299 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604872 2568 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 14:52:41.609299 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604874 2568 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 14:52:41.609299 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604877 2568 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 14:52:41.609299 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604880 2568 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 14:52:41.609299 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604883 2568 flags.go:64] FLAG: --system-cgroups="" Apr 16 14:52:41.609299 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604886 2568 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 14:52:41.609299 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604904 2568 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 14:52:41.609299 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604906 2568 flags.go:64] FLAG: --tls-cert-file="" Apr 16 14:52:41.609299 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604909 2568 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 14:52:41.609299 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604913 2568 flags.go:64] FLAG: --tls-min-version="" Apr 16 14:52:41.609299 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604916 2568 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 14:52:41.609299 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604918 2568 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 14:52:41.609299 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604921 2568 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 14:52:41.609299 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604924 2568 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 14:52:41.609299 ip-10-0-140-83 
kubenswrapper[2568]: I0416 14:52:41.604927 2568 flags.go:64] FLAG: --v="2" Apr 16 14:52:41.609299 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604930 2568 flags.go:64] FLAG: --version="false" Apr 16 14:52:41.609299 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604934 2568 flags.go:64] FLAG: --vmodule="" Apr 16 14:52:41.609299 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604938 2568 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 14:52:41.609299 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.604941 2568 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 14:52:41.609299 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605037 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 14:52:41.609299 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605042 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 14:52:41.609299 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605045 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 14:52:41.609877 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605048 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 14:52:41.609877 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605050 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 14:52:41.609877 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605053 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 14:52:41.609877 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605055 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 14:52:41.609877 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605059 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 14:52:41.609877 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605061 2568 feature_gate.go:328] 
unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 14:52:41.609877 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605064 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 14:52:41.609877 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605066 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 14:52:41.609877 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605069 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 14:52:41.609877 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605071 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 14:52:41.609877 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605074 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 14:52:41.609877 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605076 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 14:52:41.609877 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605079 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 14:52:41.609877 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605081 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 14:52:41.609877 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605084 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 14:52:41.609877 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605089 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 14:52:41.609877 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605091 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 14:52:41.609877 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605094 2568 feature_gate.go:328] unrecognized feature gate: 
ManagedBootImages Apr 16 14:52:41.609877 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605097 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 14:52:41.610382 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605099 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 14:52:41.610382 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605102 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 14:52:41.610382 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605105 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 14:52:41.610382 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605107 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 14:52:41.610382 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605110 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 14:52:41.610382 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605112 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 14:52:41.610382 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605115 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 14:52:41.610382 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605117 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 14:52:41.610382 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605120 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 14:52:41.610382 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605122 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 14:52:41.610382 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605125 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 14:52:41.610382 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605127 2568 feature_gate.go:328] unrecognized 
feature gate: MachineAPIMigration Apr 16 14:52:41.610382 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605129 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 14:52:41.610382 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605132 2568 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 14:52:41.610382 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605134 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 14:52:41.610382 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605137 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 14:52:41.610382 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605141 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 14:52:41.610382 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605144 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 14:52:41.610382 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605147 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 14:52:41.610382 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605150 2568 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 14:52:41.610945 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605152 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 14:52:41.610945 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605155 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 14:52:41.610945 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605157 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 14:52:41.610945 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605160 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 14:52:41.610945 ip-10-0-140-83 
kubenswrapper[2568]: W0416 14:52:41.605164 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 14:52:41.610945 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605167 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 14:52:41.610945 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605171 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 14:52:41.610945 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605173 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 14:52:41.610945 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605178 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 14:52:41.610945 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605181 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 14:52:41.610945 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605183 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 14:52:41.610945 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605186 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 14:52:41.610945 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605189 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 14:52:41.610945 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605191 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 14:52:41.610945 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605193 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 14:52:41.610945 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605196 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 14:52:41.610945 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605198 2568 feature_gate.go:328] 
unrecognized feature gate: KMSEncryptionProvider Apr 16 14:52:41.610945 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605201 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 14:52:41.610945 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605204 2568 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 14:52:41.610945 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605206 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 14:52:41.611467 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605209 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 14:52:41.611467 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605211 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 14:52:41.611467 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605214 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 14:52:41.611467 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605216 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 14:52:41.611467 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605219 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 14:52:41.611467 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605221 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 14:52:41.611467 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605223 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 14:52:41.611467 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605226 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 14:52:41.611467 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605229 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 14:52:41.611467 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605231 2568 feature_gate.go:328] unrecognized feature gate: 
VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:52:41.611467 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605234 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:52:41.611467 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605236 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:52:41.611467 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605239 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:52:41.611467 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605241 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:52:41.611467 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605244 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:52:41.611467 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605246 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:52:41.611467 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605249 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:52:41.611467 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605251 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:52:41.611467 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605253 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:52:41.611467 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605256 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:52:41.611958 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605262 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:52:41.611958 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605264 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:52:41.611958 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605267 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:52:41.611958 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.605269 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:52:41.611958 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.605932 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 14:52:41.611958 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.611762 2568 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 14:52:41.611958 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.611776 2568 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 14:52:41.611958 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611821 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:52:41.611958 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611826 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:52:41.611958 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611829 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:52:41.611958 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611832 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:52:41.611958 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611835 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:52:41.611958 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611838 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:52:41.611958 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611841 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:52:41.611958 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611844 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:52:41.612320 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611847 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:52:41.612320 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611849 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:52:41.612320 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611853 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:52:41.612320 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611858 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:52:41.612320 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611862 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:52:41.612320 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611864 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:52:41.612320 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611867 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:52:41.612320 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611870 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:52:41.612320 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611872 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:52:41.612320 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611875 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:52:41.612320 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611878 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:52:41.612320 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611880 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:52:41.612320 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611883 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:52:41.612320 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611886 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:52:41.612320 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611903 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:52:41.612320 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611906 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:52:41.612320 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611909 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:52:41.612320 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611912 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:52:41.612320 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611914 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:52:41.612790 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611918 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:52:41.612790 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611920 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:52:41.612790 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611923 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:52:41.612790 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611926 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:52:41.612790 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611931 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:52:41.612790 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611934 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:52:41.612790 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611936 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:52:41.612790 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611939 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:52:41.612790 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611941 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:52:41.612790 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611944 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:52:41.612790 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611947 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:52:41.612790 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611950 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:52:41.612790 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611952 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:52:41.612790 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611955 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:52:41.612790 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611958 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:52:41.612790 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611961 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:52:41.612790 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611963 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:52:41.612790 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611966 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:52:41.612790 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611969 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:52:41.612790 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611972 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:52:41.613307 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611974 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:52:41.613307 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611977 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:52:41.613307 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611979 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:52:41.613307 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611983 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:52:41.613307 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611987 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:52:41.613307 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611990 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:52:41.613307 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611992 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:52:41.613307 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611995 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:52:41.613307 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.611998 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:52:41.613307 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612001 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:52:41.613307 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612003 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:52:41.613307 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612006 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:52:41.613307 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612009 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:52:41.613307 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612011 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:52:41.613307 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612014 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:52:41.613307 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612017 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:52:41.613307 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612019 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:52:41.613307 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612023 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:52:41.613307 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612025 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:52:41.613307 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612028 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:52:41.613809 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612031 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:52:41.613809 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612033 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:52:41.613809 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612036 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:52:41.613809 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612038 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:52:41.613809 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612041 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:52:41.613809 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612044 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:52:41.613809 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612047 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:52:41.613809 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612049 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:52:41.613809 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612052 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:52:41.613809 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612054 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:52:41.613809 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612057 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:52:41.613809 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612059 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:52:41.613809 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612062 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:52:41.613809 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612064 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:52:41.613809 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612067 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:52:41.613809 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612069 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:52:41.613809 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612072 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:52:41.613809 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612074 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:52:41.613809 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612077 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:52:41.614296 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.612082 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 14:52:41.614296 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612171 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:52:41.614296 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612175 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:52:41.614296 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612179 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:52:41.614296 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612184 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:52:41.614296 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612188 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:52:41.614296 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612190 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:52:41.614296 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612193 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:52:41.614296 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612196 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:52:41.614296 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612199 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:52:41.614296 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612202 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:52:41.614296 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612205 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:52:41.614296 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612207 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:52:41.614296 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612210 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:52:41.614296 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612213 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:52:41.614663 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612215 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:52:41.614663 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612218 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:52:41.614663 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612220 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:52:41.614663 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612223 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:52:41.614663 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612225 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:52:41.614663 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612228 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:52:41.614663 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612230 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:52:41.614663 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612233 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:52:41.614663 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612236 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:52:41.614663 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612239 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:52:41.614663 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612243 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:52:41.614663 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612245 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:52:41.614663 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612248 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:52:41.614663 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612250 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:52:41.614663 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612253 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:52:41.614663 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612255 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:52:41.614663 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612258 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:52:41.614663 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612260 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:52:41.614663 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612263 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:52:41.615205 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612265 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:52:41.615205 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612268 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:52:41.615205 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612271 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:52:41.615205 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612274 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:52:41.615205 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612276 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:52:41.615205 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612279 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:52:41.615205 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612282 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:52:41.615205 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612284 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:52:41.615205 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612287 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:52:41.615205 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612289 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:52:41.615205 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612292 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:52:41.615205 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612295 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:52:41.615205 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612297 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:52:41.615205 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612299 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:52:41.615205 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612302 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:52:41.615205 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612305 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:52:41.615205 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612307 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:52:41.615205 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612310 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:52:41.615205 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612312 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:52:41.615205 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612314 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:52:41.615689 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612317 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:52:41.615689 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612320 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:52:41.615689 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612322 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:52:41.615689 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612325 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:52:41.615689 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612328 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:52:41.615689 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612331 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:52:41.615689 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612333 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:52:41.615689 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612336 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:52:41.615689 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612339 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:52:41.615689 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612341 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:52:41.615689 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612344 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:52:41.615689 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612346 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:52:41.615689 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612348 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:52:41.615689 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612351 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:52:41.615689 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612354 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:52:41.615689 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612357 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:52:41.615689 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612360 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:52:41.615689 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612362 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:52:41.615689 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612365 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:52:41.615689 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612367 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:52:41.616176 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612370 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:52:41.616176 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612373 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:52:41.616176 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612375 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:52:41.616176 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612378 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:52:41.616176 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612381 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:52:41.616176 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612383 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:52:41.616176 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612386 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:52:41.616176 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612388 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:52:41.616176 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612391 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:52:41.616176 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612393 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:52:41.616176 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612396 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:52:41.616176 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612398 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:52:41.616176 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:41.612401 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:52:41.616176 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.612405 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 14:52:41.616176 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.613226 2568 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 14:52:41.621379 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.621359 2568 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 14:52:41.622719 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.622706 2568 server.go:1019] "Starting client certificate rotation"
Apr 16 14:52:41.622828 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.622811 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 14:52:41.622960 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.622863 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 14:52:41.650751 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.650728 2568 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 14:52:41.653511 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.653494 2568 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 14:52:41.668204 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.668186 2568 log.go:25] "Validated CRI v1 runtime API"
Apr 16 14:52:41.675873 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.675858 2568 log.go:25] "Validated CRI v1 image API"
Apr 16 14:52:41.677733 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.677714 2568 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 14:52:41.681616 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.681600 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 14:52:41.682780 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.682764 2568 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 b8bf4bbb-131a-4c15-a7ae-03b1b26eeb79:/dev/nvme0n1p3 fce7b1ec-823f-44f3-9462-64183c133654:/dev/nvme0n1p4]
Apr 16 14:52:41.682835 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.682780 2568 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 14:52:41.689201 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.689094 2568 manager.go:217] Machine: {Timestamp:2026-04-16 14:52:41.687101484 +0000 UTC m=+0.451053586 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101045 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec20cc152e560b046fd3e37e8338c033 SystemUUID:ec20cc15-2e56-0b04-6fd3-e37e8338c033 BootID:a634bd0c-ff52-4882-96ea-31fa69e803df Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:f2:89:d8:a2:2d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:f2:89:d8:a2:2d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:5e:42:b5:5a:06:ba Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 14:52:41.689201 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.689187 2568 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 14:52:41.689356 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.689252 2568 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 14:52:41.690240 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.690216 2568 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 14:52:41.690393 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.690243 2568 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-140-83.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percent
age":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 14:52:41.690475 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.690407 2568 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 14:52:41.690475 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.690420 2568 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 14:52:41.690475 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.690438 2568 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 14:52:41.690475 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.690456 2568 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 14:52:41.691984 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.691971 2568 state_mem.go:36] "Initialized new in-memory state store" Apr 16 14:52:41.692213 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.692200 2568 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 14:52:41.694628 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.694617 2568 kubelet.go:491] "Attempting to sync node with API server" Apr 16 14:52:41.694693 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.694635 2568 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 14:52:41.694693 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.694656 2568 file.go:69] "Watching path" 
path="/etc/kubernetes/manifests" Apr 16 14:52:41.694693 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.694668 2568 kubelet.go:397] "Adding apiserver pod source" Apr 16 14:52:41.694693 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.694680 2568 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 14:52:41.695739 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.695726 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 14:52:41.695809 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.695749 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 14:52:41.699471 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.699457 2568 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 14:52:41.701284 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.701272 2568 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 14:52:41.702783 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.702771 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 14:52:41.702843 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.702788 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 14:52:41.702843 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.702794 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 14:52:41.702843 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.702799 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 14:52:41.702843 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.702804 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 14:52:41.702843 
ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.702810 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 14:52:41.702843 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.702816 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 14:52:41.702843 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.702821 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 14:52:41.702843 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.702828 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 14:52:41.702843 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.702833 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 14:52:41.702843 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.702841 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 14:52:41.703120 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.702849 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 14:52:41.703683 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.703674 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 14:52:41.703683 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.703683 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 14:52:41.704696 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.704676 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-shm5x" Apr 16 14:52:41.706408 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.706394 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-83.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 
14:52:41.706751 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:41.706731 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-140-83.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 14:52:41.706791 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:41.706737 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 14:52:41.706997 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.706985 2568 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 14:52:41.707039 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.707018 2568 server.go:1295] "Started kubelet" Apr 16 14:52:41.707138 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.707087 2568 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 14:52:41.707633 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.707581 2568 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 14:52:41.707732 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.707651 2568 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 14:52:41.707700 ip-10-0-140-83 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 14:52:41.708199 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.708115 2568 server.go:317] "Adding debug handlers to kubelet server" Apr 16 14:52:41.709297 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.709277 2568 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 14:52:41.712848 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:41.711884 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-83.ec2.internal.18a6ddf855e5e548 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-83.ec2.internal,UID:ip-10-0-140-83.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-140-83.ec2.internal,},FirstTimestamp:2026-04-16 14:52:41.706997064 +0000 UTC m=+0.470949168,LastTimestamp:2026-04-16 14:52:41.706997064 +0000 UTC m=+0.470949168,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-83.ec2.internal,}" Apr 16 14:52:41.714160 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.714134 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-shm5x" Apr 16 14:52:41.714160 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.714156 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 14:52:41.714732 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.714713 2568 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 14:52:41.715492 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.715471 2568 volume_manager.go:295] "The 
desired_state_of_world populator starts" Apr 16 14:52:41.715492 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.715492 2568 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 14:52:41.715626 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.715608 2568 reconstruct.go:97] "Volume reconstruction finished" Apr 16 14:52:41.715626 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.715617 2568 reconciler.go:26] "Reconciler: start to sync state" Apr 16 14:52:41.715716 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.715668 2568 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 14:52:41.715878 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:41.715860 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-83.ec2.internal\" not found" Apr 16 14:52:41.717239 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.716748 2568 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 14:52:41.717324 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.717309 2568 factory.go:55] Registering systemd factory Apr 16 14:52:41.717375 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.717327 2568 factory.go:223] Registration of the systemd container factory successfully Apr 16 14:52:41.717653 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.717627 2568 factory.go:153] Registering CRI-O factory Apr 16 14:52:41.717653 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.717643 2568 factory.go:223] Registration of the crio container factory successfully Apr 16 14:52:41.717787 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.717666 2568 factory.go:103] Registering Raw factory Apr 16 14:52:41.717787 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.717682 2568 manager.go:1196] Started watching for new ooms 
in manager Apr 16 14:52:41.718244 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.718150 2568 manager.go:319] Starting recovery of all containers Apr 16 14:52:41.719294 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:41.719275 2568 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 14:52:41.729479 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.729387 2568 manager.go:324] Recovery completed Apr 16 14:52:41.729692 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.729676 2568 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:41.732776 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:41.732760 2568 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-140-83.ec2.internal\" not found" node="ip-10-0-140-83.ec2.internal" Apr 16 14:52:41.733458 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.733446 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:41.737420 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.737408 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-83.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:41.737477 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.737432 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-83.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:41.737477 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.737442 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-83.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:41.737849 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.737831 2568 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 14:52:41.737849 ip-10-0-140-83 
kubenswrapper[2568]: I0416 14:52:41.737842 2568 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 14:52:41.737977 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.737857 2568 state_mem.go:36] "Initialized new in-memory state store" Apr 16 14:52:41.740327 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.740317 2568 policy_none.go:49] "None policy: Start" Apr 16 14:52:41.740366 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.740330 2568 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 14:52:41.740366 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.740339 2568 state_mem.go:35] "Initializing new in-memory state store" Apr 16 14:52:41.780448 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.780434 2568 manager.go:341] "Starting Device Plugin manager" Apr 16 14:52:41.796531 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:41.780465 2568 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 14:52:41.796531 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.780476 2568 server.go:85] "Starting device plugin registration server" Apr 16 14:52:41.796531 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.780682 2568 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 14:52:41.796531 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.780696 2568 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 14:52:41.796531 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.780773 2568 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 14:52:41.796531 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.780827 2568 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 14:52:41.796531 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.780833 2568 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 
14:52:41.796531 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:41.781367 2568 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 14:52:41.796531 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:41.781411 2568 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-140-83.ec2.internal\" not found" Apr 16 14:52:41.857129 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.857076 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 14:52:41.858238 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.858222 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 14:52:41.858281 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.858252 2568 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 14:52:41.858281 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.858274 2568 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 14:52:41.858362 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.858282 2568 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 14:52:41.858362 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:41.858318 2568 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 14:52:41.860903 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.860877 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:41.881250 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.881235 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:41.882042 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.882026 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-83.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:41.882128 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.882051 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-83.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:41.882128 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.882068 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-83.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:41.882128 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.882086 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-140-83.ec2.internal" Apr 16 14:52:41.891798 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.891783 2568 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-140-83.ec2.internal" Apr 16 14:52:41.891847 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:41.891803 2568 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-140-83.ec2.internal\": node \"ip-10-0-140-83.ec2.internal\" not found" Apr 16 14:52:41.918176 
ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:41.918159 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-83.ec2.internal\" not found" Apr 16 14:52:41.959013 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.958992 2568 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-83.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-140-83.ec2.internal"] Apr 16 14:52:41.959089 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.959048 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:41.960852 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.960839 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-83.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:41.960932 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.960870 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-83.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:41.960932 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.960880 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-83.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:41.962397 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.962385 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:41.962560 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.962548 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-83.ec2.internal" Apr 16 14:52:41.962605 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.962573 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:41.963106 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.963087 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-83.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:41.963192 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.963113 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-83.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:41.963192 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.963128 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-83.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:41.963192 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.963136 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-83.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:41.963192 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.963116 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-83.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:41.963324 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.963216 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-83.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:41.964954 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.964941 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-83.ec2.internal" Apr 16 14:52:41.965004 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.964965 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:41.965690 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.965668 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-83.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:41.965769 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.965705 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-83.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:41.965769 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:41.965736 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-83.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:41.988432 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:41.988419 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-83.ec2.internal\" not found" node="ip-10-0-140-83.ec2.internal" Apr 16 14:52:41.991872 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:41.991856 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-83.ec2.internal\" not found" node="ip-10-0-140-83.ec2.internal" Apr 16 14:52:42.018721 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:42.018700 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-83.ec2.internal\" not found" Apr 16 14:52:42.117489 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:42.117430 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/187d013988eca3fa2c04c9f72ca0653f-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-140-83.ec2.internal\" (UID: \"187d013988eca3fa2c04c9f72ca0653f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-83.ec2.internal" Apr 16 14:52:42.117591 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:42.117484 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/187d013988eca3fa2c04c9f72ca0653f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-83.ec2.internal\" (UID: \"187d013988eca3fa2c04c9f72ca0653f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-83.ec2.internal" Apr 16 14:52:42.117591 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:42.117552 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/66ca7287f809a0a0d2312e14abcec99e-config\") pod \"kube-apiserver-proxy-ip-10-0-140-83.ec2.internal\" (UID: \"66ca7287f809a0a0d2312e14abcec99e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-83.ec2.internal" Apr 16 14:52:42.119505 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:42.119491 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-83.ec2.internal\" not found" Apr 16 14:52:42.218138 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:42.218118 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/187d013988eca3fa2c04c9f72ca0653f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-83.ec2.internal\" (UID: \"187d013988eca3fa2c04c9f72ca0653f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-83.ec2.internal" Apr 16 14:52:42.218219 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:42.218143 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/187d013988eca3fa2c04c9f72ca0653f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-83.ec2.internal\" (UID: \"187d013988eca3fa2c04c9f72ca0653f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-83.ec2.internal" Apr 16 14:52:42.218219 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:42.218161 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/66ca7287f809a0a0d2312e14abcec99e-config\") pod \"kube-apiserver-proxy-ip-10-0-140-83.ec2.internal\" (UID: \"66ca7287f809a0a0d2312e14abcec99e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-83.ec2.internal" Apr 16 14:52:42.218219 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:42.218185 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/66ca7287f809a0a0d2312e14abcec99e-config\") pod \"kube-apiserver-proxy-ip-10-0-140-83.ec2.internal\" (UID: \"66ca7287f809a0a0d2312e14abcec99e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-83.ec2.internal" Apr 16 14:52:42.218219 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:42.218202 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/187d013988eca3fa2c04c9f72ca0653f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-83.ec2.internal\" (UID: \"187d013988eca3fa2c04c9f72ca0653f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-83.ec2.internal" Apr 16 14:52:42.218369 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:42.218236 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/187d013988eca3fa2c04c9f72ca0653f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-83.ec2.internal\" (UID: \"187d013988eca3fa2c04c9f72ca0653f\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-83.ec2.internal"
Apr 16 14:52:42.220224 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:42.220207 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-83.ec2.internal\" not found"
Apr 16 14:52:42.292437 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:42.292414 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-83.ec2.internal"
Apr 16 14:52:42.295102 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:42.295079 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-83.ec2.internal"
Apr 16 14:52:42.320878 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:42.320848 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-83.ec2.internal\" not found"
Apr 16 14:52:42.421429 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:42.421376 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-83.ec2.internal\" not found"
Apr 16 14:52:42.521942 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:42.521922 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-83.ec2.internal\" not found"
Apr 16 14:52:42.622491 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:42.622468 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-83.ec2.internal\" not found"
Apr 16 14:52:42.622491 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:42.622470 2568 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 14:52:42.622944 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:42.622602 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 14:52:42.622944 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:42.622627 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 14:52:42.715212 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:42.715194 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 14:52:42.715966 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:42.715943 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 14:47:41 +0000 UTC" deadline="2027-09-24 07:32:17.123290278 +0000 UTC"
Apr 16 14:52:42.716035 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:42.715978 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12616h39m34.407328701s"
Apr 16 14:52:42.723451 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:42.723433 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-83.ec2.internal\" not found"
Apr 16 14:52:42.729278 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:42.729262 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 14:52:42.758667 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:42.758647 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-9mvw6"
Apr 16 14:52:42.767557 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:42.767541 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-9mvw6"
Apr 16 14:52:42.805982 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:42.805951 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66ca7287f809a0a0d2312e14abcec99e.slice/crio-c22a695d36f9a04af00bcff89a6ddc9f497510fd4ab0db591ebd133302ad9a3a WatchSource:0}: Error finding container c22a695d36f9a04af00bcff89a6ddc9f497510fd4ab0db591ebd133302ad9a3a: Status 404 returned error can't find the container with id c22a695d36f9a04af00bcff89a6ddc9f497510fd4ab0db591ebd133302ad9a3a
Apr 16 14:52:42.806275 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:42.806252 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod187d013988eca3fa2c04c9f72ca0653f.slice/crio-5c6e65e1a6afaaa3db75e65d98e5dce3de5a235a7559f8731d212420cf921b73 WatchSource:0}: Error finding container 5c6e65e1a6afaaa3db75e65d98e5dce3de5a235a7559f8731d212420cf921b73: Status 404 returned error can't find the container with id 5c6e65e1a6afaaa3db75e65d98e5dce3de5a235a7559f8731d212420cf921b73
Apr 16 14:52:42.812016 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:42.812002 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 14:52:42.824159 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:42.824138 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-83.ec2.internal\" not found"
Apr 16 14:52:42.861431 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:42.861388 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-83.ec2.internal" event={"ID":"187d013988eca3fa2c04c9f72ca0653f","Type":"ContainerStarted","Data":"5c6e65e1a6afaaa3db75e65d98e5dce3de5a235a7559f8731d212420cf921b73"}
Apr 16 14:52:42.862318 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:42.862299 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-83.ec2.internal" event={"ID":"66ca7287f809a0a0d2312e14abcec99e","Type":"ContainerStarted","Data":"c22a695d36f9a04af00bcff89a6ddc9f497510fd4ab0db591ebd133302ad9a3a"}
Apr 16 14:52:42.924823 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:42.924800 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-83.ec2.internal\" not found"
Apr 16 14:52:43.011651 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.011597 2568 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 14:52:43.025016 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:43.024992 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-83.ec2.internal\" not found"
Apr 16 14:52:43.125533 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:43.125511 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-83.ec2.internal\" not found"
Apr 16 14:52:43.213805 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.213780 2568 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 14:52:43.215885 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.215700 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-83.ec2.internal"
Apr 16 14:52:43.227449 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.227428 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 14:52:43.228454 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.228433 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-83.ec2.internal"
Apr 16 14:52:43.238051 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.237979 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 14:52:43.696501 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.696475 2568 apiserver.go:52] "Watching apiserver"
Apr 16 14:52:43.702559 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.702536 2568 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 14:52:43.703834 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.703811 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-cbb9d","openshift-ovn-kubernetes/ovnkube-node-bff4l","openshift-cluster-node-tuning-operator/tuned-bp89w","openshift-dns/node-resolver-s956z","openshift-image-registry/node-ca-tp4rc","openshift-multus/multus-additional-cni-plugins-5ch67","openshift-multus/network-metrics-daemon-twkfq","kube-system/konnectivity-agent-cvndk","kube-system/kube-apiserver-proxy-ip-10-0-140-83.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c98zb","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-83.ec2.internal","openshift-multus/multus-jscc7","openshift-network-diagnostics/network-check-target-wr4kj"]
Apr 16 14:52:43.705797 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.705775 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c98zb"
Apr 16 14:52:43.707269 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.707244 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bff4l"
Apr 16 14:52:43.708051 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.708032 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 14:52:43.708297 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.708279 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 14:52:43.708473 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.708447 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 14:52:43.708547 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.708531 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-qk77r\""
Apr 16 14:52:43.708623 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.708607 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-tp4rc"
Apr 16 14:52:43.709517 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.709444 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 14:52:43.709517 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.709513 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 14:52:43.709781 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.709735 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 14:52:43.709881 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.709835 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 14:52:43.709996 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.709884 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 14:52:43.709996 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.709984 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 14:52:43.710260 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.710241 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-nvq22\""
Apr 16 14:52:43.710748 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.710709 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 14:52:43.710985 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.710965 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 14:52:43.710985 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.710976 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 14:52:43.711130 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.711009 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-t8kzr\""
Apr 16 14:52:43.711461 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.711445 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-twkfq"
Apr 16 14:52:43.711571 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:43.711548 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-twkfq" podUID="b932e53d-9993-47b8-a2cb-940fc759370d"
Apr 16 14:52:43.711635 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.711600 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-cbb9d"
Apr 16 14:52:43.713054 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.713035 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jscc7"
Apr 16 14:52:43.714022 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.713942 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 14:52:43.714022 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.713996 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 14:52:43.714022 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.714011 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 14:52:43.714205 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.714003 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-h65jw\""
Apr 16 14:52:43.714617 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.714600 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wr4kj"
Apr 16 14:52:43.714745 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:43.714718 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wr4kj" podUID="d293eadf-7dbb-4770-b547-d28be07dbdf1"
Apr 16 14:52:43.715413 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.715396 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 14:52:43.715525 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.715507 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-lcdvs\""
Apr 16 14:52:43.716118 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.715749 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 14:52:43.716118 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.715794 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 14:52:43.716118 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.716110 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 14:52:43.717044 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.717025 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bp89w"
Apr 16 14:52:43.719397 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.719377 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-x74kr\""
Apr 16 14:52:43.719673 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.719656 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 14:52:43.720126 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.719848 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 14:52:43.720372 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.720356 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-s956z"
Apr 16 14:52:43.722631 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.722616 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 14:52:43.722733 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.722668 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-nqc6k\""
Apr 16 14:52:43.722880 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.722863 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 14:52:43.724337 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.724318 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5ch67"
Apr 16 14:52:43.725657 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.725637 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-cvndk"
Apr 16 14:52:43.725657 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.725646 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l"
Apr 16 14:52:43.725821 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.725680 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxk5n\" (UniqueName: \"kubernetes.io/projected/927adbe7-9da4-4d4c-9bd5-3f36e9ef8978-kube-api-access-lxk5n\") pod \"node-ca-tp4rc\" (UID: \"927adbe7-9da4-4d4c-9bd5-3f36e9ef8978\") " pod="openshift-image-registry/node-ca-tp4rc"
Apr 16 14:52:43.725821 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.725708 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-multus-cni-dir\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7"
Apr 16 14:52:43.725821 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.725740 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-run-ovn\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l"
Apr 16 14:52:43.725821 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.725759 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-host-cni-bin\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l"
Apr 16 14:52:43.725821 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.725777 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-ovnkube-script-lib\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l"
Apr 16 14:52:43.725821 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.725791 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-os-release\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7"
Apr 16 14:52:43.725821 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.725805 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-etc-kubernetes\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7"
Apr 16 14:52:43.725821 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.725820 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-systemd-units\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l"
Apr 16 14:52:43.726237 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.725834 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-system-cni-dir\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7"
Apr 16 14:52:43.726237 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.725849 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-host-run-k8s-cni-cncf-io\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7"
Apr 16 14:52:43.726237 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.725866 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-hostroot\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7"
Apr 16 14:52:43.726237 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.725879 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0d3f1567-ac34-4e7c-a3aa-732f78d80a79-registration-dir\") pod \"aws-ebs-csi-driver-node-c98zb\" (UID: \"0d3f1567-ac34-4e7c-a3aa-732f78d80a79\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c98zb"
Apr 16 14:52:43.726237 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.725911 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/04361303-4b68-4a7e-b73e-a4329bb6bb65-var-lib-kubelet\") pod \"tuned-bp89w\" (UID: \"04361303-4b68-4a7e-b73e-a4329bb6bb65\") " pod="openshift-cluster-node-tuning-operator/tuned-bp89w"
Apr 16 14:52:43.726237 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.725927 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvlrv\" (UniqueName: \"kubernetes.io/projected/b932e53d-9993-47b8-a2cb-940fc759370d-kube-api-access-nvlrv\") pod \"network-metrics-daemon-twkfq\" (UID: \"b932e53d-9993-47b8-a2cb-940fc759370d\") " pod="openshift-multus/network-metrics-daemon-twkfq"
Apr 16 14:52:43.726237 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.725940 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-log-socket\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l"
Apr 16 14:52:43.726237 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.725958 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-host-run-ovn-kubernetes\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l"
Apr 16 14:52:43.726237 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.725981 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/04361303-4b68-4a7e-b73e-a4329bb6bb65-etc-systemd\") pod \"tuned-bp89w\" (UID: \"04361303-4b68-4a7e-b73e-a4329bb6bb65\") " pod="openshift-cluster-node-tuning-operator/tuned-bp89w"
Apr 16 14:52:43.726237 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.725995 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/927adbe7-9da4-4d4c-9bd5-3f36e9ef8978-serviceca\") pod \"node-ca-tp4rc\" (UID: \"927adbe7-9da4-4d4c-9bd5-3f36e9ef8978\") " pod="openshift-image-registry/node-ca-tp4rc"
Apr 16 14:52:43.726237 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.726009 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-host-var-lib-cni-bin\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7"
Apr 16 14:52:43.726237 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.726022 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-multus-daemon-config\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7"
Apr 16 14:52:43.726237 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.726043 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0d3f1567-ac34-4e7c-a3aa-732f78d80a79-socket-dir\") pod \"aws-ebs-csi-driver-node-c98zb\" (UID: \"0d3f1567-ac34-4e7c-a3aa-732f78d80a79\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c98zb"
Apr 16 14:52:43.726237 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.726065 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0d3f1567-ac34-4e7c-a3aa-732f78d80a79-etc-selinux\") pod \"aws-ebs-csi-driver-node-c98zb\" (UID: \"0d3f1567-ac34-4e7c-a3aa-732f78d80a79\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c98zb"
Apr 16 14:52:43.726237 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.726090 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-run-systemd\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l"
Apr 16 14:52:43.726237 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.726122 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/04361303-4b68-4a7e-b73e-a4329bb6bb65-etc-sysctl-d\") pod \"tuned-bp89w\" (UID: \"04361303-4b68-4a7e-b73e-a4329bb6bb65\") " pod="openshift-cluster-node-tuning-operator/tuned-bp89w"
Apr 16 14:52:43.727029 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.726161 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-host-kubelet\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l"
Apr 16 14:52:43.727029 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.726200 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/927adbe7-9da4-4d4c-9bd5-3f36e9ef8978-host\") pod \"node-ca-tp4rc\" (UID: \"927adbe7-9da4-4d4c-9bd5-3f36e9ef8978\") " pod="openshift-image-registry/node-ca-tp4rc"
Apr 16 14:52:43.727029 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.726246 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0d3f1567-ac34-4e7c-a3aa-732f78d80a79-kubelet-dir\") pod \"aws-ebs-csi-driver-node-c98zb\" (UID: \"0d3f1567-ac34-4e7c-a3aa-732f78d80a79\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c98zb"
Apr 16 14:52:43.727029 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.726308 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/04361303-4b68-4a7e-b73e-a4329bb6bb65-host\") pod \"tuned-bp89w\" (UID: \"04361303-4b68-4a7e-b73e-a4329bb6bb65\") " pod="openshift-cluster-node-tuning-operator/tuned-bp89w"
Apr 16 14:52:43.727029 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.726338 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-host-var-lib-cni-multus\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7"
Apr 16 14:52:43.727029 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.726362 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-ovnkube-config\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l"
Apr 16 14:52:43.727029 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.726385 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-ovn-node-metrics-cert\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l"
Apr 16 14:52:43.727029 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.726435 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/04361303-4b68-4a7e-b73e-a4329bb6bb65-etc-sysctl-conf\") pod \"tuned-bp89w\" (UID: \"04361303-4b68-4a7e-b73e-a4329bb6bb65\") " pod="openshift-cluster-node-tuning-operator/tuned-bp89w"
Apr 16 14:52:43.727029 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.726486 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-multus-socket-dir-parent\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7"
Apr 16 14:52:43.727029 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.726500 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-wdc67\""
Apr 16 14:52:43.727029 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.726493 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 14:52:43.727029 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.726517 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-node-log\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l"
Apr 16 14:52:43.727029 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.726554 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b932e53d-9993-47b8-a2cb-940fc759370d-metrics-certs\") pod \"network-metrics-daemon-twkfq\" (UID: \"b932e53d-9993-47b8-a2cb-940fc759370d\") " pod="openshift-multus/network-metrics-daemon-twkfq"
Apr 16 14:52:43.727029 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.726591 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4fc0480c-be78-4a13-8001-9c955eec95e1-host-slash\") pod \"iptables-alerter-cbb9d\" (UID: \"4fc0480c-be78-4a13-8001-9c955eec95e1\") " pod="openshift-network-operator/iptables-alerter-cbb9d"
Apr 16 14:52:43.727029 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.726632 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/04361303-4b68-4a7e-b73e-a4329bb6bb65-etc-kubernetes\") pod \"tuned-bp89w\" (UID: \"04361303-4b68-4a7e-b73e-a4329bb6bb65\") " pod="openshift-cluster-node-tuning-operator/tuned-bp89w"
Apr 16 14:52:43.727029 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.726659 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 14:52:43.727029 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.726675 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-run-openvswitch\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l"
Apr 16 14:52:43.727029 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.726704 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/04361303-4b68-4a7e-b73e-a4329bb6bb65-etc-sysconfig\") pod \"tuned-bp89w\" (UID: \"04361303-4b68-4a7e-b73e-a4329bb6bb65\") " pod="openshift-cluster-node-tuning-operator/tuned-bp89w"
Apr 16 14:52:43.727832 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.726745 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-cni-binary-copy\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7"
Apr 16 14:52:43.727832 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.726766 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-host-var-lib-kubelet\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7"
Apr 16 14:52:43.727832 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.726812 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-multus-conf-dir\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7"
Apr 16 14:52:43.727832 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.726859 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0d3f1567-ac34-4e7c-a3aa-732f78d80a79-sys-fs\") pod \"aws-ebs-csi-driver-node-c98zb\" (UID: \"0d3f1567-ac34-4e7c-a3aa-732f78d80a79\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c98zb"
Apr 16 14:52:43.727832 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.726884 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-host-run-netns\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l"
Apr 16 14:52:43.727832 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.726923 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-etc-openvswitch\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l"
Apr 16 14:52:43.727832 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.726946 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/04361303-4b68-4a7e-b73e-a4329bb6bb65-sys\") pod \"tuned-bp89w\" (UID: \"04361303-4b68-4a7e-b73e-a4329bb6bb65\") " pod="openshift-cluster-node-tuning-operator/tuned-bp89w"
Apr 16 14:52:43.727832 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.726968 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/04361303-4b68-4a7e-b73e-a4329bb6bb65-lib-modules\") pod \"tuned-bp89w\" (UID: \"04361303-4b68-4a7e-b73e-a4329bb6bb65\") " pod="openshift-cluster-node-tuning-operator/tuned-bp89w"
Apr 16 14:52:43.727832 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.726992 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8w2c\" (UniqueName: \"kubernetes.io/projected/4fc0480c-be78-4a13-8001-9c955eec95e1-kube-api-access-g8w2c\") pod \"iptables-alerter-cbb9d\" (UID: \"4fc0480c-be78-4a13-8001-9c955eec95e1\") " pod="openshift-network-operator/iptables-alerter-cbb9d"
Apr 16 14:52:43.727832 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.727017 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0d3f1567-ac34-4e7c-a3aa-732f78d80a79-device-dir\") pod \"aws-ebs-csi-driver-node-c98zb\" (UID: \"0d3f1567-ac34-4e7c-a3aa-732f78d80a79\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c98zb"
Apr 16 14:52:43.727832 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.727040 2568 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl5lz\" (UniqueName: \"kubernetes.io/projected/0d3f1567-ac34-4e7c-a3aa-732f78d80a79-kube-api-access-vl5lz\") pod \"aws-ebs-csi-driver-node-c98zb\" (UID: \"0d3f1567-ac34-4e7c-a3aa-732f78d80a79\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c98zb" Apr 16 14:52:43.727832 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.727062 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-var-lib-openvswitch\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" Apr 16 14:52:43.727832 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.727098 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94s6x\" (UniqueName: \"kubernetes.io/projected/d293eadf-7dbb-4770-b547-d28be07dbdf1-kube-api-access-94s6x\") pod \"network-check-target-wr4kj\" (UID: \"d293eadf-7dbb-4770-b547-d28be07dbdf1\") " pod="openshift-network-diagnostics/network-check-target-wr4kj" Apr 16 14:52:43.727832 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.727127 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/04361303-4b68-4a7e-b73e-a4329bb6bb65-etc-modprobe-d\") pod \"tuned-bp89w\" (UID: \"04361303-4b68-4a7e-b73e-a4329bb6bb65\") " pod="openshift-cluster-node-tuning-operator/tuned-bp89w" Apr 16 14:52:43.727832 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.727161 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/04361303-4b68-4a7e-b73e-a4329bb6bb65-tmp\") pod \"tuned-bp89w\" (UID: 
\"04361303-4b68-4a7e-b73e-a4329bb6bb65\") " pod="openshift-cluster-node-tuning-operator/tuned-bp89w" Apr 16 14:52:43.727832 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.727183 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2gfs\" (UniqueName: \"kubernetes.io/projected/04361303-4b68-4a7e-b73e-a4329bb6bb65-kube-api-access-v2gfs\") pod \"tuned-bp89w\" (UID: \"04361303-4b68-4a7e-b73e-a4329bb6bb65\") " pod="openshift-cluster-node-tuning-operator/tuned-bp89w" Apr 16 14:52:43.728559 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.727222 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-cnibin\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7" Apr 16 14:52:43.728559 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.727252 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-host-slash\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" Apr 16 14:52:43.728559 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.727303 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-host-cni-netd\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" Apr 16 14:52:43.728559 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.727334 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-env-overrides\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" Apr 16 14:52:43.728559 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.727357 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/04361303-4b68-4a7e-b73e-a4329bb6bb65-run\") pod \"tuned-bp89w\" (UID: \"04361303-4b68-4a7e-b73e-a4329bb6bb65\") " pod="openshift-cluster-node-tuning-operator/tuned-bp89w" Apr 16 14:52:43.728559 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.727380 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/04361303-4b68-4a7e-b73e-a4329bb6bb65-etc-tuned\") pod \"tuned-bp89w\" (UID: \"04361303-4b68-4a7e-b73e-a4329bb6bb65\") " pod="openshift-cluster-node-tuning-operator/tuned-bp89w" Apr 16 14:52:43.728559 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.727403 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-host-run-netns\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7" Apr 16 14:52:43.728559 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.727427 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5tlp\" (UniqueName: \"kubernetes.io/projected/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-kube-api-access-v5tlp\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" Apr 16 14:52:43.728559 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.727457 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4fc0480c-be78-4a13-8001-9c955eec95e1-iptables-alerter-script\") pod \"iptables-alerter-cbb9d\" (UID: \"4fc0480c-be78-4a13-8001-9c955eec95e1\") " pod="openshift-network-operator/iptables-alerter-cbb9d" Apr 16 14:52:43.728559 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.727517 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-host-run-multus-certs\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7" Apr 16 14:52:43.728559 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.727539 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c77t\" (UniqueName: \"kubernetes.io/projected/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-kube-api-access-7c77t\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7" Apr 16 14:52:43.728559 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.727909 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 14:52:43.728559 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.728054 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 14:52:43.728559 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.728110 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-5ds8c\"" Apr 16 14:52:43.768501 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.768476 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 
14:47:42 +0000 UTC" deadline="2027-12-21 11:40:26.675095606 +0000 UTC" Apr 16 14:52:43.768501 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.768498 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14732h47m42.906599813s" Apr 16 14:52:43.812085 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.812068 2568 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:43.816339 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.816322 2568 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 14:52:43.827951 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.827931 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-run-openvswitch\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" Apr 16 14:52:43.828034 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.827962 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/04361303-4b68-4a7e-b73e-a4329bb6bb65-etc-sysconfig\") pod \"tuned-bp89w\" (UID: \"04361303-4b68-4a7e-b73e-a4329bb6bb65\") " pod="openshift-cluster-node-tuning-operator/tuned-bp89w" Apr 16 14:52:43.828034 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.828000 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-cni-binary-copy\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7" Apr 16 14:52:43.828034 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.828027 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-host-var-lib-kubelet\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7" Apr 16 14:52:43.828196 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.828030 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-run-openvswitch\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" Apr 16 14:52:43.828196 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.828052 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-multus-conf-dir\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7" Apr 16 14:52:43.828196 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.828075 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/04361303-4b68-4a7e-b73e-a4329bb6bb65-etc-sysconfig\") pod \"tuned-bp89w\" (UID: \"04361303-4b68-4a7e-b73e-a4329bb6bb65\") " pod="openshift-cluster-node-tuning-operator/tuned-bp89w" Apr 16 14:52:43.828196 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.828086 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-host-var-lib-kubelet\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7" Apr 16 14:52:43.828196 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.828088 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f1f8e071-c6dd-48e5-b5d3-e096d8a24e92-tmp-dir\") pod \"node-resolver-s956z\" (UID: \"f1f8e071-c6dd-48e5-b5d3-e096d8a24e92\") " pod="openshift-dns/node-resolver-s956z" Apr 16 14:52:43.828196 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.828130 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f6d1b1db-1136-4814-bd98-1f4c0d57bd3a-konnectivity-ca\") pod \"konnectivity-agent-cvndk\" (UID: \"f6d1b1db-1136-4814-bd98-1f4c0d57bd3a\") " pod="kube-system/konnectivity-agent-cvndk" Apr 16 14:52:43.828196 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.828127 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-multus-conf-dir\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7" Apr 16 14:52:43.828196 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.828172 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0d3f1567-ac34-4e7c-a3aa-732f78d80a79-sys-fs\") pod \"aws-ebs-csi-driver-node-c98zb\" (UID: \"0d3f1567-ac34-4e7c-a3aa-732f78d80a79\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c98zb" Apr 16 14:52:43.828548 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.828201 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0d3f1567-ac34-4e7c-a3aa-732f78d80a79-sys-fs\") pod \"aws-ebs-csi-driver-node-c98zb\" (UID: \"0d3f1567-ac34-4e7c-a3aa-732f78d80a79\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c98zb" Apr 16 14:52:43.828548 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.828198 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-host-run-netns\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" Apr 16 14:52:43.828548 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.828245 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-etc-openvswitch\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" Apr 16 14:52:43.828548 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.828269 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/04361303-4b68-4a7e-b73e-a4329bb6bb65-sys\") pod \"tuned-bp89w\" (UID: \"04361303-4b68-4a7e-b73e-a4329bb6bb65\") " pod="openshift-cluster-node-tuning-operator/tuned-bp89w" Apr 16 14:52:43.828548 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.828294 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/04361303-4b68-4a7e-b73e-a4329bb6bb65-lib-modules\") pod \"tuned-bp89w\" (UID: \"04361303-4b68-4a7e-b73e-a4329bb6bb65\") " pod="openshift-cluster-node-tuning-operator/tuned-bp89w" Apr 16 14:52:43.828548 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.828318 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/04361303-4b68-4a7e-b73e-a4329bb6bb65-sys\") pod \"tuned-bp89w\" (UID: \"04361303-4b68-4a7e-b73e-a4329bb6bb65\") " pod="openshift-cluster-node-tuning-operator/tuned-bp89w" Apr 16 14:52:43.828548 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.828323 2568 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-etc-openvswitch\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" Apr 16 14:52:43.828548 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.828319 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g8w2c\" (UniqueName: \"kubernetes.io/projected/4fc0480c-be78-4a13-8001-9c955eec95e1-kube-api-access-g8w2c\") pod \"iptables-alerter-cbb9d\" (UID: \"4fc0480c-be78-4a13-8001-9c955eec95e1\") " pod="openshift-network-operator/iptables-alerter-cbb9d" Apr 16 14:52:43.828548 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.828364 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4148d88d-fea6-4539-8c24-81aff5a953f7-os-release\") pod \"multus-additional-cni-plugins-5ch67\" (UID: \"4148d88d-fea6-4539-8c24-81aff5a953f7\") " pod="openshift-multus/multus-additional-cni-plugins-5ch67" Apr 16 14:52:43.828548 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.828362 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-host-run-netns\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" Apr 16 14:52:43.828548 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.828395 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0d3f1567-ac34-4e7c-a3aa-732f78d80a79-device-dir\") pod \"aws-ebs-csi-driver-node-c98zb\" (UID: \"0d3f1567-ac34-4e7c-a3aa-732f78d80a79\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c98zb" Apr 16 14:52:43.828548 ip-10-0-140-83 
kubenswrapper[2568]: I0416 14:52:43.828417 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vl5lz\" (UniqueName: \"kubernetes.io/projected/0d3f1567-ac34-4e7c-a3aa-732f78d80a79-kube-api-access-vl5lz\") pod \"aws-ebs-csi-driver-node-c98zb\" (UID: \"0d3f1567-ac34-4e7c-a3aa-732f78d80a79\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c98zb" Apr 16 14:52:43.828548 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.828433 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-var-lib-openvswitch\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" Apr 16 14:52:43.828548 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.828445 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/04361303-4b68-4a7e-b73e-a4329bb6bb65-lib-modules\") pod \"tuned-bp89w\" (UID: \"04361303-4b68-4a7e-b73e-a4329bb6bb65\") " pod="openshift-cluster-node-tuning-operator/tuned-bp89w" Apr 16 14:52:43.828548 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.828452 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-94s6x\" (UniqueName: \"kubernetes.io/projected/d293eadf-7dbb-4770-b547-d28be07dbdf1-kube-api-access-94s6x\") pod \"network-check-target-wr4kj\" (UID: \"d293eadf-7dbb-4770-b547-d28be07dbdf1\") " pod="openshift-network-diagnostics/network-check-target-wr4kj" Apr 16 14:52:43.828548 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.828475 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0d3f1567-ac34-4e7c-a3aa-732f78d80a79-device-dir\") pod \"aws-ebs-csi-driver-node-c98zb\" (UID: \"0d3f1567-ac34-4e7c-a3aa-732f78d80a79\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c98zb" Apr 16 14:52:43.828548 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.828474 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-cni-binary-copy\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7" Apr 16 14:52:43.829398 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.828479 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/04361303-4b68-4a7e-b73e-a4329bb6bb65-etc-modprobe-d\") pod \"tuned-bp89w\" (UID: \"04361303-4b68-4a7e-b73e-a4329bb6bb65\") " pod="openshift-cluster-node-tuning-operator/tuned-bp89w" Apr 16 14:52:43.829398 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.828494 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-var-lib-openvswitch\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" Apr 16 14:52:43.829398 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.828548 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/04361303-4b68-4a7e-b73e-a4329bb6bb65-tmp\") pod \"tuned-bp89w\" (UID: \"04361303-4b68-4a7e-b73e-a4329bb6bb65\") " pod="openshift-cluster-node-tuning-operator/tuned-bp89w" Apr 16 14:52:43.829398 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.828568 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/04361303-4b68-4a7e-b73e-a4329bb6bb65-etc-modprobe-d\") pod \"tuned-bp89w\" (UID: \"04361303-4b68-4a7e-b73e-a4329bb6bb65\") " 
pod="openshift-cluster-node-tuning-operator/tuned-bp89w" Apr 16 14:52:43.829398 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.828577 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2gfs\" (UniqueName: \"kubernetes.io/projected/04361303-4b68-4a7e-b73e-a4329bb6bb65-kube-api-access-v2gfs\") pod \"tuned-bp89w\" (UID: \"04361303-4b68-4a7e-b73e-a4329bb6bb65\") " pod="openshift-cluster-node-tuning-operator/tuned-bp89w" Apr 16 14:52:43.829398 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.828603 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-cnibin\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7" Apr 16 14:52:43.829398 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.828626 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-host-slash\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" Apr 16 14:52:43.829398 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.828649 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-host-cni-netd\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" Apr 16 14:52:43.829398 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.828674 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-env-overrides\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" Apr 16 14:52:43.829398 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.828683 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-cnibin\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7" Apr 16 14:52:43.829398 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.828743 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-host-cni-netd\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" Apr 16 14:52:43.829867 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.829495 2568 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 14:52:43.829867 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.828702 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/04361303-4b68-4a7e-b73e-a4329bb6bb65-run\") pod \"tuned-bp89w\" (UID: \"04361303-4b68-4a7e-b73e-a4329bb6bb65\") " pod="openshift-cluster-node-tuning-operator/tuned-bp89w" Apr 16 14:52:43.829867 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.829586 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/04361303-4b68-4a7e-b73e-a4329bb6bb65-etc-tuned\") pod \"tuned-bp89w\" (UID: \"04361303-4b68-4a7e-b73e-a4329bb6bb65\") " pod="openshift-cluster-node-tuning-operator/tuned-bp89w" Apr 16 14:52:43.829867 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.829625 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-host-run-netns\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7" Apr 16 14:52:43.829867 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.829662 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v5tlp\" (UniqueName: \"kubernetes.io/projected/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-kube-api-access-v5tlp\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" Apr 16 14:52:43.829867 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.829705 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4fc0480c-be78-4a13-8001-9c955eec95e1-iptables-alerter-script\") pod \"iptables-alerter-cbb9d\" (UID: \"4fc0480c-be78-4a13-8001-9c955eec95e1\") " pod="openshift-network-operator/iptables-alerter-cbb9d" Apr 16 14:52:43.829867 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.829764 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-host-run-multus-certs\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7" Apr 16 14:52:43.829867 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.829804 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7c77t\" (UniqueName: \"kubernetes.io/projected/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-kube-api-access-7c77t\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7" Apr 16 14:52:43.829867 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.829839 2568 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f6d1b1db-1136-4814-bd98-1f4c0d57bd3a-agent-certs\") pod \"konnectivity-agent-cvndk\" (UID: \"f6d1b1db-1136-4814-bd98-1f4c0d57bd3a\") " pod="kube-system/konnectivity-agent-cvndk" Apr 16 14:52:43.830260 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.829875 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4148d88d-fea6-4539-8c24-81aff5a953f7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5ch67\" (UID: \"4148d88d-fea6-4539-8c24-81aff5a953f7\") " pod="openshift-multus/multus-additional-cni-plugins-5ch67" Apr 16 14:52:43.830260 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.829928 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" Apr 16 14:52:43.830260 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.829992 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lxk5n\" (UniqueName: \"kubernetes.io/projected/927adbe7-9da4-4d4c-9bd5-3f36e9ef8978-kube-api-access-lxk5n\") pod \"node-ca-tp4rc\" (UID: \"927adbe7-9da4-4d4c-9bd5-3f36e9ef8978\") " pod="openshift-image-registry/node-ca-tp4rc" Apr 16 14:52:43.830260 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.830015 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-env-overrides\") pod \"ovnkube-node-bff4l\" (UID: 
\"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" Apr 16 14:52:43.830260 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.830025 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-multus-cni-dir\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7" Apr 16 14:52:43.830260 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.830053 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-run-ovn\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" Apr 16 14:52:43.830260 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.830088 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-host-cni-bin\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" Apr 16 14:52:43.830260 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.830096 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-host-run-netns\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7" Apr 16 14:52:43.830260 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.830173 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/04361303-4b68-4a7e-b73e-a4329bb6bb65-run\") pod \"tuned-bp89w\" (UID: \"04361303-4b68-4a7e-b73e-a4329bb6bb65\") " 
pod="openshift-cluster-node-tuning-operator/tuned-bp89w" Apr 16 14:52:43.830649 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.830121 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-ovnkube-script-lib\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" Apr 16 14:52:43.830649 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.830329 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-os-release\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7" Apr 16 14:52:43.830649 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.830361 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-etc-kubernetes\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7" Apr 16 14:52:43.830649 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.830405 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4148d88d-fea6-4539-8c24-81aff5a953f7-cnibin\") pod \"multus-additional-cni-plugins-5ch67\" (UID: \"4148d88d-fea6-4539-8c24-81aff5a953f7\") " pod="openshift-multus/multus-additional-cni-plugins-5ch67" Apr 16 14:52:43.830649 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.830436 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4148d88d-fea6-4539-8c24-81aff5a953f7-cni-binary-copy\") pod \"multus-additional-cni-plugins-5ch67\" (UID: 
\"4148d88d-fea6-4539-8c24-81aff5a953f7\") " pod="openshift-multus/multus-additional-cni-plugins-5ch67" Apr 16 14:52:43.830649 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.830459 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-host-run-multus-certs\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7" Apr 16 14:52:43.830649 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.830476 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4148d88d-fea6-4539-8c24-81aff5a953f7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5ch67\" (UID: \"4148d88d-fea6-4539-8c24-81aff5a953f7\") " pod="openshift-multus/multus-additional-cni-plugins-5ch67" Apr 16 14:52:43.830649 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.830513 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-systemd-units\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" Apr 16 14:52:43.830649 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.830546 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-system-cni-dir\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7" Apr 16 14:52:43.830649 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.830577 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-host-run-k8s-cni-cncf-io\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7" Apr 16 14:52:43.830649 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.830612 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-hostroot\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7" Apr 16 14:52:43.831103 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.830645 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f1f8e071-c6dd-48e5-b5d3-e096d8a24e92-hosts-file\") pod \"node-resolver-s956z\" (UID: \"f1f8e071-c6dd-48e5-b5d3-e096d8a24e92\") " pod="openshift-dns/node-resolver-s956z" Apr 16 14:52:43.831103 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.830688 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58nkt\" (UniqueName: \"kubernetes.io/projected/f1f8e071-c6dd-48e5-b5d3-e096d8a24e92-kube-api-access-58nkt\") pod \"node-resolver-s956z\" (UID: \"f1f8e071-c6dd-48e5-b5d3-e096d8a24e92\") " pod="openshift-dns/node-resolver-s956z" Apr 16 14:52:43.831103 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.830729 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0d3f1567-ac34-4e7c-a3aa-732f78d80a79-registration-dir\") pod \"aws-ebs-csi-driver-node-c98zb\" (UID: \"0d3f1567-ac34-4e7c-a3aa-732f78d80a79\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c98zb" Apr 16 14:52:43.831103 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.830680 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4fc0480c-be78-4a13-8001-9c955eec95e1-iptables-alerter-script\") pod \"iptables-alerter-cbb9d\" (UID: \"4fc0480c-be78-4a13-8001-9c955eec95e1\") " pod="openshift-network-operator/iptables-alerter-cbb9d" Apr 16 14:52:43.831103 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.830792 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-os-release\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7" Apr 16 14:52:43.831103 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.830763 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/04361303-4b68-4a7e-b73e-a4329bb6bb65-var-lib-kubelet\") pod \"tuned-bp89w\" (UID: \"04361303-4b68-4a7e-b73e-a4329bb6bb65\") " pod="openshift-cluster-node-tuning-operator/tuned-bp89w" Apr 16 14:52:43.831103 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.830842 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nvlrv\" (UniqueName: \"kubernetes.io/projected/b932e53d-9993-47b8-a2cb-940fc759370d-kube-api-access-nvlrv\") pod \"network-metrics-daemon-twkfq\" (UID: \"b932e53d-9993-47b8-a2cb-940fc759370d\") " pod="openshift-multus/network-metrics-daemon-twkfq" Apr 16 14:52:43.831103 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.830874 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-log-socket\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" Apr 16 14:52:43.831103 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.830886 2568 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-multus-cni-dir\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7" Apr 16 14:52:43.831103 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.830924 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-host-run-ovn-kubernetes\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" Apr 16 14:52:43.831103 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.830929 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-run-ovn\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" Apr 16 14:52:43.831103 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.830959 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/04361303-4b68-4a7e-b73e-a4329bb6bb65-etc-systemd\") pod \"tuned-bp89w\" (UID: \"04361303-4b68-4a7e-b73e-a4329bb6bb65\") " pod="openshift-cluster-node-tuning-operator/tuned-bp89w" Apr 16 14:52:43.831103 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.830997 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-host-cni-bin\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" Apr 16 14:52:43.831103 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.831039 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"serviceca\" (UniqueName: \"kubernetes.io/configmap/927adbe7-9da4-4d4c-9bd5-3f36e9ef8978-serviceca\") pod \"node-ca-tp4rc\" (UID: \"927adbe7-9da4-4d4c-9bd5-3f36e9ef8978\") " pod="openshift-image-registry/node-ca-tp4rc" Apr 16 14:52:43.831103 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.831072 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-host-var-lib-cni-bin\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7" Apr 16 14:52:43.831103 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.831105 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-multus-daemon-config\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7" Apr 16 14:52:43.831103 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.831107 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/04361303-4b68-4a7e-b73e-a4329bb6bb65-etc-systemd\") pod \"tuned-bp89w\" (UID: \"04361303-4b68-4a7e-b73e-a4329bb6bb65\") " pod="openshift-cluster-node-tuning-operator/tuned-bp89w" Apr 16 14:52:43.831743 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.831136 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0d3f1567-ac34-4e7c-a3aa-732f78d80a79-socket-dir\") pod \"aws-ebs-csi-driver-node-c98zb\" (UID: \"0d3f1567-ac34-4e7c-a3aa-732f78d80a79\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c98zb" Apr 16 14:52:43.831743 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.831174 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" Apr 16 14:52:43.831743 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.831200 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-etc-kubernetes\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7" Apr 16 14:52:43.831743 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.831167 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0d3f1567-ac34-4e7c-a3aa-732f78d80a79-etc-selinux\") pod \"aws-ebs-csi-driver-node-c98zb\" (UID: \"0d3f1567-ac34-4e7c-a3aa-732f78d80a79\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c98zb" Apr 16 14:52:43.831743 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.831238 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-run-systemd\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" Apr 16 14:52:43.831743 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.831240 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-host-slash\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" Apr 16 14:52:43.831743 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.831272 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/04361303-4b68-4a7e-b73e-a4329bb6bb65-etc-sysctl-d\") pod \"tuned-bp89w\" (UID: \"04361303-4b68-4a7e-b73e-a4329bb6bb65\") " pod="openshift-cluster-node-tuning-operator/tuned-bp89w" Apr 16 14:52:43.832059 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.831751 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-host-kubelet\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" Apr 16 14:52:43.832059 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.831787 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/927adbe7-9da4-4d4c-9bd5-3f36e9ef8978-host\") pod \"node-ca-tp4rc\" (UID: \"927adbe7-9da4-4d4c-9bd5-3f36e9ef8978\") " pod="openshift-image-registry/node-ca-tp4rc" Apr 16 14:52:43.832059 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.831821 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0d3f1567-ac34-4e7c-a3aa-732f78d80a79-kubelet-dir\") pod \"aws-ebs-csi-driver-node-c98zb\" (UID: \"0d3f1567-ac34-4e7c-a3aa-732f78d80a79\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c98zb" Apr 16 14:52:43.832059 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.831832 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-system-cni-dir\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7" Apr 16 14:52:43.832059 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.831853 2568 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/04361303-4b68-4a7e-b73e-a4329bb6bb65-host\") pod \"tuned-bp89w\" (UID: \"04361303-4b68-4a7e-b73e-a4329bb6bb65\") " pod="openshift-cluster-node-tuning-operator/tuned-bp89w" Apr 16 14:52:43.832059 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.831882 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-host-run-k8s-cni-cncf-io\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7" Apr 16 14:52:43.832059 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.831909 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-host-var-lib-cni-multus\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7" Apr 16 14:52:43.832059 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.831943 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4148d88d-fea6-4539-8c24-81aff5a953f7-system-cni-dir\") pod \"multus-additional-cni-plugins-5ch67\" (UID: \"4148d88d-fea6-4539-8c24-81aff5a953f7\") " pod="openshift-multus/multus-additional-cni-plugins-5ch67" Apr 16 14:52:43.832059 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.831984 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-ovnkube-config\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" Apr 16 14:52:43.832059 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.832010 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-ovn-node-metrics-cert\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" Apr 16 14:52:43.832059 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.832021 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-log-socket\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" Apr 16 14:52:43.832059 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.832038 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-hostroot\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7" Apr 16 14:52:43.832059 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.832049 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/04361303-4b68-4a7e-b73e-a4329bb6bb65-etc-sysctl-conf\") pod \"tuned-bp89w\" (UID: \"04361303-4b68-4a7e-b73e-a4329bb6bb65\") " pod="openshift-cluster-node-tuning-operator/tuned-bp89w" Apr 16 14:52:43.832547 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.832266 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-host-var-lib-cni-multus\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7" Apr 16 14:52:43.832547 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.832278 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-run-systemd\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" Apr 16 14:52:43.832547 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.832323 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0d3f1567-ac34-4e7c-a3aa-732f78d80a79-registration-dir\") pod \"aws-ebs-csi-driver-node-c98zb\" (UID: \"0d3f1567-ac34-4e7c-a3aa-732f78d80a79\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c98zb" Apr 16 14:52:43.832547 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.832326 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-host-kubelet\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" Apr 16 14:52:43.832547 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.832331 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/04361303-4b68-4a7e-b73e-a4329bb6bb65-var-lib-kubelet\") pod \"tuned-bp89w\" (UID: \"04361303-4b68-4a7e-b73e-a4329bb6bb65\") " pod="openshift-cluster-node-tuning-operator/tuned-bp89w" Apr 16 14:52:43.832547 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.832450 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/04361303-4b68-4a7e-b73e-a4329bb6bb65-etc-sysctl-conf\") pod \"tuned-bp89w\" (UID: \"04361303-4b68-4a7e-b73e-a4329bb6bb65\") " pod="openshift-cluster-node-tuning-operator/tuned-bp89w" Apr 16 14:52:43.832862 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.832842 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-ovnkube-script-lib\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" Apr 16 14:52:43.832927 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.832863 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-multus-daemon-config\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7" Apr 16 14:52:43.832975 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.832922 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-multus-socket-dir-parent\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7" Apr 16 14:52:43.832975 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.832928 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/04361303-4b68-4a7e-b73e-a4329bb6bb65-host\") pod \"tuned-bp89w\" (UID: \"04361303-4b68-4a7e-b73e-a4329bb6bb65\") " pod="openshift-cluster-node-tuning-operator/tuned-bp89w" Apr 16 14:52:43.833052 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.833019 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/04361303-4b68-4a7e-b73e-a4329bb6bb65-etc-sysctl-d\") pod \"tuned-bp89w\" (UID: \"04361303-4b68-4a7e-b73e-a4329bb6bb65\") " pod="openshift-cluster-node-tuning-operator/tuned-bp89w" Apr 16 14:52:43.833052 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.833038 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-host-var-lib-cni-bin\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7" Apr 16 14:52:43.833136 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.833092 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0d3f1567-ac34-4e7c-a3aa-732f78d80a79-etc-selinux\") pod \"aws-ebs-csi-driver-node-c98zb\" (UID: \"0d3f1567-ac34-4e7c-a3aa-732f78d80a79\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c98zb" Apr 16 14:52:43.833181 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.833131 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-host-run-ovn-kubernetes\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" Apr 16 14:52:43.833181 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.833166 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4148d88d-fea6-4539-8c24-81aff5a953f7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5ch67\" (UID: \"4148d88d-fea6-4539-8c24-81aff5a953f7\") " pod="openshift-multus/multus-additional-cni-plugins-5ch67" Apr 16 14:52:43.833263 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.833201 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-node-log\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" Apr 16 14:52:43.833263 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.833203 2568 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/04361303-4b68-4a7e-b73e-a4329bb6bb65-tmp\") pod \"tuned-bp89w\" (UID: \"04361303-4b68-4a7e-b73e-a4329bb6bb65\") " pod="openshift-cluster-node-tuning-operator/tuned-bp89w" Apr 16 14:52:43.833350 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.833277 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0d3f1567-ac34-4e7c-a3aa-732f78d80a79-kubelet-dir\") pod \"aws-ebs-csi-driver-node-c98zb\" (UID: \"0d3f1567-ac34-4e7c-a3aa-732f78d80a79\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c98zb" Apr 16 14:52:43.833350 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.833282 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/927adbe7-9da4-4d4c-9bd5-3f36e9ef8978-host\") pod \"node-ca-tp4rc\" (UID: \"927adbe7-9da4-4d4c-9bd5-3f36e9ef8978\") " pod="openshift-image-registry/node-ca-tp4rc" Apr 16 14:52:43.833350 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.833314 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b932e53d-9993-47b8-a2cb-940fc759370d-metrics-certs\") pod \"network-metrics-daemon-twkfq\" (UID: \"b932e53d-9993-47b8-a2cb-940fc759370d\") " pod="openshift-multus/network-metrics-daemon-twkfq" Apr 16 14:52:43.833471 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.833372 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4fc0480c-be78-4a13-8001-9c955eec95e1-host-slash\") pod \"iptables-alerter-cbb9d\" (UID: \"4fc0480c-be78-4a13-8001-9c955eec95e1\") " pod="openshift-network-operator/iptables-alerter-cbb9d" Apr 16 14:52:43.833471 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.833410 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-6nq9w\" (UniqueName: \"kubernetes.io/projected/4148d88d-fea6-4539-8c24-81aff5a953f7-kube-api-access-6nq9w\") pod \"multus-additional-cni-plugins-5ch67\" (UID: \"4148d88d-fea6-4539-8c24-81aff5a953f7\") " pod="openshift-multus/multus-additional-cni-plugins-5ch67"
Apr 16 14:52:43.833471 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.833445 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/04361303-4b68-4a7e-b73e-a4329bb6bb65-etc-kubernetes\") pod \"tuned-bp89w\" (UID: \"04361303-4b68-4a7e-b73e-a4329bb6bb65\") " pod="openshift-cluster-node-tuning-operator/tuned-bp89w"
Apr 16 14:52:43.833600 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.833562 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/04361303-4b68-4a7e-b73e-a4329bb6bb65-etc-kubernetes\") pod \"tuned-bp89w\" (UID: \"04361303-4b68-4a7e-b73e-a4329bb6bb65\") " pod="openshift-cluster-node-tuning-operator/tuned-bp89w"
Apr 16 14:52:43.833643 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.833591 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/927adbe7-9da4-4d4c-9bd5-3f36e9ef8978-serviceca\") pod \"node-ca-tp4rc\" (UID: \"927adbe7-9da4-4d4c-9bd5-3f36e9ef8978\") " pod="openshift-image-registry/node-ca-tp4rc"
Apr 16 14:52:43.833643 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.833636 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-systemd-units\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l"
Apr 16 14:52:43.833740 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:43.833657 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:43.833740 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.833677 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4fc0480c-be78-4a13-8001-9c955eec95e1-host-slash\") pod \"iptables-alerter-cbb9d\" (UID: \"4fc0480c-be78-4a13-8001-9c955eec95e1\") " pod="openshift-network-operator/iptables-alerter-cbb9d"
Apr 16 14:52:43.833740 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.833715 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-node-log\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l"
Apr 16 14:52:43.833740 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:43.833733 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b932e53d-9993-47b8-a2cb-940fc759370d-metrics-certs podName:b932e53d-9993-47b8-a2cb-940fc759370d nodeName:}" failed. No retries permitted until 2026-04-16 14:52:44.333697097 +0000 UTC m=+3.097649209 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b932e53d-9993-47b8-a2cb-940fc759370d-metrics-certs") pod "network-metrics-daemon-twkfq" (UID: "b932e53d-9993-47b8-a2cb-940fc759370d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:43.835236 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.834035 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-multus-socket-dir-parent\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7"
Apr 16 14:52:43.835236 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.834113 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0d3f1567-ac34-4e7c-a3aa-732f78d80a79-socket-dir\") pod \"aws-ebs-csi-driver-node-c98zb\" (UID: \"0d3f1567-ac34-4e7c-a3aa-732f78d80a79\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c98zb"
Apr 16 14:52:43.835236 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.834189 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/04361303-4b68-4a7e-b73e-a4329bb6bb65-etc-tuned\") pod \"tuned-bp89w\" (UID: \"04361303-4b68-4a7e-b73e-a4329bb6bb65\") " pod="openshift-cluster-node-tuning-operator/tuned-bp89w"
Apr 16 14:52:43.835678 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.835655 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-ovnkube-config\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l"
Apr 16 14:52:43.835858 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.835840 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-ovn-node-metrics-cert\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l"
Apr 16 14:52:43.838641 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:43.838616 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 14:52:43.838641 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:43.838640 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 14:52:43.838771 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:43.838654 2568 projected.go:194] Error preparing data for projected volume kube-api-access-94s6x for pod openshift-network-diagnostics/network-check-target-wr4kj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:43.838771 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:43.838716 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d293eadf-7dbb-4770-b547-d28be07dbdf1-kube-api-access-94s6x podName:d293eadf-7dbb-4770-b547-d28be07dbdf1 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:44.338700719 +0000 UTC m=+3.102652830 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "kube-api-access-94s6x" (UniqueName: "kubernetes.io/projected/d293eadf-7dbb-4770-b547-d28be07dbdf1-kube-api-access-94s6x") pod "network-check-target-wr4kj" (UID: "d293eadf-7dbb-4770-b547-d28be07dbdf1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:43.839983 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.839920 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl5lz\" (UniqueName: \"kubernetes.io/projected/0d3f1567-ac34-4e7c-a3aa-732f78d80a79-kube-api-access-vl5lz\") pod \"aws-ebs-csi-driver-node-c98zb\" (UID: \"0d3f1567-ac34-4e7c-a3aa-732f78d80a79\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c98zb"
Apr 16 14:52:43.840745 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.840714 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2gfs\" (UniqueName: \"kubernetes.io/projected/04361303-4b68-4a7e-b73e-a4329bb6bb65-kube-api-access-v2gfs\") pod \"tuned-bp89w\" (UID: \"04361303-4b68-4a7e-b73e-a4329bb6bb65\") " pod="openshift-cluster-node-tuning-operator/tuned-bp89w"
Apr 16 14:52:43.840831 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.840759 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8w2c\" (UniqueName: \"kubernetes.io/projected/4fc0480c-be78-4a13-8001-9c955eec95e1-kube-api-access-g8w2c\") pod \"iptables-alerter-cbb9d\" (UID: \"4fc0480c-be78-4a13-8001-9c955eec95e1\") " pod="openshift-network-operator/iptables-alerter-cbb9d"
Apr 16 14:52:43.842217 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.842198 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c77t\" (UniqueName: \"kubernetes.io/projected/dd8296f9-c225-4ca9-8f12-e3aa03b02f50-kube-api-access-7c77t\") pod \"multus-jscc7\" (UID: \"dd8296f9-c225-4ca9-8f12-e3aa03b02f50\") " pod="openshift-multus/multus-jscc7"
Apr 16 14:52:43.845134 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.845108 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvlrv\" (UniqueName: \"kubernetes.io/projected/b932e53d-9993-47b8-a2cb-940fc759370d-kube-api-access-nvlrv\") pod \"network-metrics-daemon-twkfq\" (UID: \"b932e53d-9993-47b8-a2cb-940fc759370d\") " pod="openshift-multus/network-metrics-daemon-twkfq"
Apr 16 14:52:43.845347 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.845329 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxk5n\" (UniqueName: \"kubernetes.io/projected/927adbe7-9da4-4d4c-9bd5-3f36e9ef8978-kube-api-access-lxk5n\") pod \"node-ca-tp4rc\" (UID: \"927adbe7-9da4-4d4c-9bd5-3f36e9ef8978\") " pod="openshift-image-registry/node-ca-tp4rc"
Apr 16 14:52:43.846445 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.846426 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5tlp\" (UniqueName: \"kubernetes.io/projected/4e495dc8-7e20-4024-b33a-c9b6c6c1291f-kube-api-access-v5tlp\") pod \"ovnkube-node-bff4l\" (UID: \"4e495dc8-7e20-4024-b33a-c9b6c6c1291f\") " pod="openshift-ovn-kubernetes/ovnkube-node-bff4l"
Apr 16 14:52:43.934452 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.934430 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f6d1b1db-1136-4814-bd98-1f4c0d57bd3a-agent-certs\") pod \"konnectivity-agent-cvndk\" (UID: \"f6d1b1db-1136-4814-bd98-1f4c0d57bd3a\") " pod="kube-system/konnectivity-agent-cvndk"
Apr 16 14:52:43.934583 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.934466 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4148d88d-fea6-4539-8c24-81aff5a953f7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5ch67\" (UID: \"4148d88d-fea6-4539-8c24-81aff5a953f7\") " pod="openshift-multus/multus-additional-cni-plugins-5ch67"
Apr 16 14:52:43.934752 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.934723 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4148d88d-fea6-4539-8c24-81aff5a953f7-cnibin\") pod \"multus-additional-cni-plugins-5ch67\" (UID: \"4148d88d-fea6-4539-8c24-81aff5a953f7\") " pod="openshift-multus/multus-additional-cni-plugins-5ch67"
Apr 16 14:52:43.934861 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.934766 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4148d88d-fea6-4539-8c24-81aff5a953f7-cni-binary-copy\") pod \"multus-additional-cni-plugins-5ch67\" (UID: \"4148d88d-fea6-4539-8c24-81aff5a953f7\") " pod="openshift-multus/multus-additional-cni-plugins-5ch67"
Apr 16 14:52:43.934861 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.934835 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4148d88d-fea6-4539-8c24-81aff5a953f7-cnibin\") pod \"multus-additional-cni-plugins-5ch67\" (UID: \"4148d88d-fea6-4539-8c24-81aff5a953f7\") " pod="openshift-multus/multus-additional-cni-plugins-5ch67"
Apr 16 14:52:43.934991 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.934908 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4148d88d-fea6-4539-8c24-81aff5a953f7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5ch67\" (UID: \"4148d88d-fea6-4539-8c24-81aff5a953f7\") " pod="openshift-multus/multus-additional-cni-plugins-5ch67"
Apr 16 14:52:43.934991 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.934950 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f1f8e071-c6dd-48e5-b5d3-e096d8a24e92-hosts-file\") pod \"node-resolver-s956z\" (UID: \"f1f8e071-c6dd-48e5-b5d3-e096d8a24e92\") " pod="openshift-dns/node-resolver-s956z"
Apr 16 14:52:43.934991 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.934976 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58nkt\" (UniqueName: \"kubernetes.io/projected/f1f8e071-c6dd-48e5-b5d3-e096d8a24e92-kube-api-access-58nkt\") pod \"node-resolver-s956z\" (UID: \"f1f8e071-c6dd-48e5-b5d3-e096d8a24e92\") " pod="openshift-dns/node-resolver-s956z"
Apr 16 14:52:43.935159 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.935020 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4148d88d-fea6-4539-8c24-81aff5a953f7-system-cni-dir\") pod \"multus-additional-cni-plugins-5ch67\" (UID: \"4148d88d-fea6-4539-8c24-81aff5a953f7\") " pod="openshift-multus/multus-additional-cni-plugins-5ch67"
Apr 16 14:52:43.935159 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.935056 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4148d88d-fea6-4539-8c24-81aff5a953f7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5ch67\" (UID: \"4148d88d-fea6-4539-8c24-81aff5a953f7\") " pod="openshift-multus/multus-additional-cni-plugins-5ch67"
Apr 16 14:52:43.935159 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.935071 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4148d88d-fea6-4539-8c24-81aff5a953f7-system-cni-dir\") pod \"multus-additional-cni-plugins-5ch67\" (UID: \"4148d88d-fea6-4539-8c24-81aff5a953f7\") " pod="openshift-multus/multus-additional-cni-plugins-5ch67"
Apr 16 14:52:43.935159 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.935100 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4148d88d-fea6-4539-8c24-81aff5a953f7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5ch67\" (UID: \"4148d88d-fea6-4539-8c24-81aff5a953f7\") " pod="openshift-multus/multus-additional-cni-plugins-5ch67"
Apr 16 14:52:43.935159 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.935107 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f1f8e071-c6dd-48e5-b5d3-e096d8a24e92-hosts-file\") pod \"node-resolver-s956z\" (UID: \"f1f8e071-c6dd-48e5-b5d3-e096d8a24e92\") " pod="openshift-dns/node-resolver-s956z"
Apr 16 14:52:43.935159 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.935140 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6nq9w\" (UniqueName: \"kubernetes.io/projected/4148d88d-fea6-4539-8c24-81aff5a953f7-kube-api-access-6nq9w\") pod \"multus-additional-cni-plugins-5ch67\" (UID: \"4148d88d-fea6-4539-8c24-81aff5a953f7\") " pod="openshift-multus/multus-additional-cni-plugins-5ch67"
Apr 16 14:52:43.935475 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.935168 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f1f8e071-c6dd-48e5-b5d3-e096d8a24e92-tmp-dir\") pod \"node-resolver-s956z\" (UID: \"f1f8e071-c6dd-48e5-b5d3-e096d8a24e92\") " pod="openshift-dns/node-resolver-s956z"
Apr 16 14:52:43.935475 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.935191 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f6d1b1db-1136-4814-bd98-1f4c0d57bd3a-konnectivity-ca\") pod \"konnectivity-agent-cvndk\" (UID: \"f6d1b1db-1136-4814-bd98-1f4c0d57bd3a\") " pod="kube-system/konnectivity-agent-cvndk"
Apr 16 14:52:43.935475 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.935221 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4148d88d-fea6-4539-8c24-81aff5a953f7-os-release\") pod \"multus-additional-cni-plugins-5ch67\" (UID: \"4148d88d-fea6-4539-8c24-81aff5a953f7\") " pod="openshift-multus/multus-additional-cni-plugins-5ch67"
Apr 16 14:52:43.935475 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.935238 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4148d88d-fea6-4539-8c24-81aff5a953f7-cni-binary-copy\") pod \"multus-additional-cni-plugins-5ch67\" (UID: \"4148d88d-fea6-4539-8c24-81aff5a953f7\") " pod="openshift-multus/multus-additional-cni-plugins-5ch67"
Apr 16 14:52:43.935475 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.935320 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4148d88d-fea6-4539-8c24-81aff5a953f7-os-release\") pod \"multus-additional-cni-plugins-5ch67\" (UID: \"4148d88d-fea6-4539-8c24-81aff5a953f7\") " pod="openshift-multus/multus-additional-cni-plugins-5ch67"
Apr 16 14:52:43.935693 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.935520 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f1f8e071-c6dd-48e5-b5d3-e096d8a24e92-tmp-dir\") pod \"node-resolver-s956z\" (UID: \"f1f8e071-c6dd-48e5-b5d3-e096d8a24e92\") " pod="openshift-dns/node-resolver-s956z"
Apr 16 14:52:43.935693 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.935537 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4148d88d-fea6-4539-8c24-81aff5a953f7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5ch67\" (UID: \"4148d88d-fea6-4539-8c24-81aff5a953f7\") " pod="openshift-multus/multus-additional-cni-plugins-5ch67"
Apr 16 14:52:43.935693 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.935684 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f6d1b1db-1136-4814-bd98-1f4c0d57bd3a-konnectivity-ca\") pod \"konnectivity-agent-cvndk\" (UID: \"f6d1b1db-1136-4814-bd98-1f4c0d57bd3a\") " pod="kube-system/konnectivity-agent-cvndk"
Apr 16 14:52:43.935822 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.935725 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4148d88d-fea6-4539-8c24-81aff5a953f7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5ch67\" (UID: \"4148d88d-fea6-4539-8c24-81aff5a953f7\") " pod="openshift-multus/multus-additional-cni-plugins-5ch67"
Apr 16 14:52:43.937431 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.937410 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f6d1b1db-1136-4814-bd98-1f4c0d57bd3a-agent-certs\") pod \"konnectivity-agent-cvndk\" (UID: \"f6d1b1db-1136-4814-bd98-1f4c0d57bd3a\") " pod="kube-system/konnectivity-agent-cvndk"
Apr 16 14:52:43.942695 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.942673 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-58nkt\" (UniqueName: \"kubernetes.io/projected/f1f8e071-c6dd-48e5-b5d3-e096d8a24e92-kube-api-access-58nkt\") pod \"node-resolver-s956z\" (UID: \"f1f8e071-c6dd-48e5-b5d3-e096d8a24e92\") " pod="openshift-dns/node-resolver-s956z"
Apr 16 14:52:43.942861 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:43.942841 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nq9w\" (UniqueName: \"kubernetes.io/projected/4148d88d-fea6-4539-8c24-81aff5a953f7-kube-api-access-6nq9w\") pod \"multus-additional-cni-plugins-5ch67\" (UID: \"4148d88d-fea6-4539-8c24-81aff5a953f7\") " pod="openshift-multus/multus-additional-cni-plugins-5ch67"
Apr 16 14:52:44.017847 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:44.017802 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c98zb"
Apr 16 14:52:44.027478 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:44.027458 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bff4l"
Apr 16 14:52:44.034042 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:44.034023 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-tp4rc"
Apr 16 14:52:44.041398 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:44.041381 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-cbb9d"
Apr 16 14:52:44.046969 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:44.046949 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jscc7"
Apr 16 14:52:44.053497 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:44.053481 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bp89w"
Apr 16 14:52:44.058958 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:44.058941 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-s956z"
Apr 16 14:52:44.065413 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:44.065393 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5ch67"
Apr 16 14:52:44.070948 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:44.070931 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-cvndk"
Apr 16 14:52:44.154348 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:44.154319 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 14:52:44.337859 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:44.337826 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b932e53d-9993-47b8-a2cb-940fc759370d-metrics-certs\") pod \"network-metrics-daemon-twkfq\" (UID: \"b932e53d-9993-47b8-a2cb-940fc759370d\") " pod="openshift-multus/network-metrics-daemon-twkfq"
Apr 16 14:52:44.338031 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:44.337995 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:44.338102 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:44.338070 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b932e53d-9993-47b8-a2cb-940fc759370d-metrics-certs podName:b932e53d-9993-47b8-a2cb-940fc759370d nodeName:}" failed. No retries permitted until 2026-04-16 14:52:45.338050172 +0000 UTC m=+4.102002274 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b932e53d-9993-47b8-a2cb-940fc759370d-metrics-certs") pod "network-metrics-daemon-twkfq" (UID: "b932e53d-9993-47b8-a2cb-940fc759370d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:44.379329 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:44.379307 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod927adbe7_9da4_4d4c_9bd5_3f36e9ef8978.slice/crio-5682be5952fec565d1810d5409206941f7c010fcabc75cf9de15cbc88de491a4 WatchSource:0}: Error finding container 5682be5952fec565d1810d5409206941f7c010fcabc75cf9de15cbc88de491a4: Status 404 returned error can't find the container with id 5682be5952fec565d1810d5409206941f7c010fcabc75cf9de15cbc88de491a4
Apr 16 14:52:44.380167 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:44.380140 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd8296f9_c225_4ca9_8f12_e3aa03b02f50.slice/crio-a3d2eac2504b4d1dd2fb63eb74c0835f9256d5e112bde1ea8446ddb0f4af70c2 WatchSource:0}: Error finding container a3d2eac2504b4d1dd2fb63eb74c0835f9256d5e112bde1ea8446ddb0f4af70c2: Status 404 returned error can't find the container with id a3d2eac2504b4d1dd2fb63eb74c0835f9256d5e112bde1ea8446ddb0f4af70c2
Apr 16 14:52:44.381310 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:44.381240 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e495dc8_7e20_4024_b33a_c9b6c6c1291f.slice/crio-649e00a322d4ad5649e177d73ca3e39e107ab78c668e4432d5f05d566d806d25 WatchSource:0}: Error finding container 649e00a322d4ad5649e177d73ca3e39e107ab78c668e4432d5f05d566d806d25: Status 404 returned error can't find the container with id 649e00a322d4ad5649e177d73ca3e39e107ab78c668e4432d5f05d566d806d25
Apr 16 14:52:44.384458 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:44.384437 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6d1b1db_1136_4814_bd98_1f4c0d57bd3a.slice/crio-35c65d5bb69218fdb7ba12079e1dd4a35b8688214fdb9606514d1469cac0afb4 WatchSource:0}: Error finding container 35c65d5bb69218fdb7ba12079e1dd4a35b8688214fdb9606514d1469cac0afb4: Status 404 returned error can't find the container with id 35c65d5bb69218fdb7ba12079e1dd4a35b8688214fdb9606514d1469cac0afb4
Apr 16 14:52:44.386375 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:44.385767 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4148d88d_fea6_4539_8c24_81aff5a953f7.slice/crio-113e693269417df5c7325e932e6c6f1d62eccd3e043bf0c47591be9d866dcd89 WatchSource:0}: Error finding container 113e693269417df5c7325e932e6c6f1d62eccd3e043bf0c47591be9d866dcd89: Status 404 returned error can't find the container with id 113e693269417df5c7325e932e6c6f1d62eccd3e043bf0c47591be9d866dcd89
Apr 16 14:52:44.386751 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:44.386717 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d3f1567_ac34_4e7c_a3aa_732f78d80a79.slice/crio-a6caad5faf15a8687172d60179abf43f4ce73bf5279eec4c611c30b0bcf57c0e WatchSource:0}: Error finding container a6caad5faf15a8687172d60179abf43f4ce73bf5279eec4c611c30b0bcf57c0e: Status 404 returned error can't find the container with id a6caad5faf15a8687172d60179abf43f4ce73bf5279eec4c611c30b0bcf57c0e
Apr 16 14:52:44.388109 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:44.388084 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1f8e071_c6dd_48e5_b5d3_e096d8a24e92.slice/crio-ebba7278325c1caba6eb0437780443f23b63f3de4b46a5fc2ab53738e204c67a WatchSource:0}: Error finding container ebba7278325c1caba6eb0437780443f23b63f3de4b46a5fc2ab53738e204c67a: Status 404 returned error can't find the container with id ebba7278325c1caba6eb0437780443f23b63f3de4b46a5fc2ab53738e204c67a
Apr 16 14:52:44.389224 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:44.389201 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04361303_4b68_4a7e_b73e_a4329bb6bb65.slice/crio-5eefb2f0cc62c21ad7fff4ce58bb4b86e3df0aeeafcffe58863dda5b3be18d79 WatchSource:0}: Error finding container 5eefb2f0cc62c21ad7fff4ce58bb4b86e3df0aeeafcffe58863dda5b3be18d79: Status 404 returned error can't find the container with id 5eefb2f0cc62c21ad7fff4ce58bb4b86e3df0aeeafcffe58863dda5b3be18d79
Apr 16 14:52:44.391045 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:52:44.390538 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fc0480c_be78_4a13_8001_9c955eec95e1.slice/crio-9a0ce77f2e2b76ba1d93274720149a9e7fa6d31a3b74fd0b526355b36cad5140 WatchSource:0}: Error finding container 9a0ce77f2e2b76ba1d93274720149a9e7fa6d31a3b74fd0b526355b36cad5140: Status 404 returned error can't find the container with id 9a0ce77f2e2b76ba1d93274720149a9e7fa6d31a3b74fd0b526355b36cad5140
Apr 16 14:52:44.438234 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:44.438124 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-94s6x\" (UniqueName: \"kubernetes.io/projected/d293eadf-7dbb-4770-b547-d28be07dbdf1-kube-api-access-94s6x\") pod \"network-check-target-wr4kj\" (UID: \"d293eadf-7dbb-4770-b547-d28be07dbdf1\") " pod="openshift-network-diagnostics/network-check-target-wr4kj"
Apr 16 14:52:44.438333 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:44.438281 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 14:52:44.438333 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:44.438303 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 14:52:44.438333 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:44.438316 2568 projected.go:194] Error preparing data for projected volume kube-api-access-94s6x for pod openshift-network-diagnostics/network-check-target-wr4kj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:44.438491 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:44.438368 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d293eadf-7dbb-4770-b547-d28be07dbdf1-kube-api-access-94s6x podName:d293eadf-7dbb-4770-b547-d28be07dbdf1 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:45.438348918 +0000 UTC m=+4.202301025 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "kube-api-access-94s6x" (UniqueName: "kubernetes.io/projected/d293eadf-7dbb-4770-b547-d28be07dbdf1-kube-api-access-94s6x") pod "network-check-target-wr4kj" (UID: "d293eadf-7dbb-4770-b547-d28be07dbdf1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:44.769038 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:44.768944 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 14:47:42 +0000 UTC" deadline="2027-12-24 04:52:35.637051594 +0000 UTC"
Apr 16 14:52:44.769038 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:44.768973 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14797h59m50.868081941s"
Apr 16 14:52:44.860860 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:44.860822 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wr4kj"
Apr 16 14:52:44.861032 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:44.860950 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wr4kj" podUID="d293eadf-7dbb-4770-b547-d28be07dbdf1"
Apr 16 14:52:44.868870 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:44.868354 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-83.ec2.internal" event={"ID":"66ca7287f809a0a0d2312e14abcec99e","Type":"ContainerStarted","Data":"539f0dce5e6bd37f68516621be7992e8d5e5a32008ebd7ff693ac9e732bfec19"}
Apr 16 14:52:44.870630 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:44.870582 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-cbb9d" event={"ID":"4fc0480c-be78-4a13-8001-9c955eec95e1","Type":"ContainerStarted","Data":"9a0ce77f2e2b76ba1d93274720149a9e7fa6d31a3b74fd0b526355b36cad5140"}
Apr 16 14:52:44.875676 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:44.873758 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bp89w" event={"ID":"04361303-4b68-4a7e-b73e-a4329bb6bb65","Type":"ContainerStarted","Data":"5eefb2f0cc62c21ad7fff4ce58bb4b86e3df0aeeafcffe58863dda5b3be18d79"}
Apr 16 14:52:44.878547 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:44.878522 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-s956z" event={"ID":"f1f8e071-c6dd-48e5-b5d3-e096d8a24e92","Type":"ContainerStarted","Data":"ebba7278325c1caba6eb0437780443f23b63f3de4b46a5fc2ab53738e204c67a"}
Apr 16 14:52:44.882456 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:44.882236 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-83.ec2.internal" podStartSLOduration=1.8822195800000001 podStartE2EDuration="1.88221958s" podCreationTimestamp="2026-04-16 14:52:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:52:44.881718983 +0000 UTC m=+3.645671095" watchObservedRunningTime="2026-04-16 14:52:44.88221958 +0000 UTC m=+3.646171692"
Apr 16 14:52:44.883631 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:44.883610 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5ch67" event={"ID":"4148d88d-fea6-4539-8c24-81aff5a953f7","Type":"ContainerStarted","Data":"113e693269417df5c7325e932e6c6f1d62eccd3e043bf0c47591be9d866dcd89"}
Apr 16 14:52:44.887186 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:44.887143 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" event={"ID":"4e495dc8-7e20-4024-b33a-c9b6c6c1291f","Type":"ContainerStarted","Data":"649e00a322d4ad5649e177d73ca3e39e107ab78c668e4432d5f05d566d806d25"}
Apr 16 14:52:44.892938 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:44.892909 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jscc7" event={"ID":"dd8296f9-c225-4ca9-8f12-e3aa03b02f50","Type":"ContainerStarted","Data":"a3d2eac2504b4d1dd2fb63eb74c0835f9256d5e112bde1ea8446ddb0f4af70c2"}
Apr 16 14:52:44.902741 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:44.902717 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c98zb" event={"ID":"0d3f1567-ac34-4e7c-a3aa-732f78d80a79","Type":"ContainerStarted","Data":"a6caad5faf15a8687172d60179abf43f4ce73bf5279eec4c611c30b0bcf57c0e"}
Apr 16 14:52:44.904750 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:44.904729 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-cvndk" event={"ID":"f6d1b1db-1136-4814-bd98-1f4c0d57bd3a","Type":"ContainerStarted","Data":"35c65d5bb69218fdb7ba12079e1dd4a35b8688214fdb9606514d1469cac0afb4"}
Apr 16 14:52:44.907234 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:44.907212 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tp4rc" event={"ID":"927adbe7-9da4-4d4c-9bd5-3f36e9ef8978","Type":"ContainerStarted","Data":"5682be5952fec565d1810d5409206941f7c010fcabc75cf9de15cbc88de491a4"}
Apr 16 14:52:45.343936 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:45.343514 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b932e53d-9993-47b8-a2cb-940fc759370d-metrics-certs\") pod \"network-metrics-daemon-twkfq\" (UID: \"b932e53d-9993-47b8-a2cb-940fc759370d\") " pod="openshift-multus/network-metrics-daemon-twkfq"
Apr 16 14:52:45.343936 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:45.343702 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:45.343936 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:45.343779 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b932e53d-9993-47b8-a2cb-940fc759370d-metrics-certs podName:b932e53d-9993-47b8-a2cb-940fc759370d nodeName:}" failed. No retries permitted until 2026-04-16 14:52:47.343749115 +0000 UTC m=+6.107701207 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b932e53d-9993-47b8-a2cb-940fc759370d-metrics-certs") pod "network-metrics-daemon-twkfq" (UID: "b932e53d-9993-47b8-a2cb-940fc759370d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:45.444177 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:45.444146 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-94s6x\" (UniqueName: \"kubernetes.io/projected/d293eadf-7dbb-4770-b547-d28be07dbdf1-kube-api-access-94s6x\") pod \"network-check-target-wr4kj\" (UID: \"d293eadf-7dbb-4770-b547-d28be07dbdf1\") " pod="openshift-network-diagnostics/network-check-target-wr4kj"
Apr 16 14:52:45.444323 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:45.444307 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 14:52:45.444393 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:45.444331 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 14:52:45.444393 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:45.444351 2568 projected.go:194] Error preparing data for projected volume kube-api-access-94s6x for pod openshift-network-diagnostics/network-check-target-wr4kj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:45.444500 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:45.444401 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d293eadf-7dbb-4770-b547-d28be07dbdf1-kube-api-access-94s6x podName:d293eadf-7dbb-4770-b547-d28be07dbdf1 nodeName:}" failed.
No retries permitted until 2026-04-16 14:52:47.444383441 +0000 UTC m=+6.208335535 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-94s6x" (UniqueName: "kubernetes.io/projected/d293eadf-7dbb-4770-b547-d28be07dbdf1-kube-api-access-94s6x") pod "network-check-target-wr4kj" (UID: "d293eadf-7dbb-4770-b547-d28be07dbdf1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:45.859042 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:45.859012 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-twkfq" Apr 16 14:52:45.859509 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:45.859143 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-twkfq" podUID="b932e53d-9993-47b8-a2cb-940fc759370d" Apr 16 14:52:45.922997 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:45.922399 2568 generic.go:358] "Generic (PLEG): container finished" podID="187d013988eca3fa2c04c9f72ca0653f" containerID="9be4942c05e58e53a9a94faa670889703baf3d681f12cc48ce1b1e184231826a" exitCode=0 Apr 16 14:52:45.922997 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:45.922877 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-83.ec2.internal" event={"ID":"187d013988eca3fa2c04c9f72ca0653f","Type":"ContainerDied","Data":"9be4942c05e58e53a9a94faa670889703baf3d681f12cc48ce1b1e184231826a"} Apr 16 14:52:46.859882 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:46.859420 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wr4kj" Apr 16 14:52:46.859882 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:46.859548 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wr4kj" podUID="d293eadf-7dbb-4770-b547-d28be07dbdf1" Apr 16 14:52:46.935009 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:46.934977 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-83.ec2.internal" event={"ID":"187d013988eca3fa2c04c9f72ca0653f","Type":"ContainerStarted","Data":"fab315d323aba10c3854f46a760a7a338713e2c894fd11e87cbb9cf99ade968b"} Apr 16 14:52:47.360382 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:47.360345 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b932e53d-9993-47b8-a2cb-940fc759370d-metrics-certs\") pod \"network-metrics-daemon-twkfq\" (UID: \"b932e53d-9993-47b8-a2cb-940fc759370d\") " pod="openshift-multus/network-metrics-daemon-twkfq" Apr 16 14:52:47.360565 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:47.360511 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:47.360626 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:47.360580 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b932e53d-9993-47b8-a2cb-940fc759370d-metrics-certs podName:b932e53d-9993-47b8-a2cb-940fc759370d nodeName:}" failed. No retries permitted until 2026-04-16 14:52:51.360558946 +0000 UTC m=+10.124511037 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b932e53d-9993-47b8-a2cb-940fc759370d-metrics-certs") pod "network-metrics-daemon-twkfq" (UID: "b932e53d-9993-47b8-a2cb-940fc759370d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:47.461684 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:47.461608 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-94s6x\" (UniqueName: \"kubernetes.io/projected/d293eadf-7dbb-4770-b547-d28be07dbdf1-kube-api-access-94s6x\") pod \"network-check-target-wr4kj\" (UID: \"d293eadf-7dbb-4770-b547-d28be07dbdf1\") " pod="openshift-network-diagnostics/network-check-target-wr4kj" Apr 16 14:52:47.461925 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:47.461796 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:47.461925 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:47.461813 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:47.461925 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:47.461821 2568 projected.go:194] Error preparing data for projected volume kube-api-access-94s6x for pod openshift-network-diagnostics/network-check-target-wr4kj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:47.461925 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:47.461874 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d293eadf-7dbb-4770-b547-d28be07dbdf1-kube-api-access-94s6x podName:d293eadf-7dbb-4770-b547-d28be07dbdf1 nodeName:}" failed. 
No retries permitted until 2026-04-16 14:52:51.461860272 +0000 UTC m=+10.225812364 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-94s6x" (UniqueName: "kubernetes.io/projected/d293eadf-7dbb-4770-b547-d28be07dbdf1-kube-api-access-94s6x") pod "network-check-target-wr4kj" (UID: "d293eadf-7dbb-4770-b547-d28be07dbdf1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:47.862359 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:47.862329 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-twkfq" Apr 16 14:52:47.862801 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:47.862464 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-twkfq" podUID="b932e53d-9993-47b8-a2cb-940fc759370d" Apr 16 14:52:48.859218 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:48.858762 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wr4kj" Apr 16 14:52:48.859218 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:48.858884 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-wr4kj" podUID="d293eadf-7dbb-4770-b547-d28be07dbdf1" Apr 16 14:52:49.859577 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:49.859543 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-twkfq" Apr 16 14:52:49.860042 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:49.859682 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-twkfq" podUID="b932e53d-9993-47b8-a2cb-940fc759370d" Apr 16 14:52:50.859462 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:50.859430 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wr4kj" Apr 16 14:52:50.859642 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:50.859553 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-wr4kj" podUID="d293eadf-7dbb-4770-b547-d28be07dbdf1" Apr 16 14:52:51.391334 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:51.391301 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b932e53d-9993-47b8-a2cb-940fc759370d-metrics-certs\") pod \"network-metrics-daemon-twkfq\" (UID: \"b932e53d-9993-47b8-a2cb-940fc759370d\") " pod="openshift-multus/network-metrics-daemon-twkfq" Apr 16 14:52:51.391497 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:51.391424 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:51.391497 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:51.391480 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b932e53d-9993-47b8-a2cb-940fc759370d-metrics-certs podName:b932e53d-9993-47b8-a2cb-940fc759370d nodeName:}" failed. No retries permitted until 2026-04-16 14:52:59.391462543 +0000 UTC m=+18.155414636 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b932e53d-9993-47b8-a2cb-940fc759370d-metrics-certs") pod "network-metrics-daemon-twkfq" (UID: "b932e53d-9993-47b8-a2cb-940fc759370d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:51.492487 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:51.492409 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-94s6x\" (UniqueName: \"kubernetes.io/projected/d293eadf-7dbb-4770-b547-d28be07dbdf1-kube-api-access-94s6x\") pod \"network-check-target-wr4kj\" (UID: \"d293eadf-7dbb-4770-b547-d28be07dbdf1\") " pod="openshift-network-diagnostics/network-check-target-wr4kj" Apr 16 14:52:51.492639 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:51.492619 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:51.492712 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:51.492645 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:51.492712 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:51.492658 2568 projected.go:194] Error preparing data for projected volume kube-api-access-94s6x for pod openshift-network-diagnostics/network-check-target-wr4kj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:51.492817 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:51.492740 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d293eadf-7dbb-4770-b547-d28be07dbdf1-kube-api-access-94s6x podName:d293eadf-7dbb-4770-b547-d28be07dbdf1 nodeName:}" failed. 
No retries permitted until 2026-04-16 14:52:59.492719605 +0000 UTC m=+18.256671695 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-94s6x" (UniqueName: "kubernetes.io/projected/d293eadf-7dbb-4770-b547-d28be07dbdf1-kube-api-access-94s6x") pod "network-check-target-wr4kj" (UID: "d293eadf-7dbb-4770-b547-d28be07dbdf1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:51.860619 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:51.860327 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-twkfq" Apr 16 14:52:51.860619 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:51.860477 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-twkfq" podUID="b932e53d-9993-47b8-a2cb-940fc759370d" Apr 16 14:52:52.859121 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:52.859093 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wr4kj" Apr 16 14:52:52.859290 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:52.859211 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-wr4kj" podUID="d293eadf-7dbb-4770-b547-d28be07dbdf1" Apr 16 14:52:53.163350 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:53.163203 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-83.ec2.internal" podStartSLOduration=10.163187676 podStartE2EDuration="10.163187676s" podCreationTimestamp="2026-04-16 14:52:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:52:46.94938592 +0000 UTC m=+5.713338032" watchObservedRunningTime="2026-04-16 14:52:53.163187676 +0000 UTC m=+11.927139789" Apr 16 14:52:53.163789 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:53.163538 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-2zgf8"] Apr 16 14:52:53.166252 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:53.166175 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2zgf8" Apr 16 14:52:53.166356 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:53.166250 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-2zgf8" podUID="38ffecc9-f9c3-4e05-972f-ee25cb7922e6" Apr 16 14:52:53.207056 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:53.207024 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/38ffecc9-f9c3-4e05-972f-ee25cb7922e6-original-pull-secret\") pod \"global-pull-secret-syncer-2zgf8\" (UID: \"38ffecc9-f9c3-4e05-972f-ee25cb7922e6\") " pod="kube-system/global-pull-secret-syncer-2zgf8" Apr 16 14:52:53.207181 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:53.207065 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/38ffecc9-f9c3-4e05-972f-ee25cb7922e6-dbus\") pod \"global-pull-secret-syncer-2zgf8\" (UID: \"38ffecc9-f9c3-4e05-972f-ee25cb7922e6\") " pod="kube-system/global-pull-secret-syncer-2zgf8" Apr 16 14:52:53.207181 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:53.207146 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/38ffecc9-f9c3-4e05-972f-ee25cb7922e6-kubelet-config\") pod \"global-pull-secret-syncer-2zgf8\" (UID: \"38ffecc9-f9c3-4e05-972f-ee25cb7922e6\") " pod="kube-system/global-pull-secret-syncer-2zgf8" Apr 16 14:52:53.307534 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:53.307487 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/38ffecc9-f9c3-4e05-972f-ee25cb7922e6-original-pull-secret\") pod \"global-pull-secret-syncer-2zgf8\" (UID: \"38ffecc9-f9c3-4e05-972f-ee25cb7922e6\") " pod="kube-system/global-pull-secret-syncer-2zgf8" Apr 16 14:52:53.307534 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:53.307538 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/38ffecc9-f9c3-4e05-972f-ee25cb7922e6-dbus\") pod \"global-pull-secret-syncer-2zgf8\" (UID: \"38ffecc9-f9c3-4e05-972f-ee25cb7922e6\") " pod="kube-system/global-pull-secret-syncer-2zgf8" Apr 16 14:52:53.307754 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:53.307584 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/38ffecc9-f9c3-4e05-972f-ee25cb7922e6-kubelet-config\") pod \"global-pull-secret-syncer-2zgf8\" (UID: \"38ffecc9-f9c3-4e05-972f-ee25cb7922e6\") " pod="kube-system/global-pull-secret-syncer-2zgf8" Apr 16 14:52:53.307754 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:53.307629 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:53.307754 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:53.307714 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38ffecc9-f9c3-4e05-972f-ee25cb7922e6-original-pull-secret podName:38ffecc9-f9c3-4e05-972f-ee25cb7922e6 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:53.807692637 +0000 UTC m=+12.571644730 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/38ffecc9-f9c3-4e05-972f-ee25cb7922e6-original-pull-secret") pod "global-pull-secret-syncer-2zgf8" (UID: "38ffecc9-f9c3-4e05-972f-ee25cb7922e6") : object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:53.307754 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:53.307711 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/38ffecc9-f9c3-4e05-972f-ee25cb7922e6-kubelet-config\") pod \"global-pull-secret-syncer-2zgf8\" (UID: \"38ffecc9-f9c3-4e05-972f-ee25cb7922e6\") " pod="kube-system/global-pull-secret-syncer-2zgf8" Apr 16 14:52:53.307968 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:53.307824 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/38ffecc9-f9c3-4e05-972f-ee25cb7922e6-dbus\") pod \"global-pull-secret-syncer-2zgf8\" (UID: \"38ffecc9-f9c3-4e05-972f-ee25cb7922e6\") " pod="kube-system/global-pull-secret-syncer-2zgf8" Apr 16 14:52:53.811522 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:53.811493 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/38ffecc9-f9c3-4e05-972f-ee25cb7922e6-original-pull-secret\") pod \"global-pull-secret-syncer-2zgf8\" (UID: \"38ffecc9-f9c3-4e05-972f-ee25cb7922e6\") " pod="kube-system/global-pull-secret-syncer-2zgf8" Apr 16 14:52:53.811676 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:53.811596 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:53.811676 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:53.811640 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38ffecc9-f9c3-4e05-972f-ee25cb7922e6-original-pull-secret 
podName:38ffecc9-f9c3-4e05-972f-ee25cb7922e6 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:54.811627647 +0000 UTC m=+13.575579736 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/38ffecc9-f9c3-4e05-972f-ee25cb7922e6-original-pull-secret") pod "global-pull-secret-syncer-2zgf8" (UID: "38ffecc9-f9c3-4e05-972f-ee25cb7922e6") : object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:53.859542 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:53.859503 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-twkfq" Apr 16 14:52:53.859668 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:53.859629 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-twkfq" podUID="b932e53d-9993-47b8-a2cb-940fc759370d" Apr 16 14:52:54.819546 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:54.819511 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/38ffecc9-f9c3-4e05-972f-ee25cb7922e6-original-pull-secret\") pod \"global-pull-secret-syncer-2zgf8\" (UID: \"38ffecc9-f9c3-4e05-972f-ee25cb7922e6\") " pod="kube-system/global-pull-secret-syncer-2zgf8" Apr 16 14:52:54.819922 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:54.819638 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:54.819922 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:54.819709 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38ffecc9-f9c3-4e05-972f-ee25cb7922e6-original-pull-secret podName:38ffecc9-f9c3-4e05-972f-ee25cb7922e6 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:56.81969033 +0000 UTC m=+15.583642418 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/38ffecc9-f9c3-4e05-972f-ee25cb7922e6-original-pull-secret") pod "global-pull-secret-syncer-2zgf8" (UID: "38ffecc9-f9c3-4e05-972f-ee25cb7922e6") : object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:54.858819 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:54.858796 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2zgf8" Apr 16 14:52:54.858956 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:54.858800 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wr4kj" Apr 16 14:52:54.858956 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:54.858913 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2zgf8" podUID="38ffecc9-f9c3-4e05-972f-ee25cb7922e6" Apr 16 14:52:54.859053 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:54.858997 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wr4kj" podUID="d293eadf-7dbb-4770-b547-d28be07dbdf1" Apr 16 14:52:55.859259 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:55.859226 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-twkfq" Apr 16 14:52:55.859697 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:55.859370 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-twkfq" podUID="b932e53d-9993-47b8-a2cb-940fc759370d" Apr 16 14:52:56.835563 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:56.835525 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/38ffecc9-f9c3-4e05-972f-ee25cb7922e6-original-pull-secret\") pod \"global-pull-secret-syncer-2zgf8\" (UID: \"38ffecc9-f9c3-4e05-972f-ee25cb7922e6\") " pod="kube-system/global-pull-secret-syncer-2zgf8" Apr 16 14:52:56.835737 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:56.835658 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:56.835737 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:56.835738 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38ffecc9-f9c3-4e05-972f-ee25cb7922e6-original-pull-secret podName:38ffecc9-f9c3-4e05-972f-ee25cb7922e6 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:00.83571603 +0000 UTC m=+19.599668135 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/38ffecc9-f9c3-4e05-972f-ee25cb7922e6-original-pull-secret") pod "global-pull-secret-syncer-2zgf8" (UID: "38ffecc9-f9c3-4e05-972f-ee25cb7922e6") : object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:56.859540 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:56.859485 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2zgf8" Apr 16 14:52:56.859875 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:56.859485 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wr4kj"
Apr 16 14:52:56.859875 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:56.859591 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2zgf8" podUID="38ffecc9-f9c3-4e05-972f-ee25cb7922e6"
Apr 16 14:52:56.859875 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:56.859671 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wr4kj" podUID="d293eadf-7dbb-4770-b547-d28be07dbdf1"
Apr 16 14:52:57.858904 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:57.858864 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-twkfq"
Apr 16 14:52:57.859086 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:57.859007 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-twkfq" podUID="b932e53d-9993-47b8-a2cb-940fc759370d"
Apr 16 14:52:58.858956 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:58.858927 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wr4kj"
Apr 16 14:52:58.859440 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:58.858927 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2zgf8"
Apr 16 14:52:58.859440 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:58.859047 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wr4kj" podUID="d293eadf-7dbb-4770-b547-d28be07dbdf1"
Apr 16 14:52:58.859440 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:58.859133 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2zgf8" podUID="38ffecc9-f9c3-4e05-972f-ee25cb7922e6"
Apr 16 14:52:59.457801 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:59.457773 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b932e53d-9993-47b8-a2cb-940fc759370d-metrics-certs\") pod \"network-metrics-daemon-twkfq\" (UID: \"b932e53d-9993-47b8-a2cb-940fc759370d\") " pod="openshift-multus/network-metrics-daemon-twkfq"
Apr 16 14:52:59.457976 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:59.457917 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:59.457976 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:59.457977 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b932e53d-9993-47b8-a2cb-940fc759370d-metrics-certs podName:b932e53d-9993-47b8-a2cb-940fc759370d nodeName:}" failed. No retries permitted until 2026-04-16 14:53:15.457962614 +0000 UTC m=+34.221914704 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b932e53d-9993-47b8-a2cb-940fc759370d-metrics-certs") pod "network-metrics-daemon-twkfq" (UID: "b932e53d-9993-47b8-a2cb-940fc759370d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:59.558212 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:59.558177 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-94s6x\" (UniqueName: \"kubernetes.io/projected/d293eadf-7dbb-4770-b547-d28be07dbdf1-kube-api-access-94s6x\") pod \"network-check-target-wr4kj\" (UID: \"d293eadf-7dbb-4770-b547-d28be07dbdf1\") " pod="openshift-network-diagnostics/network-check-target-wr4kj"
Apr 16 14:52:59.558379 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:59.558360 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 14:52:59.558439 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:59.558384 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 14:52:59.558439 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:59.558393 2568 projected.go:194] Error preparing data for projected volume kube-api-access-94s6x for pod openshift-network-diagnostics/network-check-target-wr4kj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:59.558531 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:59.558451 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d293eadf-7dbb-4770-b547-d28be07dbdf1-kube-api-access-94s6x podName:d293eadf-7dbb-4770-b547-d28be07dbdf1 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:15.558434238 +0000 UTC m=+34.322386343 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-94s6x" (UniqueName: "kubernetes.io/projected/d293eadf-7dbb-4770-b547-d28be07dbdf1-kube-api-access-94s6x") pod "network-check-target-wr4kj" (UID: "d293eadf-7dbb-4770-b547-d28be07dbdf1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:59.859351 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:52:59.859321 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-twkfq"
Apr 16 14:52:59.859765 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:52:59.859457 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-twkfq" podUID="b932e53d-9993-47b8-a2cb-940fc759370d"
Apr 16 14:53:00.858842 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:00.858811 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2zgf8"
Apr 16 14:53:00.859018 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:00.858811 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wr4kj"
Apr 16 14:53:00.859018 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:00.858940 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2zgf8" podUID="38ffecc9-f9c3-4e05-972f-ee25cb7922e6"
Apr 16 14:53:00.859018 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:00.858999 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wr4kj" podUID="d293eadf-7dbb-4770-b547-d28be07dbdf1"
Apr 16 14:53:00.870653 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:00.870628 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/38ffecc9-f9c3-4e05-972f-ee25cb7922e6-original-pull-secret\") pod \"global-pull-secret-syncer-2zgf8\" (UID: \"38ffecc9-f9c3-4e05-972f-ee25cb7922e6\") " pod="kube-system/global-pull-secret-syncer-2zgf8"
Apr 16 14:53:00.871050 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:00.870741 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 14:53:00.871050 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:00.870787 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38ffecc9-f9c3-4e05-972f-ee25cb7922e6-original-pull-secret podName:38ffecc9-f9c3-4e05-972f-ee25cb7922e6 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:08.87077047 +0000 UTC m=+27.634722558 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/38ffecc9-f9c3-4e05-972f-ee25cb7922e6-original-pull-secret") pod "global-pull-secret-syncer-2zgf8" (UID: "38ffecc9-f9c3-4e05-972f-ee25cb7922e6") : object "kube-system"/"original-pull-secret" not registered
Apr 16 14:53:01.858720 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:01.858559 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-twkfq"
Apr 16 14:53:01.858815 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:01.858799 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-twkfq" podUID="b932e53d-9993-47b8-a2cb-940fc759370d"
Apr 16 14:53:01.959673 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:01.959449 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tp4rc" event={"ID":"927adbe7-9da4-4d4c-9bd5-3f36e9ef8978","Type":"ContainerStarted","Data":"472504488f7cda939bc28fb290a147ee7d4234b078dd5ef946bc293bcef896d5"}
Apr 16 14:53:01.960850 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:01.960824 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bp89w" event={"ID":"04361303-4b68-4a7e-b73e-a4329bb6bb65","Type":"ContainerStarted","Data":"bb693a9ddcba21505e32f5dd65368fc23b12c63a1e02a9fbdb39a142b9845ce1"}
Apr 16 14:53:01.962290 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:01.962263 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-s956z" event={"ID":"f1f8e071-c6dd-48e5-b5d3-e096d8a24e92","Type":"ContainerStarted","Data":"f3f724821d4dfac04fc45f1421df18ae2203a6bbd9824ba7584e374dcc9ea6ae"}
Apr 16 14:53:01.963620 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:01.963601 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5ch67" event={"ID":"4148d88d-fea6-4539-8c24-81aff5a953f7","Type":"ContainerStarted","Data":"6020bfa3e0242a8f5f3239d630b6ecac8add739b122f5ec51fd7ca82b1fabe39"}
Apr 16 14:53:01.966669 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:01.966652 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bff4l_4e495dc8-7e20-4024-b33a-c9b6c6c1291f/ovn-acl-logging/0.log"
Apr 16 14:53:01.966975 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:01.966946 2568 generic.go:358] "Generic (PLEG): container finished" podID="4e495dc8-7e20-4024-b33a-c9b6c6c1291f" containerID="8f78a395d2d398dd0eb0605ec9dfc94f96513e571293c0c3133e05ec95030892" exitCode=1
Apr 16 14:53:01.967039 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:01.967008 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" event={"ID":"4e495dc8-7e20-4024-b33a-c9b6c6c1291f","Type":"ContainerStarted","Data":"a4d3a37876ae62794ef10f58068ee337a54003f3a27819806fc92d9bc823370e"}
Apr 16 14:53:01.967039 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:01.967029 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" event={"ID":"4e495dc8-7e20-4024-b33a-c9b6c6c1291f","Type":"ContainerStarted","Data":"2fc644d7a38d0e4ba5eba1752c4d8b1f8a3528891096767573bb8f1d8d03ee6d"}
Apr 16 14:53:01.967116 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:01.967040 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" event={"ID":"4e495dc8-7e20-4024-b33a-c9b6c6c1291f","Type":"ContainerStarted","Data":"fe3c4061928bd0c65fd45dd7ac9783ccdfc69da9785e6609a3d35bb75cee0dce"}
Apr 16 14:53:01.967116 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:01.967051 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" event={"ID":"4e495dc8-7e20-4024-b33a-c9b6c6c1291f","Type":"ContainerDied","Data":"8f78a395d2d398dd0eb0605ec9dfc94f96513e571293c0c3133e05ec95030892"}
Apr 16 14:53:01.967116 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:01.967063 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" event={"ID":"4e495dc8-7e20-4024-b33a-c9b6c6c1291f","Type":"ContainerStarted","Data":"38b31086deadbdc9e24c49bb09a07957f44b71ec1de97ebdcc45186de2bb6d28"}
Apr 16 14:53:01.968206 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:01.968188 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jscc7" event={"ID":"dd8296f9-c225-4ca9-8f12-e3aa03b02f50","Type":"ContainerStarted","Data":"63c82ff597b2b4de0ab8a06cb453547318f80bfa5e5bf0361a16ef7678e377c9"}
Apr 16 14:53:01.969278 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:01.969259 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c98zb" event={"ID":"0d3f1567-ac34-4e7c-a3aa-732f78d80a79","Type":"ContainerStarted","Data":"367f3f4b100c1417633b1df121b4bd0bea1be019cbc43d53b5b32ef026c284d3"}
Apr 16 14:53:01.970331 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:01.970309 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-cvndk" event={"ID":"f6d1b1db-1136-4814-bd98-1f4c0d57bd3a","Type":"ContainerStarted","Data":"9bc99854dc0b89ffff1b06530f20826d1f6f6e744ebd8e31d1de3e04680a5552"}
Apr 16 14:53:01.975557 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:01.975517 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-tp4rc" podStartSLOduration=8.752296875999999 podStartE2EDuration="20.975504756s" podCreationTimestamp="2026-04-16 14:52:41 +0000 UTC" firstStartedPulling="2026-04-16 14:52:44.381575844 +0000 UTC m=+3.145527938" lastFinishedPulling="2026-04-16 14:52:56.604783725 +0000 UTC m=+15.368735818" observedRunningTime="2026-04-16 14:53:01.974913613 +0000 UTC m=+20.738865723" watchObservedRunningTime="2026-04-16 14:53:01.975504756 +0000 UTC m=+20.739456868"
Apr 16 14:53:01.989696 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:01.989651 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-s956z" podStartSLOduration=2.8877055130000002 podStartE2EDuration="19.98963508s" podCreationTimestamp="2026-04-16 14:52:42 +0000 UTC" firstStartedPulling="2026-04-16 14:52:44.390795659 +0000 UTC m=+3.154747757" lastFinishedPulling="2026-04-16 14:53:01.492725231 +0000 UTC m=+20.256677324" observedRunningTime="2026-04-16 14:53:01.989026393 +0000 UTC m=+20.752978505" watchObservedRunningTime="2026-04-16 14:53:01.98963508 +0000 UTC m=+20.753587191"
Apr 16 14:53:02.008549 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:02.008509 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-jscc7" podStartSLOduration=3.889309491 podStartE2EDuration="21.008498233s" podCreationTimestamp="2026-04-16 14:52:41 +0000 UTC" firstStartedPulling="2026-04-16 14:52:44.382354703 +0000 UTC m=+3.146306805" lastFinishedPulling="2026-04-16 14:53:01.501543445 +0000 UTC m=+20.265495547" observedRunningTime="2026-04-16 14:53:02.008313867 +0000 UTC m=+20.772265977" watchObservedRunningTime="2026-04-16 14:53:02.008498233 +0000 UTC m=+20.772450343"
Apr 16 14:53:02.025919 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:02.025850 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-bp89w" podStartSLOduration=2.92244259 podStartE2EDuration="20.025834901s" podCreationTimestamp="2026-04-16 14:52:42 +0000 UTC" firstStartedPulling="2026-04-16 14:52:44.391437198 +0000 UTC m=+3.155389289" lastFinishedPulling="2026-04-16 14:53:01.494829496 +0000 UTC m=+20.258781600" observedRunningTime="2026-04-16 14:53:02.024886141 +0000 UTC m=+20.788838263" watchObservedRunningTime="2026-04-16 14:53:02.025834901 +0000 UTC m=+20.789787012"
Apr 16 14:53:02.038961 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:02.038933 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-cvndk"
Apr 16 14:53:02.039635 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:02.039607 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-cvndk"
Apr 16 14:53:02.060458 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:02.060414 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-cvndk" podStartSLOduration=2.955643208 podStartE2EDuration="20.060403627s" podCreationTimestamp="2026-04-16 14:52:42 +0000 UTC" firstStartedPulling="2026-04-16 14:52:44.387951682 +0000 UTC m=+3.151903773" lastFinishedPulling="2026-04-16 14:53:01.492712101 +0000 UTC m=+20.256664192" observedRunningTime="2026-04-16 14:53:02.059911685 +0000 UTC m=+20.823863795" watchObservedRunningTime="2026-04-16 14:53:02.060403627 +0000 UTC m=+20.824355738"
Apr 16 14:53:02.670583 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:02.670563 2568 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 14:53:02.791573 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:02.791468 2568 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T14:53:02.670580111Z","UUID":"4ff68648-cb18-423c-9070-396102ff7018","Handler":null,"Name":"","Endpoint":""}
Apr 16 14:53:02.792873 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:02.792849 2568 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 14:53:02.792873 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:02.792870 2568 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 14:53:02.859159 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:02.859141 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2zgf8"
Apr 16 14:53:02.859241 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:02.859217 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2zgf8" podUID="38ffecc9-f9c3-4e05-972f-ee25cb7922e6"
Apr 16 14:53:02.859241 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:02.859145 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wr4kj"
Apr 16 14:53:02.859317 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:02.859272 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wr4kj" podUID="d293eadf-7dbb-4770-b547-d28be07dbdf1"
Apr 16 14:53:02.973070 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:02.973044 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c98zb" event={"ID":"0d3f1567-ac34-4e7c-a3aa-732f78d80a79","Type":"ContainerStarted","Data":"9dd2cea8be6f5c90caf1f62877c7dce5916078c8588e76883e433f37e3195c5b"}
Apr 16 14:53:02.974182 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:02.974156 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-cbb9d" event={"ID":"4fc0480c-be78-4a13-8001-9c955eec95e1","Type":"ContainerStarted","Data":"9e9018f7f48d90bd18c66af8601abb5f6e2b15b7da1a1b7294cbd5712aae4389"}
Apr 16 14:53:02.975450 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:02.975432 2568 generic.go:358] "Generic (PLEG): container finished" podID="4148d88d-fea6-4539-8c24-81aff5a953f7" containerID="6020bfa3e0242a8f5f3239d630b6ecac8add739b122f5ec51fd7ca82b1fabe39" exitCode=0
Apr 16 14:53:02.975537 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:02.975490 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5ch67" event={"ID":"4148d88d-fea6-4539-8c24-81aff5a953f7","Type":"ContainerDied","Data":"6020bfa3e0242a8f5f3239d630b6ecac8add739b122f5ec51fd7ca82b1fabe39"}
Apr 16 14:53:02.977984 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:02.977959 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bff4l_4e495dc8-7e20-4024-b33a-c9b6c6c1291f/ovn-acl-logging/0.log"
Apr 16 14:53:02.978385 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:02.978356 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" event={"ID":"4e495dc8-7e20-4024-b33a-c9b6c6c1291f","Type":"ContainerStarted","Data":"e167afd7331c38298c2f5edbd062bbf22637e7aa7769f861e21421eb1ce91f44"}
Apr 16 14:53:02.978987 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:02.978967 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-cvndk"
Apr 16 14:53:02.979306 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:02.979291 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-cvndk"
Apr 16 14:53:02.988837 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:02.988804 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-cbb9d" podStartSLOduration=4.910449281 podStartE2EDuration="21.988793728s" podCreationTimestamp="2026-04-16 14:52:41 +0000 UTC" firstStartedPulling="2026-04-16 14:52:44.391945394 +0000 UTC m=+3.155897483" lastFinishedPulling="2026-04-16 14:53:01.470289824 +0000 UTC m=+20.234241930" observedRunningTime="2026-04-16 14:53:02.988659178 +0000 UTC m=+21.752611294" watchObservedRunningTime="2026-04-16 14:53:02.988793728 +0000 UTC m=+21.752745838"
Apr 16 14:53:03.859130 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:03.859102 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-twkfq"
Apr 16 14:53:03.859345 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:03.859223 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-twkfq" podUID="b932e53d-9993-47b8-a2cb-940fc759370d"
Apr 16 14:53:03.982554 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:03.982518 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c98zb" event={"ID":"0d3f1567-ac34-4e7c-a3aa-732f78d80a79","Type":"ContainerStarted","Data":"5227265130cc9e83aaf9bc3baf135f4a0724e709134e8f49799bc8747684c131"}
Apr 16 14:53:04.859015 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:04.858834 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wr4kj"
Apr 16 14:53:04.859172 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:04.858848 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2zgf8"
Apr 16 14:53:04.859172 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:04.859090 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wr4kj" podUID="d293eadf-7dbb-4770-b547-d28be07dbdf1"
Apr 16 14:53:04.859267 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:04.859169 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2zgf8" podUID="38ffecc9-f9c3-4e05-972f-ee25cb7922e6"
Apr 16 14:53:04.987306 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:04.987279 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bff4l_4e495dc8-7e20-4024-b33a-c9b6c6c1291f/ovn-acl-logging/0.log"
Apr 16 14:53:04.987718 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:04.987676 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" event={"ID":"4e495dc8-7e20-4024-b33a-c9b6c6c1291f","Type":"ContainerStarted","Data":"f12574cefaced6c13bb48757bba1a2b8305c636e1240ce778514e1a8a8d3b902"}
Apr 16 14:53:05.859404 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:05.859365 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-twkfq"
Apr 16 14:53:05.859625 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:05.859486 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-twkfq" podUID="b932e53d-9993-47b8-a2cb-940fc759370d"
Apr 16 14:53:06.858506 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:06.858484 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wr4kj"
Apr 16 14:53:06.859095 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:06.858484 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2zgf8"
Apr 16 14:53:06.859095 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:06.858590 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wr4kj" podUID="d293eadf-7dbb-4770-b547-d28be07dbdf1"
Apr 16 14:53:06.859095 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:06.858699 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2zgf8" podUID="38ffecc9-f9c3-4e05-972f-ee25cb7922e6"
Apr 16 14:53:06.995291 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:06.995155 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bff4l_4e495dc8-7e20-4024-b33a-c9b6c6c1291f/ovn-acl-logging/0.log"
Apr 16 14:53:06.995719 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:06.995687 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" event={"ID":"4e495dc8-7e20-4024-b33a-c9b6c6c1291f","Type":"ContainerStarted","Data":"2f3c366d80862e8072c763d7b6b5a8d52aba99fb6aa2cd021790fe1828ae663a"}
Apr 16 14:53:06.996117 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:06.996089 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bff4l"
Apr 16 14:53:06.996299 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:06.996275 2568 scope.go:117] "RemoveContainer" containerID="8f78a395d2d398dd0eb0605ec9dfc94f96513e571293c0c3133e05ec95030892"
Apr 16 14:53:07.016767 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:07.016205 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bff4l"
Apr 16 14:53:07.026065 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:07.025580 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c98zb" podStartSLOduration=7.051119301 podStartE2EDuration="26.025564445s" podCreationTimestamp="2026-04-16 14:52:41 +0000 UTC" firstStartedPulling="2026-04-16 14:52:44.389913791 +0000 UTC m=+3.153865894" lastFinishedPulling="2026-04-16 14:53:03.364358935 +0000 UTC m=+22.128311038" observedRunningTime="2026-04-16 14:53:04.000835977 +0000 UTC m=+22.764788145" watchObservedRunningTime="2026-04-16 14:53:07.025564445 +0000 UTC m=+25.789516556"
Apr 16 14:53:07.858866 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:07.858843 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-twkfq"
Apr 16 14:53:07.859167 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:07.858967 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-twkfq" podUID="b932e53d-9993-47b8-a2cb-940fc759370d"
Apr 16 14:53:07.998708 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:07.998683 2568 generic.go:358] "Generic (PLEG): container finished" podID="4148d88d-fea6-4539-8c24-81aff5a953f7" containerID="34aa698541f7f671c1e60d19126d42988abb92dba9cbc2b5af57e57a1c4ad297" exitCode=0
Apr 16 14:53:07.998841 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:07.998748 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5ch67" event={"ID":"4148d88d-fea6-4539-8c24-81aff5a953f7","Type":"ContainerDied","Data":"34aa698541f7f671c1e60d19126d42988abb92dba9cbc2b5af57e57a1c4ad297"}
Apr 16 14:53:08.011398 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:08.011374 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bff4l_4e495dc8-7e20-4024-b33a-c9b6c6c1291f/ovn-acl-logging/0.log"
Apr 16 14:53:08.011690 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:08.011671 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" event={"ID":"4e495dc8-7e20-4024-b33a-c9b6c6c1291f","Type":"ContainerStarted","Data":"1da0afbe876438ba9628dbc38ee0b323f42425ef059d8ba5721e9c6996dcdd0b"}
Apr 16 14:53:08.011943 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:08.011927 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bff4l"
Apr 16 14:53:08.011995 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:08.011958 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bff4l"
Apr 16 14:53:08.026043 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:08.026019 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bff4l"
Apr 16 14:53:08.047645 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:08.047597 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bff4l" podStartSLOduration=9.819421864 podStartE2EDuration="27.047587211s" podCreationTimestamp="2026-04-16 14:52:41 +0000 UTC" firstStartedPulling="2026-04-16 14:52:44.383443911 +0000 UTC m=+3.147396006" lastFinishedPulling="2026-04-16 14:53:01.611609259 +0000 UTC m=+20.375561353" observedRunningTime="2026-04-16 14:53:08.047393226 +0000 UTC m=+26.811345336" watchObservedRunningTime="2026-04-16 14:53:08.047587211 +0000 UTC m=+26.811539321"
Apr 16 14:53:08.826350 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:08.826194 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-twkfq"]
Apr 16 14:53:08.826480 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:08.826462 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-twkfq"
Apr 16 14:53:08.826605 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:08.826583 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-twkfq" podUID="b932e53d-9993-47b8-a2cb-940fc759370d"
Apr 16 14:53:08.826923 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:08.826883 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-2zgf8"]
Apr 16 14:53:08.827027 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:08.827015 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2zgf8"
Apr 16 14:53:08.827145 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:08.827127 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2zgf8" podUID="38ffecc9-f9c3-4e05-972f-ee25cb7922e6"
Apr 16 14:53:08.829079 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:08.829058 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-wr4kj"]
Apr 16 14:53:08.829178 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:08.829147 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wr4kj"
Apr 16 14:53:08.829223 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:08.829206 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wr4kj" podUID="d293eadf-7dbb-4770-b547-d28be07dbdf1"
Apr 16 14:53:08.943171 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:08.943120 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/38ffecc9-f9c3-4e05-972f-ee25cb7922e6-original-pull-secret\") pod \"global-pull-secret-syncer-2zgf8\" (UID: \"38ffecc9-f9c3-4e05-972f-ee25cb7922e6\") " pod="kube-system/global-pull-secret-syncer-2zgf8"
Apr 16 14:53:08.943472 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:08.943251 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 14:53:08.943472 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:08.943300 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38ffecc9-f9c3-4e05-972f-ee25cb7922e6-original-pull-secret podName:38ffecc9-f9c3-4e05-972f-ee25cb7922e6 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:24.943285677 +0000 UTC m=+43.707237772 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/38ffecc9-f9c3-4e05-972f-ee25cb7922e6-original-pull-secret") pod "global-pull-secret-syncer-2zgf8" (UID: "38ffecc9-f9c3-4e05-972f-ee25cb7922e6") : object "kube-system"/"original-pull-secret" not registered Apr 16 14:53:09.015111 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:09.015078 2568 generic.go:358] "Generic (PLEG): container finished" podID="4148d88d-fea6-4539-8c24-81aff5a953f7" containerID="d09aa578433bfd15f11edba63fe99d7f8c11d41d76fa3b97e9100239787b3369" exitCode=0 Apr 16 14:53:09.015207 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:09.015157 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5ch67" event={"ID":"4148d88d-fea6-4539-8c24-81aff5a953f7","Type":"ContainerDied","Data":"d09aa578433bfd15f11edba63fe99d7f8c11d41d76fa3b97e9100239787b3369"} Apr 16 14:53:10.019387 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:10.019324 2568 generic.go:358] "Generic (PLEG): container finished" podID="4148d88d-fea6-4539-8c24-81aff5a953f7" containerID="1fb983735d94cfb61d2178436618afdbbc37782f05b47f780fb842091e44d96c" exitCode=0 Apr 16 14:53:10.019747 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:10.019404 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5ch67" event={"ID":"4148d88d-fea6-4539-8c24-81aff5a953f7","Type":"ContainerDied","Data":"1fb983735d94cfb61d2178436618afdbbc37782f05b47f780fb842091e44d96c"} Apr 16 14:53:10.859252 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:10.859220 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wr4kj" Apr 16 14:53:10.859778 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:10.859350 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wr4kj" podUID="d293eadf-7dbb-4770-b547-d28be07dbdf1" Apr 16 14:53:10.859778 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:10.859627 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-twkfq" Apr 16 14:53:10.859778 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:10.859747 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-twkfq" podUID="b932e53d-9993-47b8-a2cb-940fc759370d" Apr 16 14:53:10.860014 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:10.859844 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2zgf8" Apr 16 14:53:10.860014 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:10.859960 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-2zgf8" podUID="38ffecc9-f9c3-4e05-972f-ee25cb7922e6" Apr 16 14:53:12.858606 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:12.858573 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wr4kj" Apr 16 14:53:12.858606 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:12.858592 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2zgf8" Apr 16 14:53:12.859215 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:12.858685 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-twkfq" Apr 16 14:53:12.859215 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:12.858702 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wr4kj" podUID="d293eadf-7dbb-4770-b547-d28be07dbdf1" Apr 16 14:53:12.859215 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:12.858791 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-twkfq" podUID="b932e53d-9993-47b8-a2cb-940fc759370d" Apr 16 14:53:12.859215 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:12.858860 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2zgf8" podUID="38ffecc9-f9c3-4e05-972f-ee25cb7922e6" Apr 16 14:53:14.573235 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.573210 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-83.ec2.internal" event="NodeReady" Apr 16 14:53:14.573766 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.573341 2568 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 14:53:14.607277 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.607244 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-58b5c4dfdf-8xt9b"] Apr 16 14:53:14.635870 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.635849 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-r8fjc"] Apr 16 14:53:14.636033 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.636011 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b" Apr 16 14:53:14.638805 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.638772 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 14:53:14.638947 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.638835 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 14:53:14.639029 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.638975 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 14:53:14.639672 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.639585 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-q9vr8\"" Apr 16 14:53:14.644725 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.644695 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 14:53:14.651743 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.651724 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-5drdr"] Apr 16 14:53:14.651877 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.651864 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-r8fjc" Apr 16 14:53:14.654580 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.654471 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 14:53:14.654580 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.654569 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 14:53:14.654733 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.654710 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-nrznh\"" Apr 16 14:53:14.668585 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.668566 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-58b5c4dfdf-8xt9b"] Apr 16 14:53:14.668669 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.668593 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-r8fjc"] Apr 16 14:53:14.668669 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.668608 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5drdr"] Apr 16 14:53:14.668769 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.668679 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5drdr" Apr 16 14:53:14.671518 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.671494 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 14:53:14.671623 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.671599 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 14:53:14.671789 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.671774 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9k9b9\"" Apr 16 14:53:14.672019 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.672004 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 14:53:14.791338 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.791311 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hcc4\" (UniqueName: \"kubernetes.io/projected/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-kube-api-access-5hcc4\") pod \"image-registry-58b5c4dfdf-8xt9b\" (UID: \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\") " pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b" Apr 16 14:53:14.791497 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.791356 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e918a1e-6b74-48f2-a4ef-6164f643d8d7-metrics-tls\") pod \"dns-default-r8fjc\" (UID: \"8e918a1e-6b74-48f2-a4ef-6164f643d8d7\") " pod="openshift-dns/dns-default-r8fjc" Apr 16 14:53:14.791497 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.791409 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-registry-certificates\") pod \"image-registry-58b5c4dfdf-8xt9b\" (UID: \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\") " pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b" Apr 16 14:53:14.791497 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.791442 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pbk4\" (UniqueName: \"kubernetes.io/projected/8e918a1e-6b74-48f2-a4ef-6164f643d8d7-kube-api-access-2pbk4\") pod \"dns-default-r8fjc\" (UID: \"8e918a1e-6b74-48f2-a4ef-6164f643d8d7\") " pod="openshift-dns/dns-default-r8fjc" Apr 16 14:53:14.791660 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.791502 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-image-registry-private-configuration\") pod \"image-registry-58b5c4dfdf-8xt9b\" (UID: \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\") " pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b" Apr 16 14:53:14.791660 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.791541 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-ca-trust-extracted\") pod \"image-registry-58b5c4dfdf-8xt9b\" (UID: \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\") " pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b" Apr 16 14:53:14.791660 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.791579 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/559b0245-445d-420e-9f45-367de89eecc7-cert\") pod \"ingress-canary-5drdr\" (UID: \"559b0245-445d-420e-9f45-367de89eecc7\") " 
pod="openshift-ingress-canary/ingress-canary-5drdr" Apr 16 14:53:14.791660 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.791624 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8e918a1e-6b74-48f2-a4ef-6164f643d8d7-tmp-dir\") pod \"dns-default-r8fjc\" (UID: \"8e918a1e-6b74-48f2-a4ef-6164f643d8d7\") " pod="openshift-dns/dns-default-r8fjc" Apr 16 14:53:14.791818 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.791661 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-installation-pull-secrets\") pod \"image-registry-58b5c4dfdf-8xt9b\" (UID: \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\") " pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b" Apr 16 14:53:14.791818 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.791678 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-bound-sa-token\") pod \"image-registry-58b5c4dfdf-8xt9b\" (UID: \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\") " pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b" Apr 16 14:53:14.791818 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.791695 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-registry-tls\") pod \"image-registry-58b5c4dfdf-8xt9b\" (UID: \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\") " pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b" Apr 16 14:53:14.791818 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.791743 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-5mpfb\" (UniqueName: \"kubernetes.io/projected/559b0245-445d-420e-9f45-367de89eecc7-kube-api-access-5mpfb\") pod \"ingress-canary-5drdr\" (UID: \"559b0245-445d-420e-9f45-367de89eecc7\") " pod="openshift-ingress-canary/ingress-canary-5drdr" Apr 16 14:53:14.791818 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.791772 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8e918a1e-6b74-48f2-a4ef-6164f643d8d7-config-volume\") pod \"dns-default-r8fjc\" (UID: \"8e918a1e-6b74-48f2-a4ef-6164f643d8d7\") " pod="openshift-dns/dns-default-r8fjc" Apr 16 14:53:14.791818 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.791795 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-trusted-ca\") pod \"image-registry-58b5c4dfdf-8xt9b\" (UID: \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\") " pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b" Apr 16 14:53:14.858691 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.858635 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-twkfq" Apr 16 14:53:14.858799 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.858636 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wr4kj" Apr 16 14:53:14.858799 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.858636 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-2zgf8" Apr 16 14:53:14.860978 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.860955 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 14:53:14.861158 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.861135 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 14:53:14.861158 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.861154 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lnt9d\"" Apr 16 14:53:14.861324 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.861166 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-z9txl\"" Apr 16 14:53:14.861324 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.861184 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 14:53:14.861324 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.861139 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 14:53:14.892816 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.892781 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5mpfb\" (UniqueName: \"kubernetes.io/projected/559b0245-445d-420e-9f45-367de89eecc7-kube-api-access-5mpfb\") pod \"ingress-canary-5drdr\" (UID: \"559b0245-445d-420e-9f45-367de89eecc7\") " pod="openshift-ingress-canary/ingress-canary-5drdr" Apr 16 14:53:14.892933 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.892833 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/8e918a1e-6b74-48f2-a4ef-6164f643d8d7-config-volume\") pod \"dns-default-r8fjc\" (UID: \"8e918a1e-6b74-48f2-a4ef-6164f643d8d7\") " pod="openshift-dns/dns-default-r8fjc" Apr 16 14:53:14.892933 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.892858 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-trusted-ca\") pod \"image-registry-58b5c4dfdf-8xt9b\" (UID: \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\") " pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b" Apr 16 14:53:14.892933 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.892904 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5hcc4\" (UniqueName: \"kubernetes.io/projected/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-kube-api-access-5hcc4\") pod \"image-registry-58b5c4dfdf-8xt9b\" (UID: \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\") " pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b" Apr 16 14:53:14.893098 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.892933 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e918a1e-6b74-48f2-a4ef-6164f643d8d7-metrics-tls\") pod \"dns-default-r8fjc\" (UID: \"8e918a1e-6b74-48f2-a4ef-6164f643d8d7\") " pod="openshift-dns/dns-default-r8fjc" Apr 16 14:53:14.893098 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.892975 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-registry-certificates\") pod \"image-registry-58b5c4dfdf-8xt9b\" (UID: \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\") " pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b" Apr 16 14:53:14.893098 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.893000 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2pbk4\" (UniqueName: \"kubernetes.io/projected/8e918a1e-6b74-48f2-a4ef-6164f643d8d7-kube-api-access-2pbk4\") pod \"dns-default-r8fjc\" (UID: \"8e918a1e-6b74-48f2-a4ef-6164f643d8d7\") " pod="openshift-dns/dns-default-r8fjc" Apr 16 14:53:14.893098 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.893029 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-image-registry-private-configuration\") pod \"image-registry-58b5c4dfdf-8xt9b\" (UID: \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\") " pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b" Apr 16 14:53:14.893098 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:14.893093 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:14.893336 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:14.893169 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e918a1e-6b74-48f2-a4ef-6164f643d8d7-metrics-tls podName:8e918a1e-6b74-48f2-a4ef-6164f643d8d7 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:15.39314867 +0000 UTC m=+34.157100762 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8e918a1e-6b74-48f2-a4ef-6164f643d8d7-metrics-tls") pod "dns-default-r8fjc" (UID: "8e918a1e-6b74-48f2-a4ef-6164f643d8d7") : secret "dns-default-metrics-tls" not found Apr 16 14:53:14.893336 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.893192 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-ca-trust-extracted\") pod \"image-registry-58b5c4dfdf-8xt9b\" (UID: \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\") " pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b" Apr 16 14:53:14.893336 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.893230 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/559b0245-445d-420e-9f45-367de89eecc7-cert\") pod \"ingress-canary-5drdr\" (UID: \"559b0245-445d-420e-9f45-367de89eecc7\") " pod="openshift-ingress-canary/ingress-canary-5drdr" Apr 16 14:53:14.893336 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.893283 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8e918a1e-6b74-48f2-a4ef-6164f643d8d7-tmp-dir\") pod \"dns-default-r8fjc\" (UID: \"8e918a1e-6b74-48f2-a4ef-6164f643d8d7\") " pod="openshift-dns/dns-default-r8fjc" Apr 16 14:53:14.893336 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.893314 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-installation-pull-secrets\") pod \"image-registry-58b5c4dfdf-8xt9b\" (UID: \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\") " pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b" Apr 16 14:53:14.893593 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.893339 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-bound-sa-token\") pod \"image-registry-58b5c4dfdf-8xt9b\" (UID: \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\") " pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b" Apr 16 14:53:14.893593 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.893368 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-registry-tls\") pod \"image-registry-58b5c4dfdf-8xt9b\" (UID: \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\") " pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b" Apr 16 14:53:14.893593 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:14.893378 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:14.893593 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:14.893436 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/559b0245-445d-420e-9f45-367de89eecc7-cert podName:559b0245-445d-420e-9f45-367de89eecc7 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:15.393418331 +0000 UTC m=+34.157370433 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/559b0245-445d-420e-9f45-367de89eecc7-cert") pod "ingress-canary-5drdr" (UID: "559b0245-445d-420e-9f45-367de89eecc7") : secret "canary-serving-cert" not found Apr 16 14:53:14.893593 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:14.893505 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:53:14.893593 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:14.893522 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-58b5c4dfdf-8xt9b: secret "image-registry-tls" not found Apr 16 14:53:14.893593 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.893571 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-registry-certificates\") pod \"image-registry-58b5c4dfdf-8xt9b\" (UID: \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\") " pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b" Apr 16 14:53:14.893593 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:14.893583 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-registry-tls podName:4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac nodeName:}" failed. No retries permitted until 2026-04-16 14:53:15.393567341 +0000 UTC m=+34.157519437 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-registry-tls") pod "image-registry-58b5c4dfdf-8xt9b" (UID: "4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac") : secret "image-registry-tls" not found Apr 16 14:53:14.894000 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.893571 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8e918a1e-6b74-48f2-a4ef-6164f643d8d7-config-volume\") pod \"dns-default-r8fjc\" (UID: \"8e918a1e-6b74-48f2-a4ef-6164f643d8d7\") " pod="openshift-dns/dns-default-r8fjc" Apr 16 14:53:14.894000 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.893816 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8e918a1e-6b74-48f2-a4ef-6164f643d8d7-tmp-dir\") pod \"dns-default-r8fjc\" (UID: \"8e918a1e-6b74-48f2-a4ef-6164f643d8d7\") " pod="openshift-dns/dns-default-r8fjc" Apr 16 14:53:14.894000 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.893833 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-trusted-ca\") pod \"image-registry-58b5c4dfdf-8xt9b\" (UID: \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\") " pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b" Apr 16 14:53:14.894000 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.893919 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-ca-trust-extracted\") pod \"image-registry-58b5c4dfdf-8xt9b\" (UID: \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\") " pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b" Apr 16 14:53:14.897906 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.897873 2568 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-installation-pull-secrets\") pod \"image-registry-58b5c4dfdf-8xt9b\" (UID: \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\") " pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b" Apr 16 14:53:14.898004 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.897884 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-image-registry-private-configuration\") pod \"image-registry-58b5c4dfdf-8xt9b\" (UID: \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\") " pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b" Apr 16 14:53:14.901707 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.901656 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mpfb\" (UniqueName: \"kubernetes.io/projected/559b0245-445d-420e-9f45-367de89eecc7-kube-api-access-5mpfb\") pod \"ingress-canary-5drdr\" (UID: \"559b0245-445d-420e-9f45-367de89eecc7\") " pod="openshift-ingress-canary/ingress-canary-5drdr" Apr 16 14:53:14.901867 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.901845 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pbk4\" (UniqueName: \"kubernetes.io/projected/8e918a1e-6b74-48f2-a4ef-6164f643d8d7-kube-api-access-2pbk4\") pod \"dns-default-r8fjc\" (UID: \"8e918a1e-6b74-48f2-a4ef-6164f643d8d7\") " pod="openshift-dns/dns-default-r8fjc" Apr 16 14:53:14.901976 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.901940 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-bound-sa-token\") pod \"image-registry-58b5c4dfdf-8xt9b\" (UID: \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\") " pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b" Apr 
16 14:53:14.901976 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:14.901959 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hcc4\" (UniqueName: \"kubernetes.io/projected/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-kube-api-access-5hcc4\") pod \"image-registry-58b5c4dfdf-8xt9b\" (UID: \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\") " pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b" Apr 16 14:53:15.397581 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:15.397540 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/559b0245-445d-420e-9f45-367de89eecc7-cert\") pod \"ingress-canary-5drdr\" (UID: \"559b0245-445d-420e-9f45-367de89eecc7\") " pod="openshift-ingress-canary/ingress-canary-5drdr" Apr 16 14:53:15.397838 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:15.397604 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-registry-tls\") pod \"image-registry-58b5c4dfdf-8xt9b\" (UID: \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\") " pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b" Apr 16 14:53:15.397838 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:15.397655 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e918a1e-6b74-48f2-a4ef-6164f643d8d7-metrics-tls\") pod \"dns-default-r8fjc\" (UID: \"8e918a1e-6b74-48f2-a4ef-6164f643d8d7\") " pod="openshift-dns/dns-default-r8fjc" Apr 16 14:53:15.397838 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:15.397710 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:15.397838 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:15.397756 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret 
"dns-default-metrics-tls" not found Apr 16 14:53:15.397838 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:15.397781 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/559b0245-445d-420e-9f45-367de89eecc7-cert podName:559b0245-445d-420e-9f45-367de89eecc7 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:16.397761102 +0000 UTC m=+35.161713195 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/559b0245-445d-420e-9f45-367de89eecc7-cert") pod "ingress-canary-5drdr" (UID: "559b0245-445d-420e-9f45-367de89eecc7") : secret "canary-serving-cert" not found Apr 16 14:53:15.397838 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:15.397805 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e918a1e-6b74-48f2-a4ef-6164f643d8d7-metrics-tls podName:8e918a1e-6b74-48f2-a4ef-6164f643d8d7 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:16.397789709 +0000 UTC m=+35.161741800 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8e918a1e-6b74-48f2-a4ef-6164f643d8d7-metrics-tls") pod "dns-default-r8fjc" (UID: "8e918a1e-6b74-48f2-a4ef-6164f643d8d7") : secret "dns-default-metrics-tls" not found Apr 16 14:53:15.397838 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:15.397822 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:53:15.397838 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:15.397836 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-58b5c4dfdf-8xt9b: secret "image-registry-tls" not found Apr 16 14:53:15.398280 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:15.397884 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-registry-tls podName:4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac nodeName:}" failed. No retries permitted until 2026-04-16 14:53:16.39787292 +0000 UTC m=+35.161825015 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-registry-tls") pod "image-registry-58b5c4dfdf-8xt9b" (UID: "4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac") : secret "image-registry-tls" not found Apr 16 14:53:15.498638 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:15.498611 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b932e53d-9993-47b8-a2cb-940fc759370d-metrics-certs\") pod \"network-metrics-daemon-twkfq\" (UID: \"b932e53d-9993-47b8-a2cb-940fc759370d\") " pod="openshift-multus/network-metrics-daemon-twkfq" Apr 16 14:53:15.498773 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:15.498748 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 14:53:15.498811 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:15.498803 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b932e53d-9993-47b8-a2cb-940fc759370d-metrics-certs podName:b932e53d-9993-47b8-a2cb-940fc759370d nodeName:}" failed. No retries permitted until 2026-04-16 14:53:47.498786657 +0000 UTC m=+66.262738758 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b932e53d-9993-47b8-a2cb-940fc759370d-metrics-certs") pod "network-metrics-daemon-twkfq" (UID: "b932e53d-9993-47b8-a2cb-940fc759370d") : secret "metrics-daemon-secret" not found Apr 16 14:53:15.599268 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:15.599242 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-94s6x\" (UniqueName: \"kubernetes.io/projected/d293eadf-7dbb-4770-b547-d28be07dbdf1-kube-api-access-94s6x\") pod \"network-check-target-wr4kj\" (UID: \"d293eadf-7dbb-4770-b547-d28be07dbdf1\") " pod="openshift-network-diagnostics/network-check-target-wr4kj" Apr 16 14:53:15.601524 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:15.601501 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-94s6x\" (UniqueName: \"kubernetes.io/projected/d293eadf-7dbb-4770-b547-d28be07dbdf1-kube-api-access-94s6x\") pod \"network-check-target-wr4kj\" (UID: \"d293eadf-7dbb-4770-b547-d28be07dbdf1\") " pod="openshift-network-diagnostics/network-check-target-wr4kj" Apr 16 14:53:15.778180 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:15.778126 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wr4kj" Apr 16 14:53:16.041192 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:16.041039 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-wr4kj"] Apr 16 14:53:16.048231 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:53:16.048205 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd293eadf_7dbb_4770_b547_d28be07dbdf1.slice/crio-5bfd74422bc88f9c3b287e3d5a60edef7614828ead9616f640f4cb15bda91ff3 WatchSource:0}: Error finding container 5bfd74422bc88f9c3b287e3d5a60edef7614828ead9616f640f4cb15bda91ff3: Status 404 returned error can't find the container with id 5bfd74422bc88f9c3b287e3d5a60edef7614828ead9616f640f4cb15bda91ff3 Apr 16 14:53:16.406147 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:16.406122 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e918a1e-6b74-48f2-a4ef-6164f643d8d7-metrics-tls\") pod \"dns-default-r8fjc\" (UID: \"8e918a1e-6b74-48f2-a4ef-6164f643d8d7\") " pod="openshift-dns/dns-default-r8fjc" Apr 16 14:53:16.406272 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:16.406161 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/559b0245-445d-420e-9f45-367de89eecc7-cert\") pod \"ingress-canary-5drdr\" (UID: \"559b0245-445d-420e-9f45-367de89eecc7\") " pod="openshift-ingress-canary/ingress-canary-5drdr" Apr 16 14:53:16.406272 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:16.406189 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-registry-tls\") pod \"image-registry-58b5c4dfdf-8xt9b\" (UID: \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\") " 
pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b" Apr 16 14:53:16.406272 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:16.406270 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:53:16.406466 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:16.406272 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:16.406466 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:16.406347 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e918a1e-6b74-48f2-a4ef-6164f643d8d7-metrics-tls podName:8e918a1e-6b74-48f2-a4ef-6164f643d8d7 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:18.406326854 +0000 UTC m=+37.170278956 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8e918a1e-6b74-48f2-a4ef-6164f643d8d7-metrics-tls") pod "dns-default-r8fjc" (UID: "8e918a1e-6b74-48f2-a4ef-6164f643d8d7") : secret "dns-default-metrics-tls" not found Apr 16 14:53:16.406466 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:16.406272 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:16.406466 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:16.406403 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/559b0245-445d-420e-9f45-367de89eecc7-cert podName:559b0245-445d-420e-9f45-367de89eecc7 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:18.406391694 +0000 UTC m=+37.170343783 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/559b0245-445d-420e-9f45-367de89eecc7-cert") pod "ingress-canary-5drdr" (UID: "559b0245-445d-420e-9f45-367de89eecc7") : secret "canary-serving-cert" not found Apr 16 14:53:16.406466 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:16.406281 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-58b5c4dfdf-8xt9b: secret "image-registry-tls" not found Apr 16 14:53:16.406466 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:16.406430 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-registry-tls podName:4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac nodeName:}" failed. No retries permitted until 2026-04-16 14:53:18.406423687 +0000 UTC m=+37.170375776 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-registry-tls") pod "image-registry-58b5c4dfdf-8xt9b" (UID: "4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac") : secret "image-registry-tls" not found Apr 16 14:53:17.037262 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:17.037226 2568 generic.go:358] "Generic (PLEG): container finished" podID="4148d88d-fea6-4539-8c24-81aff5a953f7" containerID="9d6c850dd832de729bcd7181ff1fe0c10c075a158eb3a1249dc7ed2b1c5b3717" exitCode=0 Apr 16 14:53:17.037755 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:17.037314 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5ch67" event={"ID":"4148d88d-fea6-4539-8c24-81aff5a953f7","Type":"ContainerDied","Data":"9d6c850dd832de729bcd7181ff1fe0c10c075a158eb3a1249dc7ed2b1c5b3717"} Apr 16 14:53:17.038504 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:17.038456 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-wr4kj" 
event={"ID":"d293eadf-7dbb-4770-b547-d28be07dbdf1","Type":"ContainerStarted","Data":"5bfd74422bc88f9c3b287e3d5a60edef7614828ead9616f640f4cb15bda91ff3"} Apr 16 14:53:18.043029 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:18.042996 2568 generic.go:358] "Generic (PLEG): container finished" podID="4148d88d-fea6-4539-8c24-81aff5a953f7" containerID="8b881269b8c28adfd3cb3282b2f5e886d45db5a28e7d5fea24d4357fa1cda192" exitCode=0 Apr 16 14:53:18.043408 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:18.043071 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5ch67" event={"ID":"4148d88d-fea6-4539-8c24-81aff5a953f7","Type":"ContainerDied","Data":"8b881269b8c28adfd3cb3282b2f5e886d45db5a28e7d5fea24d4357fa1cda192"} Apr 16 14:53:18.421407 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:18.421366 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e918a1e-6b74-48f2-a4ef-6164f643d8d7-metrics-tls\") pod \"dns-default-r8fjc\" (UID: \"8e918a1e-6b74-48f2-a4ef-6164f643d8d7\") " pod="openshift-dns/dns-default-r8fjc" Apr 16 14:53:18.421561 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:18.421441 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/559b0245-445d-420e-9f45-367de89eecc7-cert\") pod \"ingress-canary-5drdr\" (UID: \"559b0245-445d-420e-9f45-367de89eecc7\") " pod="openshift-ingress-canary/ingress-canary-5drdr" Apr 16 14:53:18.421561 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:18.421488 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-registry-tls\") pod \"image-registry-58b5c4dfdf-8xt9b\" (UID: \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\") " pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b" Apr 16 14:53:18.421561 ip-10-0-140-83 
kubenswrapper[2568]: E0416 14:53:18.421511 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:18.421710 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:18.421597 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:18.421710 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:18.421606 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:53:18.421710 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:18.421623 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-58b5c4dfdf-8xt9b: secret "image-registry-tls" not found Apr 16 14:53:18.421710 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:18.421613 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e918a1e-6b74-48f2-a4ef-6164f643d8d7-metrics-tls podName:8e918a1e-6b74-48f2-a4ef-6164f643d8d7 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:22.421590513 +0000 UTC m=+41.185542605 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8e918a1e-6b74-48f2-a4ef-6164f643d8d7-metrics-tls") pod "dns-default-r8fjc" (UID: "8e918a1e-6b74-48f2-a4ef-6164f643d8d7") : secret "dns-default-metrics-tls" not found Apr 16 14:53:18.421710 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:18.421675 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-registry-tls podName:4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac nodeName:}" failed. No retries permitted until 2026-04-16 14:53:22.421657723 +0000 UTC m=+41.185609819 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-registry-tls") pod "image-registry-58b5c4dfdf-8xt9b" (UID: "4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac") : secret "image-registry-tls" not found Apr 16 14:53:18.421710 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:18.421689 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/559b0245-445d-420e-9f45-367de89eecc7-cert podName:559b0245-445d-420e-9f45-367de89eecc7 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:22.421680477 +0000 UTC m=+41.185632569 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/559b0245-445d-420e-9f45-367de89eecc7-cert") pod "ingress-canary-5drdr" (UID: "559b0245-445d-420e-9f45-367de89eecc7") : secret "canary-serving-cert" not found Apr 16 14:53:19.047631 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:19.047503 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5ch67" event={"ID":"4148d88d-fea6-4539-8c24-81aff5a953f7","Type":"ContainerStarted","Data":"3c2dc5827f7266baf5d315ad7da7cc0d1d8c0d7b7ffb03f8e984d88cef42bf3e"} Apr 16 14:53:20.050405 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:20.050366 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-wr4kj" event={"ID":"d293eadf-7dbb-4770-b547-d28be07dbdf1","Type":"ContainerStarted","Data":"46965262edfc1fe283c7859f42ea280f0d712fb052eec2ac2d940331f6fdc190"} Apr 16 14:53:20.050815 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:20.050615 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-wr4kj" Apr 16 14:53:20.066562 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:20.066517 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/multus-additional-cni-plugins-5ch67" podStartSLOduration=6.556815396 podStartE2EDuration="38.066504481s" podCreationTimestamp="2026-04-16 14:52:42 +0000 UTC" firstStartedPulling="2026-04-16 14:52:44.388240203 +0000 UTC m=+3.152192307" lastFinishedPulling="2026-04-16 14:53:15.897929292 +0000 UTC m=+34.661881392" observedRunningTime="2026-04-16 14:53:19.076301081 +0000 UTC m=+37.840253192" watchObservedRunningTime="2026-04-16 14:53:20.066504481 +0000 UTC m=+38.830456596" Apr 16 14:53:20.066873 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:20.066847 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-wr4kj" podStartSLOduration=36.137449213 podStartE2EDuration="39.066838724s" podCreationTimestamp="2026-04-16 14:52:41 +0000 UTC" firstStartedPulling="2026-04-16 14:53:16.050112118 +0000 UTC m=+34.814064208" lastFinishedPulling="2026-04-16 14:53:18.97950163 +0000 UTC m=+37.743453719" observedRunningTime="2026-04-16 14:53:20.065513318 +0000 UTC m=+38.829465430" watchObservedRunningTime="2026-04-16 14:53:20.066838724 +0000 UTC m=+38.830790835" Apr 16 14:53:22.450085 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:22.450040 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-registry-tls\") pod \"image-registry-58b5c4dfdf-8xt9b\" (UID: \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\") " pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b" Apr 16 14:53:22.450538 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:22.450119 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e918a1e-6b74-48f2-a4ef-6164f643d8d7-metrics-tls\") pod \"dns-default-r8fjc\" (UID: \"8e918a1e-6b74-48f2-a4ef-6164f643d8d7\") " pod="openshift-dns/dns-default-r8fjc" Apr 16 14:53:22.450538 ip-10-0-140-83 
kubenswrapper[2568]: I0416 14:53:22.450170 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/559b0245-445d-420e-9f45-367de89eecc7-cert\") pod \"ingress-canary-5drdr\" (UID: \"559b0245-445d-420e-9f45-367de89eecc7\") " pod="openshift-ingress-canary/ingress-canary-5drdr" Apr 16 14:53:22.450538 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:22.450177 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:53:22.450538 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:22.450201 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-58b5c4dfdf-8xt9b: secret "image-registry-tls" not found Apr 16 14:53:22.450538 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:22.450257 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-registry-tls podName:4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac nodeName:}" failed. No retries permitted until 2026-04-16 14:53:30.450236366 +0000 UTC m=+49.214188459 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-registry-tls") pod "image-registry-58b5c4dfdf-8xt9b" (UID: "4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac") : secret "image-registry-tls" not found Apr 16 14:53:22.450538 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:22.450263 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:22.450538 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:22.450277 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:22.450538 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:22.450318 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e918a1e-6b74-48f2-a4ef-6164f643d8d7-metrics-tls podName:8e918a1e-6b74-48f2-a4ef-6164f643d8d7 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:30.45030353 +0000 UTC m=+49.214255618 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8e918a1e-6b74-48f2-a4ef-6164f643d8d7-metrics-tls") pod "dns-default-r8fjc" (UID: "8e918a1e-6b74-48f2-a4ef-6164f643d8d7") : secret "dns-default-metrics-tls" not found Apr 16 14:53:22.450538 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:22.450335 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/559b0245-445d-420e-9f45-367de89eecc7-cert podName:559b0245-445d-420e-9f45-367de89eecc7 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:30.450327186 +0000 UTC m=+49.214279274 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/559b0245-445d-420e-9f45-367de89eecc7-cert") pod "ingress-canary-5drdr" (UID: "559b0245-445d-420e-9f45-367de89eecc7") : secret "canary-serving-cert" not found Apr 16 14:53:24.965616 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:24.965581 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/38ffecc9-f9c3-4e05-972f-ee25cb7922e6-original-pull-secret\") pod \"global-pull-secret-syncer-2zgf8\" (UID: \"38ffecc9-f9c3-4e05-972f-ee25cb7922e6\") " pod="kube-system/global-pull-secret-syncer-2zgf8" Apr 16 14:53:24.970329 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:24.970297 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/38ffecc9-f9c3-4e05-972f-ee25cb7922e6-original-pull-secret\") pod \"global-pull-secret-syncer-2zgf8\" (UID: \"38ffecc9-f9c3-4e05-972f-ee25cb7922e6\") " pod="kube-system/global-pull-secret-syncer-2zgf8" Apr 16 14:53:25.083055 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:25.083031 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-2zgf8" Apr 16 14:53:25.212500 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:25.212466 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-2zgf8"] Apr 16 14:53:25.216362 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:53:25.216300 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38ffecc9_f9c3_4e05_972f_ee25cb7922e6.slice/crio-a326c75b8d212ca43dc2edb89d43473a1a26c56b05de099e1e9d6dcdac42c177 WatchSource:0}: Error finding container a326c75b8d212ca43dc2edb89d43473a1a26c56b05de099e1e9d6dcdac42c177: Status 404 returned error can't find the container with id a326c75b8d212ca43dc2edb89d43473a1a26c56b05de099e1e9d6dcdac42c177 Apr 16 14:53:26.061468 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:26.061428 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-2zgf8" event={"ID":"38ffecc9-f9c3-4e05-972f-ee25cb7922e6","Type":"ContainerStarted","Data":"a326c75b8d212ca43dc2edb89d43473a1a26c56b05de099e1e9d6dcdac42c177"} Apr 16 14:53:30.070115 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:30.070073 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-2zgf8" event={"ID":"38ffecc9-f9c3-4e05-972f-ee25cb7922e6","Type":"ContainerStarted","Data":"d96395ae6680928a0806b834054e52afbace5657df90d5a28b76aaae7fb3d444"} Apr 16 14:53:30.084577 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:30.084527 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-2zgf8" podStartSLOduration=33.224948755 podStartE2EDuration="37.084513977s" podCreationTimestamp="2026-04-16 14:52:53 +0000 UTC" firstStartedPulling="2026-04-16 14:53:25.218009598 +0000 UTC m=+43.981961687" lastFinishedPulling="2026-04-16 14:53:29.077574819 +0000 UTC m=+47.841526909" 
observedRunningTime="2026-04-16 14:53:30.083718452 +0000 UTC m=+48.847670563" watchObservedRunningTime="2026-04-16 14:53:30.084513977 +0000 UTC m=+48.848466088" Apr 16 14:53:30.500944 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:30.500858 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e918a1e-6b74-48f2-a4ef-6164f643d8d7-metrics-tls\") pod \"dns-default-r8fjc\" (UID: \"8e918a1e-6b74-48f2-a4ef-6164f643d8d7\") " pod="openshift-dns/dns-default-r8fjc" Apr 16 14:53:30.501063 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:30.501001 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:30.501063 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:30.501012 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/559b0245-445d-420e-9f45-367de89eecc7-cert\") pod \"ingress-canary-5drdr\" (UID: \"559b0245-445d-420e-9f45-367de89eecc7\") " pod="openshift-ingress-canary/ingress-canary-5drdr" Apr 16 14:53:30.501189 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:30.501066 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-registry-tls\") pod \"image-registry-58b5c4dfdf-8xt9b\" (UID: \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\") " pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b" Apr 16 14:53:30.501189 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:30.501092 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e918a1e-6b74-48f2-a4ef-6164f643d8d7-metrics-tls podName:8e918a1e-6b74-48f2-a4ef-6164f643d8d7 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:46.50107151 +0000 UTC m=+65.265023605 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8e918a1e-6b74-48f2-a4ef-6164f643d8d7-metrics-tls") pod "dns-default-r8fjc" (UID: "8e918a1e-6b74-48f2-a4ef-6164f643d8d7") : secret "dns-default-metrics-tls" not found Apr 16 14:53:30.501189 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:30.501136 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:30.501189 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:30.501176 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/559b0245-445d-420e-9f45-367de89eecc7-cert podName:559b0245-445d-420e-9f45-367de89eecc7 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:46.501163331 +0000 UTC m=+65.265115435 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/559b0245-445d-420e-9f45-367de89eecc7-cert") pod "ingress-canary-5drdr" (UID: "559b0245-445d-420e-9f45-367de89eecc7") : secret "canary-serving-cert" not found Apr 16 14:53:30.501189 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:30.501179 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:53:30.501377 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:30.501195 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-58b5c4dfdf-8xt9b: secret "image-registry-tls" not found Apr 16 14:53:30.501377 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:30.501244 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-registry-tls podName:4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac nodeName:}" failed. No retries permitted until 2026-04-16 14:53:46.501226855 +0000 UTC m=+65.265178973 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-registry-tls") pod "image-registry-58b5c4dfdf-8xt9b" (UID: "4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac") : secret "image-registry-tls" not found
Apr 16 14:53:40.038542 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:40.038513 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bff4l"
Apr 16 14:53:46.508832 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:46.508790 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-registry-tls\") pod \"image-registry-58b5c4dfdf-8xt9b\" (UID: \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\") " pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b"
Apr 16 14:53:46.508832 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:46.508838 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e918a1e-6b74-48f2-a4ef-6164f643d8d7-metrics-tls\") pod \"dns-default-r8fjc\" (UID: \"8e918a1e-6b74-48f2-a4ef-6164f643d8d7\") " pod="openshift-dns/dns-default-r8fjc"
Apr 16 14:53:46.509274 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:46.508880 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/559b0245-445d-420e-9f45-367de89eecc7-cert\") pod \"ingress-canary-5drdr\" (UID: \"559b0245-445d-420e-9f45-367de89eecc7\") " pod="openshift-ingress-canary/ingress-canary-5drdr"
Apr 16 14:53:46.509274 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:46.508952 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 14:53:46.509274 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:46.508977 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-58b5c4dfdf-8xt9b: secret "image-registry-tls" not found
Apr 16 14:53:46.509274 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:46.509015 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:53:46.509274 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:46.509015 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:53:46.509274 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:46.509052 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-registry-tls podName:4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac nodeName:}" failed. No retries permitted until 2026-04-16 14:54:18.509031187 +0000 UTC m=+97.272983333 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-registry-tls") pod "image-registry-58b5c4dfdf-8xt9b" (UID: "4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac") : secret "image-registry-tls" not found
Apr 16 14:53:46.509274 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:46.509090 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/559b0245-445d-420e-9f45-367de89eecc7-cert podName:559b0245-445d-420e-9f45-367de89eecc7 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:18.509073364 +0000 UTC m=+97.273025452 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/559b0245-445d-420e-9f45-367de89eecc7-cert") pod "ingress-canary-5drdr" (UID: "559b0245-445d-420e-9f45-367de89eecc7") : secret "canary-serving-cert" not found
Apr 16 14:53:46.509274 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:46.509104 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e918a1e-6b74-48f2-a4ef-6164f643d8d7-metrics-tls podName:8e918a1e-6b74-48f2-a4ef-6164f643d8d7 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:18.509096595 +0000 UTC m=+97.273048685 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8e918a1e-6b74-48f2-a4ef-6164f643d8d7-metrics-tls") pod "dns-default-r8fjc" (UID: "8e918a1e-6b74-48f2-a4ef-6164f643d8d7") : secret "dns-default-metrics-tls" not found
Apr 16 14:53:47.515302 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:47.515270 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b932e53d-9993-47b8-a2cb-940fc759370d-metrics-certs\") pod \"network-metrics-daemon-twkfq\" (UID: \"b932e53d-9993-47b8-a2cb-940fc759370d\") " pod="openshift-multus/network-metrics-daemon-twkfq"
Apr 16 14:53:47.515677 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:47.515404 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 14:53:47.515677 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:53:47.515479 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b932e53d-9993-47b8-a2cb-940fc759370d-metrics-certs podName:b932e53d-9993-47b8-a2cb-940fc759370d nodeName:}" failed. No retries permitted until 2026-04-16 14:54:51.515462825 +0000 UTC m=+130.279414915 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b932e53d-9993-47b8-a2cb-940fc759370d-metrics-certs") pod "network-metrics-daemon-twkfq" (UID: "b932e53d-9993-47b8-a2cb-940fc759370d") : secret "metrics-daemon-secret" not found
Apr 16 14:53:52.056322 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:53:52.056291 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-wr4kj"
Apr 16 14:54:18.521263 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:54:18.521226 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e918a1e-6b74-48f2-a4ef-6164f643d8d7-metrics-tls\") pod \"dns-default-r8fjc\" (UID: \"8e918a1e-6b74-48f2-a4ef-6164f643d8d7\") " pod="openshift-dns/dns-default-r8fjc"
Apr 16 14:54:18.521678 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:54:18.521278 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/559b0245-445d-420e-9f45-367de89eecc7-cert\") pod \"ingress-canary-5drdr\" (UID: \"559b0245-445d-420e-9f45-367de89eecc7\") " pod="openshift-ingress-canary/ingress-canary-5drdr"
Apr 16 14:54:18.521678 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:54:18.521388 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:54:18.521678 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:54:18.521406 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-registry-tls\") pod \"image-registry-58b5c4dfdf-8xt9b\" (UID: \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\") " pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b"
Apr 16 14:54:18.521678 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:54:18.521410 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:54:18.521678 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:54:18.521469 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/559b0245-445d-420e-9f45-367de89eecc7-cert podName:559b0245-445d-420e-9f45-367de89eecc7 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:22.521447142 +0000 UTC m=+161.285399250 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/559b0245-445d-420e-9f45-367de89eecc7-cert") pod "ingress-canary-5drdr" (UID: "559b0245-445d-420e-9f45-367de89eecc7") : secret "canary-serving-cert" not found
Apr 16 14:54:18.521678 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:54:18.521481 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 14:54:18.521678 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:54:18.521503 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-58b5c4dfdf-8xt9b: secret "image-registry-tls" not found
Apr 16 14:54:18.521678 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:54:18.521508 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e918a1e-6b74-48f2-a4ef-6164f643d8d7-metrics-tls podName:8e918a1e-6b74-48f2-a4ef-6164f643d8d7 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:22.521497679 +0000 UTC m=+161.285449776 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8e918a1e-6b74-48f2-a4ef-6164f643d8d7-metrics-tls") pod "dns-default-r8fjc" (UID: "8e918a1e-6b74-48f2-a4ef-6164f643d8d7") : secret "dns-default-metrics-tls" not found
Apr 16 14:54:18.521678 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:54:18.521603 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-registry-tls podName:4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac nodeName:}" failed. No retries permitted until 2026-04-16 14:55:22.521581965 +0000 UTC m=+161.285534054 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-registry-tls") pod "image-registry-58b5c4dfdf-8xt9b" (UID: "4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac") : secret "image-registry-tls" not found
Apr 16 14:54:50.275715 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:54:50.275674 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-76456bfb6-dhbg6"]
Apr 16 14:54:50.277788 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:54:50.277769 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-76456bfb6-dhbg6"
Apr 16 14:54:50.280113 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:54:50.280078 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 16 14:54:50.280113 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:54:50.280093 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 16 14:54:50.280299 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:54:50.280177 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-jxmsl\""
Apr 16 14:54:50.280299 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:54:50.280257 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 16 14:54:50.280977 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:54:50.280958 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 16 14:54:50.280977 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:54:50.280969 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 16 14:54:50.281127 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:54:50.281020 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 16 14:54:50.288172 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:54:50.288148 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-76456bfb6-dhbg6"]
Apr 16 14:54:50.434572 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:54:50.434546 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0f86a21-4db7-4159-a529-cd2f872688be-metrics-certs\") pod \"router-default-76456bfb6-dhbg6\" (UID: \"b0f86a21-4db7-4159-a529-cd2f872688be\") " pod="openshift-ingress/router-default-76456bfb6-dhbg6"
Apr 16 14:54:50.434717 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:54:50.434585 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b0f86a21-4db7-4159-a529-cd2f872688be-default-certificate\") pod \"router-default-76456bfb6-dhbg6\" (UID: \"b0f86a21-4db7-4159-a529-cd2f872688be\") " pod="openshift-ingress/router-default-76456bfb6-dhbg6"
Apr 16 14:54:50.434717 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:54:50.434614 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0f86a21-4db7-4159-a529-cd2f872688be-service-ca-bundle\") pod \"router-default-76456bfb6-dhbg6\" (UID: \"b0f86a21-4db7-4159-a529-cd2f872688be\") " pod="openshift-ingress/router-default-76456bfb6-dhbg6"
Apr 16 14:54:50.434717 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:54:50.434659 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b0f86a21-4db7-4159-a529-cd2f872688be-stats-auth\") pod \"router-default-76456bfb6-dhbg6\" (UID: \"b0f86a21-4db7-4159-a529-cd2f872688be\") " pod="openshift-ingress/router-default-76456bfb6-dhbg6"
Apr 16 14:54:50.434822 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:54:50.434739 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl5sr\" (UniqueName: \"kubernetes.io/projected/b0f86a21-4db7-4159-a529-cd2f872688be-kube-api-access-jl5sr\") pod \"router-default-76456bfb6-dhbg6\" (UID: \"b0f86a21-4db7-4159-a529-cd2f872688be\") " pod="openshift-ingress/router-default-76456bfb6-dhbg6"
Apr 16 14:54:50.535419 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:54:50.535364 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jl5sr\" (UniqueName: \"kubernetes.io/projected/b0f86a21-4db7-4159-a529-cd2f872688be-kube-api-access-jl5sr\") pod \"router-default-76456bfb6-dhbg6\" (UID: \"b0f86a21-4db7-4159-a529-cd2f872688be\") " pod="openshift-ingress/router-default-76456bfb6-dhbg6"
Apr 16 14:54:50.535419 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:54:50.535412 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0f86a21-4db7-4159-a529-cd2f872688be-metrics-certs\") pod \"router-default-76456bfb6-dhbg6\" (UID: \"b0f86a21-4db7-4159-a529-cd2f872688be\") " pod="openshift-ingress/router-default-76456bfb6-dhbg6"
Apr 16 14:54:50.535511 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:54:50.535459 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b0f86a21-4db7-4159-a529-cd2f872688be-default-certificate\") pod \"router-default-76456bfb6-dhbg6\" (UID: \"b0f86a21-4db7-4159-a529-cd2f872688be\") " pod="openshift-ingress/router-default-76456bfb6-dhbg6"
Apr 16 14:54:50.535594 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:54:50.535572 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 14:54:50.535667 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:54:50.535646 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0f86a21-4db7-4159-a529-cd2f872688be-service-ca-bundle\") pod \"router-default-76456bfb6-dhbg6\" (UID: \"b0f86a21-4db7-4159-a529-cd2f872688be\") " pod="openshift-ingress/router-default-76456bfb6-dhbg6"
Apr 16 14:54:50.535667 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:54:50.535658 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0f86a21-4db7-4159-a529-cd2f872688be-metrics-certs podName:b0f86a21-4db7-4159-a529-cd2f872688be nodeName:}" failed. No retries permitted until 2026-04-16 14:54:51.035642376 +0000 UTC m=+129.799594468 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b0f86a21-4db7-4159-a529-cd2f872688be-metrics-certs") pod "router-default-76456bfb6-dhbg6" (UID: "b0f86a21-4db7-4159-a529-cd2f872688be") : secret "router-metrics-certs-default" not found
Apr 16 14:54:50.535758 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:54:50.535680 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b0f86a21-4db7-4159-a529-cd2f872688be-stats-auth\") pod \"router-default-76456bfb6-dhbg6\" (UID: \"b0f86a21-4db7-4159-a529-cd2f872688be\") " pod="openshift-ingress/router-default-76456bfb6-dhbg6"
Apr 16 14:54:50.535844 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:54:50.535825 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b0f86a21-4db7-4159-a529-cd2f872688be-service-ca-bundle podName:b0f86a21-4db7-4159-a529-cd2f872688be nodeName:}" failed. No retries permitted until 2026-04-16 14:54:51.035803654 +0000 UTC m=+129.799755748 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b0f86a21-4db7-4159-a529-cd2f872688be-service-ca-bundle") pod "router-default-76456bfb6-dhbg6" (UID: "b0f86a21-4db7-4159-a529-cd2f872688be") : configmap references non-existent config key: service-ca.crt
Apr 16 14:54:50.538119 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:54:50.538090 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b0f86a21-4db7-4159-a529-cd2f872688be-default-certificate\") pod \"router-default-76456bfb6-dhbg6\" (UID: \"b0f86a21-4db7-4159-a529-cd2f872688be\") " pod="openshift-ingress/router-default-76456bfb6-dhbg6"
Apr 16 14:54:50.538812 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:54:50.538791 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b0f86a21-4db7-4159-a529-cd2f872688be-stats-auth\") pod \"router-default-76456bfb6-dhbg6\" (UID: \"b0f86a21-4db7-4159-a529-cd2f872688be\") " pod="openshift-ingress/router-default-76456bfb6-dhbg6"
Apr 16 14:54:50.545528 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:54:50.545500 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl5sr\" (UniqueName: \"kubernetes.io/projected/b0f86a21-4db7-4159-a529-cd2f872688be-kube-api-access-jl5sr\") pod \"router-default-76456bfb6-dhbg6\" (UID: \"b0f86a21-4db7-4159-a529-cd2f872688be\") " pod="openshift-ingress/router-default-76456bfb6-dhbg6"
Apr 16 14:54:51.039175 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:54:51.039135 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0f86a21-4db7-4159-a529-cd2f872688be-metrics-certs\") pod \"router-default-76456bfb6-dhbg6\" (UID: \"b0f86a21-4db7-4159-a529-cd2f872688be\") " pod="openshift-ingress/router-default-76456bfb6-dhbg6"
Apr 16 14:54:51.039324 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:54:51.039199 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0f86a21-4db7-4159-a529-cd2f872688be-service-ca-bundle\") pod \"router-default-76456bfb6-dhbg6\" (UID: \"b0f86a21-4db7-4159-a529-cd2f872688be\") " pod="openshift-ingress/router-default-76456bfb6-dhbg6"
Apr 16 14:54:51.039324 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:54:51.039279 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 14:54:51.039395 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:54:51.039331 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b0f86a21-4db7-4159-a529-cd2f872688be-service-ca-bundle podName:b0f86a21-4db7-4159-a529-cd2f872688be nodeName:}" failed. No retries permitted until 2026-04-16 14:54:52.039316509 +0000 UTC m=+130.803268598 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b0f86a21-4db7-4159-a529-cd2f872688be-service-ca-bundle") pod "router-default-76456bfb6-dhbg6" (UID: "b0f86a21-4db7-4159-a529-cd2f872688be") : configmap references non-existent config key: service-ca.crt
Apr 16 14:54:51.039395 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:54:51.039348 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0f86a21-4db7-4159-a529-cd2f872688be-metrics-certs podName:b0f86a21-4db7-4159-a529-cd2f872688be nodeName:}" failed. No retries permitted until 2026-04-16 14:54:52.039341565 +0000 UTC m=+130.803293654 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b0f86a21-4db7-4159-a529-cd2f872688be-metrics-certs") pod "router-default-76456bfb6-dhbg6" (UID: "b0f86a21-4db7-4159-a529-cd2f872688be") : secret "router-metrics-certs-default" not found
Apr 16 14:54:51.542835 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:54:51.542813 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b932e53d-9993-47b8-a2cb-940fc759370d-metrics-certs\") pod \"network-metrics-daemon-twkfq\" (UID: \"b932e53d-9993-47b8-a2cb-940fc759370d\") " pod="openshift-multus/network-metrics-daemon-twkfq"
Apr 16 14:54:51.543194 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:54:51.542971 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 14:54:51.543194 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:54:51.543037 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b932e53d-9993-47b8-a2cb-940fc759370d-metrics-certs podName:b932e53d-9993-47b8-a2cb-940fc759370d nodeName:}" failed. No retries permitted until 2026-04-16 14:56:53.543022112 +0000 UTC m=+252.306974206 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b932e53d-9993-47b8-a2cb-940fc759370d-metrics-certs") pod "network-metrics-daemon-twkfq" (UID: "b932e53d-9993-47b8-a2cb-940fc759370d") : secret "metrics-daemon-secret" not found
Apr 16 14:54:52.047101 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:54:52.047072 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0f86a21-4db7-4159-a529-cd2f872688be-metrics-certs\") pod \"router-default-76456bfb6-dhbg6\" (UID: \"b0f86a21-4db7-4159-a529-cd2f872688be\") " pod="openshift-ingress/router-default-76456bfb6-dhbg6"
Apr 16 14:54:52.047240 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:54:52.047129 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0f86a21-4db7-4159-a529-cd2f872688be-service-ca-bundle\") pod \"router-default-76456bfb6-dhbg6\" (UID: \"b0f86a21-4db7-4159-a529-cd2f872688be\") " pod="openshift-ingress/router-default-76456bfb6-dhbg6"
Apr 16 14:54:52.047240 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:54:52.047169 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 14:54:52.047240 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:54:52.047235 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0f86a21-4db7-4159-a529-cd2f872688be-metrics-certs podName:b0f86a21-4db7-4159-a529-cd2f872688be nodeName:}" failed. No retries permitted until 2026-04-16 14:54:54.047217366 +0000 UTC m=+132.811169485 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b0f86a21-4db7-4159-a529-cd2f872688be-metrics-certs") pod "router-default-76456bfb6-dhbg6" (UID: "b0f86a21-4db7-4159-a529-cd2f872688be") : secret "router-metrics-certs-default" not found
Apr 16 14:54:52.047410 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:54:52.047254 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b0f86a21-4db7-4159-a529-cd2f872688be-service-ca-bundle podName:b0f86a21-4db7-4159-a529-cd2f872688be nodeName:}" failed. No retries permitted until 2026-04-16 14:54:54.047245222 +0000 UTC m=+132.811197317 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b0f86a21-4db7-4159-a529-cd2f872688be-service-ca-bundle") pod "router-default-76456bfb6-dhbg6" (UID: "b0f86a21-4db7-4159-a529-cd2f872688be") : configmap references non-existent config key: service-ca.crt
Apr 16 14:54:54.061276 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:54:54.061244 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0f86a21-4db7-4159-a529-cd2f872688be-service-ca-bundle\") pod \"router-default-76456bfb6-dhbg6\" (UID: \"b0f86a21-4db7-4159-a529-cd2f872688be\") " pod="openshift-ingress/router-default-76456bfb6-dhbg6"
Apr 16 14:54:54.061710 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:54:54.061324 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0f86a21-4db7-4159-a529-cd2f872688be-metrics-certs\") pod \"router-default-76456bfb6-dhbg6\" (UID: \"b0f86a21-4db7-4159-a529-cd2f872688be\") " pod="openshift-ingress/router-default-76456bfb6-dhbg6"
Apr 16 14:54:54.061710 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:54:54.061418 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b0f86a21-4db7-4159-a529-cd2f872688be-service-ca-bundle podName:b0f86a21-4db7-4159-a529-cd2f872688be nodeName:}" failed. No retries permitted until 2026-04-16 14:54:58.061398157 +0000 UTC m=+136.825350251 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b0f86a21-4db7-4159-a529-cd2f872688be-service-ca-bundle") pod "router-default-76456bfb6-dhbg6" (UID: "b0f86a21-4db7-4159-a529-cd2f872688be") : configmap references non-existent config key: service-ca.crt
Apr 16 14:54:54.061710 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:54:54.061423 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 14:54:54.061710 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:54:54.061459 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0f86a21-4db7-4159-a529-cd2f872688be-metrics-certs podName:b0f86a21-4db7-4159-a529-cd2f872688be nodeName:}" failed. No retries permitted until 2026-04-16 14:54:58.061451105 +0000 UTC m=+136.825403194 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b0f86a21-4db7-4159-a529-cd2f872688be-metrics-certs") pod "router-default-76456bfb6-dhbg6" (UID: "b0f86a21-4db7-4159-a529-cd2f872688be") : secret "router-metrics-certs-default" not found
Apr 16 14:54:56.396127 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:54:56.396100 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-s956z_f1f8e071-c6dd-48e5-b5d3-e096d8a24e92/dns-node-resolver/0.log"
Apr 16 14:54:57.396379 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:54:57.396351 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-tp4rc_927adbe7-9da4-4d4c-9bd5-3f36e9ef8978/node-ca/0.log"
Apr 16 14:54:58.088368 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:54:58.088344 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0f86a21-4db7-4159-a529-cd2f872688be-metrics-certs\") pod \"router-default-76456bfb6-dhbg6\" (UID: \"b0f86a21-4db7-4159-a529-cd2f872688be\") " pod="openshift-ingress/router-default-76456bfb6-dhbg6"
Apr 16 14:54:58.088500 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:54:58.088403 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0f86a21-4db7-4159-a529-cd2f872688be-service-ca-bundle\") pod \"router-default-76456bfb6-dhbg6\" (UID: \"b0f86a21-4db7-4159-a529-cd2f872688be\") " pod="openshift-ingress/router-default-76456bfb6-dhbg6"
Apr 16 14:54:58.088500 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:54:58.088488 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 14:54:58.088573 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:54:58.088540 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0f86a21-4db7-4159-a529-cd2f872688be-metrics-certs podName:b0f86a21-4db7-4159-a529-cd2f872688be nodeName:}" failed. No retries permitted until 2026-04-16 14:55:06.088527607 +0000 UTC m=+144.852479700 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b0f86a21-4db7-4159-a529-cd2f872688be-metrics-certs") pod "router-default-76456bfb6-dhbg6" (UID: "b0f86a21-4db7-4159-a529-cd2f872688be") : secret "router-metrics-certs-default" not found
Apr 16 14:54:58.088573 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:54:58.088555 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b0f86a21-4db7-4159-a529-cd2f872688be-service-ca-bundle podName:b0f86a21-4db7-4159-a529-cd2f872688be nodeName:}" failed. No retries permitted until 2026-04-16 14:55:06.088547208 +0000 UTC m=+144.852499297 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b0f86a21-4db7-4159-a529-cd2f872688be-service-ca-bundle") pod "router-default-76456bfb6-dhbg6" (UID: "b0f86a21-4db7-4159-a529-cd2f872688be") : configmap references non-existent config key: service-ca.crt
Apr 16 14:55:05.276693 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:05.276654 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-n77nq"]
Apr 16 14:55:05.278695 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:05.278677 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-n77nq"
Apr 16 14:55:05.281296 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:05.281266 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-tfdsw\""
Apr 16 14:55:05.281296 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:05.281285 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 16 14:55:05.282070 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:05.282052 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 16 14:55:05.293939 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:05.293915 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-n77nq"]
Apr 16 14:55:05.441348 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:05.441320 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9gh9\" (UniqueName: \"kubernetes.io/projected/83beaa50-9f28-4ab9-8f66-e8b9202b7e30-kube-api-access-b9gh9\") pod \"migrator-64d4d94569-n77nq\" (UID: \"83beaa50-9f28-4ab9-8f66-e8b9202b7e30\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-n77nq"
Apr 16 14:55:05.543625 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:05.543554 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b9gh9\" (UniqueName: \"kubernetes.io/projected/83beaa50-9f28-4ab9-8f66-e8b9202b7e30-kube-api-access-b9gh9\") pod \"migrator-64d4d94569-n77nq\" (UID: \"83beaa50-9f28-4ab9-8f66-e8b9202b7e30\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-n77nq"
Apr 16 14:55:05.556317 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:05.556294 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9gh9\" (UniqueName: \"kubernetes.io/projected/83beaa50-9f28-4ab9-8f66-e8b9202b7e30-kube-api-access-b9gh9\") pod \"migrator-64d4d94569-n77nq\" (UID: \"83beaa50-9f28-4ab9-8f66-e8b9202b7e30\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-n77nq"
Apr 16 14:55:05.588267 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:05.588247 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-n77nq"
Apr 16 14:55:05.699902 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:05.699860 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-n77nq"]
Apr 16 14:55:05.702698 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:55:05.702672 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83beaa50_9f28_4ab9_8f66_e8b9202b7e30.slice/crio-6be2a1cf4fd89a2692e2b3842fbbcdfdf069ce51bf8b087925ff519f7b8e67e2 WatchSource:0}: Error finding container 6be2a1cf4fd89a2692e2b3842fbbcdfdf069ce51bf8b087925ff519f7b8e67e2: Status 404 returned error can't find the container with id 6be2a1cf4fd89a2692e2b3842fbbcdfdf069ce51bf8b087925ff519f7b8e67e2
Apr 16 14:55:06.148002 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:06.147975 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0f86a21-4db7-4159-a529-cd2f872688be-metrics-certs\") pod \"router-default-76456bfb6-dhbg6\" (UID: \"b0f86a21-4db7-4159-a529-cd2f872688be\") " pod="openshift-ingress/router-default-76456bfb6-dhbg6"
Apr 16 14:55:06.148151 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:06.148132 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0f86a21-4db7-4159-a529-cd2f872688be-service-ca-bundle\") pod \"router-default-76456bfb6-dhbg6\" (UID: \"b0f86a21-4db7-4159-a529-cd2f872688be\") " pod="openshift-ingress/router-default-76456bfb6-dhbg6"
Apr 16 14:55:06.148218 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:55:06.148146 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 14:55:06.148264 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:55:06.148231 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0f86a21-4db7-4159-a529-cd2f872688be-metrics-certs podName:b0f86a21-4db7-4159-a529-cd2f872688be nodeName:}" failed. No retries permitted until 2026-04-16 14:55:22.14820814 +0000 UTC m=+160.912160235 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b0f86a21-4db7-4159-a529-cd2f872688be-metrics-certs") pod "router-default-76456bfb6-dhbg6" (UID: "b0f86a21-4db7-4159-a529-cd2f872688be") : secret "router-metrics-certs-default" not found
Apr 16 14:55:06.148325 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:55:06.148307 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b0f86a21-4db7-4159-a529-cd2f872688be-service-ca-bundle podName:b0f86a21-4db7-4159-a529-cd2f872688be nodeName:}" failed. No retries permitted until 2026-04-16 14:55:22.148285963 +0000 UTC m=+160.912238058 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b0f86a21-4db7-4159-a529-cd2f872688be-service-ca-bundle") pod "router-default-76456bfb6-dhbg6" (UID: "b0f86a21-4db7-4159-a529-cd2f872688be") : configmap references non-existent config key: service-ca.crt
Apr 16 14:55:06.238408 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:06.238367 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-n77nq" event={"ID":"83beaa50-9f28-4ab9-8f66-e8b9202b7e30","Type":"ContainerStarted","Data":"6be2a1cf4fd89a2692e2b3842fbbcdfdf069ce51bf8b087925ff519f7b8e67e2"}
Apr 16 14:55:07.241770 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:07.241733 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-n77nq" event={"ID":"83beaa50-9f28-4ab9-8f66-e8b9202b7e30","Type":"ContainerStarted","Data":"6a9a9fee929ee3edffc6d2085f7ce407de36a28577fadc6f96a5c6fb8bbe909d"}
Apr 16 14:55:07.241770 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:07.241773 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-n77nq" event={"ID":"83beaa50-9f28-4ab9-8f66-e8b9202b7e30","Type":"ContainerStarted","Data":"7d55511f9a32f25999c7776a5bc49d1eddcfd15b0635656616ed1c5581e9d99f"}
Apr 16 14:55:07.260195 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:07.260156 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-n77nq" podStartSLOduration=1.386277594 podStartE2EDuration="2.260143016s" podCreationTimestamp="2026-04-16 14:55:05 +0000 UTC" firstStartedPulling="2026-04-16 14:55:05.704448264 +0000 UTC m=+144.468400356" lastFinishedPulling="2026-04-16 14:55:06.578313675 +0000 UTC m=+145.342265778" observedRunningTime="2026-04-16 14:55:07.259808287 +0000 UTC m=+146.023760399"
watchObservedRunningTime="2026-04-16 14:55:07.260143016 +0000 UTC m=+146.024095128" Apr 16 14:55:07.679744 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:07.679718 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-grl9h"] Apr 16 14:55:07.681815 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:07.681797 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-grl9h" Apr 16 14:55:07.684051 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:07.684025 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 14:55:07.684171 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:07.684090 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 14:55:07.684496 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:07.684477 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 14:55:07.685107 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:07.685084 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-2b2x9\"" Apr 16 14:55:07.685328 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:07.685310 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 14:55:07.697394 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:07.697369 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-grl9h"] Apr 16 14:55:07.863706 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:07.863680 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7wql\" (UniqueName: 
\"kubernetes.io/projected/d8bcd964-3f59-4080-bc40-db541883241c-kube-api-access-b7wql\") pod \"insights-runtime-extractor-grl9h\" (UID: \"d8bcd964-3f59-4080-bc40-db541883241c\") " pod="openshift-insights/insights-runtime-extractor-grl9h" Apr 16 14:55:07.863836 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:07.863751 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d8bcd964-3f59-4080-bc40-db541883241c-crio-socket\") pod \"insights-runtime-extractor-grl9h\" (UID: \"d8bcd964-3f59-4080-bc40-db541883241c\") " pod="openshift-insights/insights-runtime-extractor-grl9h" Apr 16 14:55:07.863836 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:07.863782 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d8bcd964-3f59-4080-bc40-db541883241c-data-volume\") pod \"insights-runtime-extractor-grl9h\" (UID: \"d8bcd964-3f59-4080-bc40-db541883241c\") " pod="openshift-insights/insights-runtime-extractor-grl9h" Apr 16 14:55:07.863836 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:07.863803 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d8bcd964-3f59-4080-bc40-db541883241c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-grl9h\" (UID: \"d8bcd964-3f59-4080-bc40-db541883241c\") " pod="openshift-insights/insights-runtime-extractor-grl9h" Apr 16 14:55:07.863967 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:07.863837 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d8bcd964-3f59-4080-bc40-db541883241c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-grl9h\" (UID: \"d8bcd964-3f59-4080-bc40-db541883241c\") " 
pod="openshift-insights/insights-runtime-extractor-grl9h" Apr 16 14:55:07.964999 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:07.964935 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d8bcd964-3f59-4080-bc40-db541883241c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-grl9h\" (UID: \"d8bcd964-3f59-4080-bc40-db541883241c\") " pod="openshift-insights/insights-runtime-extractor-grl9h" Apr 16 14:55:07.964999 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:07.964981 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b7wql\" (UniqueName: \"kubernetes.io/projected/d8bcd964-3f59-4080-bc40-db541883241c-kube-api-access-b7wql\") pod \"insights-runtime-extractor-grl9h\" (UID: \"d8bcd964-3f59-4080-bc40-db541883241c\") " pod="openshift-insights/insights-runtime-extractor-grl9h" Apr 16 14:55:07.965135 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:07.965032 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d8bcd964-3f59-4080-bc40-db541883241c-crio-socket\") pod \"insights-runtime-extractor-grl9h\" (UID: \"d8bcd964-3f59-4080-bc40-db541883241c\") " pod="openshift-insights/insights-runtime-extractor-grl9h" Apr 16 14:55:07.965135 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:07.965061 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d8bcd964-3f59-4080-bc40-db541883241c-data-volume\") pod \"insights-runtime-extractor-grl9h\" (UID: \"d8bcd964-3f59-4080-bc40-db541883241c\") " pod="openshift-insights/insights-runtime-extractor-grl9h" Apr 16 14:55:07.965135 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:07.965080 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/d8bcd964-3f59-4080-bc40-db541883241c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-grl9h\" (UID: \"d8bcd964-3f59-4080-bc40-db541883241c\") " pod="openshift-insights/insights-runtime-extractor-grl9h" Apr 16 14:55:07.965363 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:07.965148 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d8bcd964-3f59-4080-bc40-db541883241c-crio-socket\") pod \"insights-runtime-extractor-grl9h\" (UID: \"d8bcd964-3f59-4080-bc40-db541883241c\") " pod="openshift-insights/insights-runtime-extractor-grl9h" Apr 16 14:55:07.965363 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:55:07.965186 2568 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 14:55:07.965363 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:55:07.965232 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8bcd964-3f59-4080-bc40-db541883241c-insights-runtime-extractor-tls podName:d8bcd964-3f59-4080-bc40-db541883241c nodeName:}" failed. No retries permitted until 2026-04-16 14:55:08.465216287 +0000 UTC m=+147.229168379 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/d8bcd964-3f59-4080-bc40-db541883241c-insights-runtime-extractor-tls") pod "insights-runtime-extractor-grl9h" (UID: "d8bcd964-3f59-4080-bc40-db541883241c") : secret "insights-runtime-extractor-tls" not found Apr 16 14:55:07.965538 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:07.965397 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d8bcd964-3f59-4080-bc40-db541883241c-data-volume\") pod \"insights-runtime-extractor-grl9h\" (UID: \"d8bcd964-3f59-4080-bc40-db541883241c\") " pod="openshift-insights/insights-runtime-extractor-grl9h" Apr 16 14:55:07.965538 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:07.965459 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d8bcd964-3f59-4080-bc40-db541883241c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-grl9h\" (UID: \"d8bcd964-3f59-4080-bc40-db541883241c\") " pod="openshift-insights/insights-runtime-extractor-grl9h" Apr 16 14:55:07.977691 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:07.977670 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7wql\" (UniqueName: \"kubernetes.io/projected/d8bcd964-3f59-4080-bc40-db541883241c-kube-api-access-b7wql\") pod \"insights-runtime-extractor-grl9h\" (UID: \"d8bcd964-3f59-4080-bc40-db541883241c\") " pod="openshift-insights/insights-runtime-extractor-grl9h" Apr 16 14:55:08.469115 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:08.469083 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d8bcd964-3f59-4080-bc40-db541883241c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-grl9h\" (UID: \"d8bcd964-3f59-4080-bc40-db541883241c\") " 
pod="openshift-insights/insights-runtime-extractor-grl9h" Apr 16 14:55:08.469459 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:55:08.469190 2568 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 14:55:08.469459 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:55:08.469243 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8bcd964-3f59-4080-bc40-db541883241c-insights-runtime-extractor-tls podName:d8bcd964-3f59-4080-bc40-db541883241c nodeName:}" failed. No retries permitted until 2026-04-16 14:55:09.469229807 +0000 UTC m=+148.233181896 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/d8bcd964-3f59-4080-bc40-db541883241c-insights-runtime-extractor-tls") pod "insights-runtime-extractor-grl9h" (UID: "d8bcd964-3f59-4080-bc40-db541883241c") : secret "insights-runtime-extractor-tls" not found Apr 16 14:55:09.475989 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:09.475954 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d8bcd964-3f59-4080-bc40-db541883241c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-grl9h\" (UID: \"d8bcd964-3f59-4080-bc40-db541883241c\") " pod="openshift-insights/insights-runtime-extractor-grl9h" Apr 16 14:55:09.476363 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:55:09.476069 2568 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 14:55:09.476363 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:55:09.476122 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8bcd964-3f59-4080-bc40-db541883241c-insights-runtime-extractor-tls podName:d8bcd964-3f59-4080-bc40-db541883241c nodeName:}" failed. 
No retries permitted until 2026-04-16 14:55:11.476108025 +0000 UTC m=+150.240060114 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/d8bcd964-3f59-4080-bc40-db541883241c-insights-runtime-extractor-tls") pod "insights-runtime-extractor-grl9h" (UID: "d8bcd964-3f59-4080-bc40-db541883241c") : secret "insights-runtime-extractor-tls" not found Apr 16 14:55:11.487122 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:11.487086 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d8bcd964-3f59-4080-bc40-db541883241c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-grl9h\" (UID: \"d8bcd964-3f59-4080-bc40-db541883241c\") " pod="openshift-insights/insights-runtime-extractor-grl9h" Apr 16 14:55:11.487499 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:55:11.487230 2568 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 14:55:11.487499 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:55:11.487290 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8bcd964-3f59-4080-bc40-db541883241c-insights-runtime-extractor-tls podName:d8bcd964-3f59-4080-bc40-db541883241c nodeName:}" failed. No retries permitted until 2026-04-16 14:55:15.487274432 +0000 UTC m=+154.251226525 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/d8bcd964-3f59-4080-bc40-db541883241c-insights-runtime-extractor-tls") pod "insights-runtime-extractor-grl9h" (UID: "d8bcd964-3f59-4080-bc40-db541883241c") : secret "insights-runtime-extractor-tls" not found Apr 16 14:55:15.515317 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:15.515291 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d8bcd964-3f59-4080-bc40-db541883241c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-grl9h\" (UID: \"d8bcd964-3f59-4080-bc40-db541883241c\") " pod="openshift-insights/insights-runtime-extractor-grl9h" Apr 16 14:55:15.517484 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:15.517461 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d8bcd964-3f59-4080-bc40-db541883241c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-grl9h\" (UID: \"d8bcd964-3f59-4080-bc40-db541883241c\") " pod="openshift-insights/insights-runtime-extractor-grl9h" Apr 16 14:55:15.791088 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:15.791015 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-grl9h" Apr 16 14:55:15.905163 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:15.905135 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-grl9h"] Apr 16 14:55:15.909705 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:55:15.909678 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8bcd964_3f59_4080_bc40_db541883241c.slice/crio-3a07d2712d4dd1b6d53b0bf3483aff941701925c9684abae6c5d7f9b20b918f5 WatchSource:0}: Error finding container 3a07d2712d4dd1b6d53b0bf3483aff941701925c9684abae6c5d7f9b20b918f5: Status 404 returned error can't find the container with id 3a07d2712d4dd1b6d53b0bf3483aff941701925c9684abae6c5d7f9b20b918f5 Apr 16 14:55:16.263618 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:16.263587 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-grl9h" event={"ID":"d8bcd964-3f59-4080-bc40-db541883241c","Type":"ContainerStarted","Data":"b899c4360056183dbeaabdbf05ad6a28048ace605345b2368fc309679523c969"} Apr 16 14:55:16.263618 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:16.263622 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-grl9h" event={"ID":"d8bcd964-3f59-4080-bc40-db541883241c","Type":"ContainerStarted","Data":"3a07d2712d4dd1b6d53b0bf3483aff941701925c9684abae6c5d7f9b20b918f5"} Apr 16 14:55:17.267882 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:17.267847 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-grl9h" event={"ID":"d8bcd964-3f59-4080-bc40-db541883241c","Type":"ContainerStarted","Data":"3db4c30afc798b074958a75e58dc18744e53f7cddb3691307f5f12fb3f3146a1"} Apr 16 14:55:17.647472 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:55:17.647433 2568 pod_workers.go:1301] "Error syncing pod, 
skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b" podUID="4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac" Apr 16 14:55:17.661731 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:55:17.661706 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-r8fjc" podUID="8e918a1e-6b74-48f2-a4ef-6164f643d8d7" Apr 16 14:55:17.677708 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:55:17.677686 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-5drdr" podUID="559b0245-445d-420e-9f45-367de89eecc7" Apr 16 14:55:17.872208 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:55:17.872180 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-twkfq" podUID="b932e53d-9993-47b8-a2cb-940fc759370d" Apr 16 14:55:18.272550 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:18.272522 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b" Apr 16 14:55:18.272990 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:18.272521 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-grl9h" event={"ID":"d8bcd964-3f59-4080-bc40-db541883241c","Type":"ContainerStarted","Data":"cf0ca648548ed54b65a99bb1b8160c326d0237b04d770fb7925dafc34ffb9d35"} Apr 16 14:55:18.272990 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:18.272649 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5drdr" Apr 16 14:55:18.272990 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:18.272674 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-r8fjc" Apr 16 14:55:18.289653 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:18.289612 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-grl9h" podStartSLOduration=9.575900687 podStartE2EDuration="11.28959915s" podCreationTimestamp="2026-04-16 14:55:07 +0000 UTC" firstStartedPulling="2026-04-16 14:55:15.958318379 +0000 UTC m=+154.722270472" lastFinishedPulling="2026-04-16 14:55:17.672016838 +0000 UTC m=+156.435968935" observedRunningTime="2026-04-16 14:55:18.288841119 +0000 UTC m=+157.052793232" watchObservedRunningTime="2026-04-16 14:55:18.28959915 +0000 UTC m=+157.053551293" Apr 16 14:55:22.165337 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:22.165302 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0f86a21-4db7-4159-a529-cd2f872688be-metrics-certs\") pod \"router-default-76456bfb6-dhbg6\" (UID: \"b0f86a21-4db7-4159-a529-cd2f872688be\") " pod="openshift-ingress/router-default-76456bfb6-dhbg6" Apr 16 14:55:22.165795 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:22.165467 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0f86a21-4db7-4159-a529-cd2f872688be-service-ca-bundle\") pod \"router-default-76456bfb6-dhbg6\" (UID: \"b0f86a21-4db7-4159-a529-cd2f872688be\") " pod="openshift-ingress/router-default-76456bfb6-dhbg6" Apr 16 14:55:22.166047 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:22.166030 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/b0f86a21-4db7-4159-a529-cd2f872688be-service-ca-bundle\") pod \"router-default-76456bfb6-dhbg6\" (UID: \"b0f86a21-4db7-4159-a529-cd2f872688be\") " pod="openshift-ingress/router-default-76456bfb6-dhbg6" Apr 16 14:55:22.167571 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:22.167553 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0f86a21-4db7-4159-a529-cd2f872688be-metrics-certs\") pod \"router-default-76456bfb6-dhbg6\" (UID: \"b0f86a21-4db7-4159-a529-cd2f872688be\") " pod="openshift-ingress/router-default-76456bfb6-dhbg6" Apr 16 14:55:22.387966 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:22.387940 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-76456bfb6-dhbg6" Apr 16 14:55:22.505204 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:22.505172 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-76456bfb6-dhbg6"] Apr 16 14:55:22.507863 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:55:22.507829 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0f86a21_4db7_4159_a529_cd2f872688be.slice/crio-8b3e43ffcb7a0c08881f205524b954c7c2cb476bee286f54827ccb4321133d4b WatchSource:0}: Error finding container 8b3e43ffcb7a0c08881f205524b954c7c2cb476bee286f54827ccb4321133d4b: Status 404 returned error can't find the container with id 8b3e43ffcb7a0c08881f205524b954c7c2cb476bee286f54827ccb4321133d4b Apr 16 14:55:22.568010 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:22.567985 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e918a1e-6b74-48f2-a4ef-6164f643d8d7-metrics-tls\") pod \"dns-default-r8fjc\" (UID: \"8e918a1e-6b74-48f2-a4ef-6164f643d8d7\") " pod="openshift-dns/dns-default-r8fjc" Apr 16 14:55:22.568118 
ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:22.568033 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/559b0245-445d-420e-9f45-367de89eecc7-cert\") pod \"ingress-canary-5drdr\" (UID: \"559b0245-445d-420e-9f45-367de89eecc7\") " pod="openshift-ingress-canary/ingress-canary-5drdr" Apr 16 14:55:22.568180 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:22.568143 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-registry-tls\") pod \"image-registry-58b5c4dfdf-8xt9b\" (UID: \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\") " pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b" Apr 16 14:55:22.570434 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:22.570415 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e918a1e-6b74-48f2-a4ef-6164f643d8d7-metrics-tls\") pod \"dns-default-r8fjc\" (UID: \"8e918a1e-6b74-48f2-a4ef-6164f643d8d7\") " pod="openshift-dns/dns-default-r8fjc" Apr 16 14:55:22.570541 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:22.570416 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/559b0245-445d-420e-9f45-367de89eecc7-cert\") pod \"ingress-canary-5drdr\" (UID: \"559b0245-445d-420e-9f45-367de89eecc7\") " pod="openshift-ingress-canary/ingress-canary-5drdr" Apr 16 14:55:22.570709 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:22.570691 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-registry-tls\") pod \"image-registry-58b5c4dfdf-8xt9b\" (UID: \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\") " pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b" Apr 16 14:55:22.776720 ip-10-0-140-83 
kubenswrapper[2568]: I0416 14:55:22.776648 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-nrznh\"" Apr 16 14:55:22.776720 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:22.776659 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9k9b9\"" Apr 16 14:55:22.776720 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:22.776710 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-q9vr8\"" Apr 16 14:55:22.783620 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:22.783599 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-r8fjc" Apr 16 14:55:22.783715 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:22.783621 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b" Apr 16 14:55:22.783766 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:22.783738 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5drdr"
Apr 16 14:55:22.917591 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:22.917538 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-r8fjc"]
Apr 16 14:55:22.921287 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:55:22.921261 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e918a1e_6b74_48f2_a4ef_6164f643d8d7.slice/crio-1a07bba7579ac42fac6880d0095189510c5d85ab2c4429e5bedb2c0b020af385 WatchSource:0}: Error finding container 1a07bba7579ac42fac6880d0095189510c5d85ab2c4429e5bedb2c0b020af385: Status 404 returned error can't find the container with id 1a07bba7579ac42fac6880d0095189510c5d85ab2c4429e5bedb2c0b020af385
Apr 16 14:55:23.133678 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:23.133659 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5drdr"]
Apr 16 14:55:23.136143 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:55:23.136123 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod559b0245_445d_420e_9f45_367de89eecc7.slice/crio-1d85cabff62db8ce396f86cbbcd1e09f4c287e7d54066fb719d3bda28b9ee481 WatchSource:0}: Error finding container 1d85cabff62db8ce396f86cbbcd1e09f4c287e7d54066fb719d3bda28b9ee481: Status 404 returned error can't find the container with id 1d85cabff62db8ce396f86cbbcd1e09f4c287e7d54066fb719d3bda28b9ee481
Apr 16 14:55:23.139224 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:23.139203 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-58b5c4dfdf-8xt9b"]
Apr 16 14:55:23.142826 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:55:23.142799 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e28e16c_5b14_4b9f_8bf3_e89602d7b7ac.slice/crio-7851d44dc2f90ef19463d2394b7caac24ac08537b139804f0c78dc8f55701fe3 WatchSource:0}: Error finding container 7851d44dc2f90ef19463d2394b7caac24ac08537b139804f0c78dc8f55701fe3: Status 404 returned error can't find the container with id 7851d44dc2f90ef19463d2394b7caac24ac08537b139804f0c78dc8f55701fe3
Apr 16 14:55:23.286065 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:23.286040 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b" event={"ID":"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac","Type":"ContainerStarted","Data":"bd9a4429c319f6625197cec25827994548aa7f7511a00280d949b537c36574b5"}
Apr 16 14:55:23.286394 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:23.286069 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b" event={"ID":"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac","Type":"ContainerStarted","Data":"7851d44dc2f90ef19463d2394b7caac24ac08537b139804f0c78dc8f55701fe3"}
Apr 16 14:55:23.286394 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:23.286173 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b"
Apr 16 14:55:23.287352 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:23.287333 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-76456bfb6-dhbg6" event={"ID":"b0f86a21-4db7-4159-a529-cd2f872688be","Type":"ContainerStarted","Data":"beab7357f3e13d70ae7ec36475f5a1e73a374d8b003553faf209ac4a62373698"}
Apr 16 14:55:23.287450 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:23.287356 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-76456bfb6-dhbg6" event={"ID":"b0f86a21-4db7-4159-a529-cd2f872688be","Type":"ContainerStarted","Data":"8b3e43ffcb7a0c08881f205524b954c7c2cb476bee286f54827ccb4321133d4b"}
Apr 16 14:55:23.288312 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:23.288293 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5drdr" event={"ID":"559b0245-445d-420e-9f45-367de89eecc7","Type":"ContainerStarted","Data":"1d85cabff62db8ce396f86cbbcd1e09f4c287e7d54066fb719d3bda28b9ee481"}
Apr 16 14:55:23.289240 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:23.289222 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-r8fjc" event={"ID":"8e918a1e-6b74-48f2-a4ef-6164f643d8d7","Type":"ContainerStarted","Data":"1a07bba7579ac42fac6880d0095189510c5d85ab2c4429e5bedb2c0b020af385"}
Apr 16 14:55:23.305100 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:23.305059 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b" podStartSLOduration=144.305048634 podStartE2EDuration="2m24.305048634s" podCreationTimestamp="2026-04-16 14:52:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:55:23.303822947 +0000 UTC m=+162.067775082" watchObservedRunningTime="2026-04-16 14:55:23.305048634 +0000 UTC m=+162.069000744"
Apr 16 14:55:23.321591 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:23.321560 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-76456bfb6-dhbg6" podStartSLOduration=33.321551546 podStartE2EDuration="33.321551546s" podCreationTimestamp="2026-04-16 14:54:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:55:23.321014259 +0000 UTC m=+162.084966371" watchObservedRunningTime="2026-04-16 14:55:23.321551546 +0000 UTC m=+162.085503658"
Apr 16 14:55:23.388504 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:23.388433 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-76456bfb6-dhbg6"
Apr 16 14:55:23.391459 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:23.391435 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-76456bfb6-dhbg6"
Apr 16 14:55:24.296314 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:24.296275 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-76456bfb6-dhbg6"
Apr 16 14:55:24.297624 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:24.297605 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-76456bfb6-dhbg6"
Apr 16 14:55:25.299822 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:25.299791 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5drdr" event={"ID":"559b0245-445d-420e-9f45-367de89eecc7","Type":"ContainerStarted","Data":"d27042ea23d7266e9924bc8b28bd367cfcbe043a9876e7537e238eb90c8db5c1"}
Apr 16 14:55:25.301251 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:25.301223 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-r8fjc" event={"ID":"8e918a1e-6b74-48f2-a4ef-6164f643d8d7","Type":"ContainerStarted","Data":"f82dcb868f06fd45a7b7f6c796b4c3259d72d1eea66ce3164a4ecafe81df74c2"}
Apr 16 14:55:25.301332 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:25.301261 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-r8fjc" event={"ID":"8e918a1e-6b74-48f2-a4ef-6164f643d8d7","Type":"ContainerStarted","Data":"4b61767ecf4acde46449d1a3df2a4600aca4ade22729cf006e10ee973cdb4f86"}
Apr 16 14:55:25.315952 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:25.315913 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-5drdr" podStartSLOduration=129.71712554 podStartE2EDuration="2m11.315885533s" podCreationTimestamp="2026-04-16 14:53:14 +0000 UTC" firstStartedPulling="2026-04-16 14:55:23.138042018 +0000 UTC m=+161.901994107" lastFinishedPulling="2026-04-16 14:55:24.736802011 +0000 UTC m=+163.500754100" observedRunningTime="2026-04-16 14:55:25.315023337 +0000 UTC m=+164.078975449" watchObservedRunningTime="2026-04-16 14:55:25.315885533 +0000 UTC m=+164.079837622"
Apr 16 14:55:25.330958 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:25.330921 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-r8fjc" podStartSLOduration=129.5215054 podStartE2EDuration="2m11.330911148s" podCreationTimestamp="2026-04-16 14:53:14 +0000 UTC" firstStartedPulling="2026-04-16 14:55:22.923152708 +0000 UTC m=+161.687104797" lastFinishedPulling="2026-04-16 14:55:24.732558441 +0000 UTC m=+163.496510545" observedRunningTime="2026-04-16 14:55:25.330566662 +0000 UTC m=+164.094518773" watchObservedRunningTime="2026-04-16 14:55:25.330911148 +0000 UTC m=+164.094863251"
Apr 16 14:55:26.303451 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:26.303422 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-r8fjc"
Apr 16 14:55:30.082537 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:30.082504 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-vq42l"]
Apr 16 14:55:30.085405 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:30.085383 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-vq42l"
Apr 16 14:55:30.088231 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:30.088204 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-58b5c4dfdf-8xt9b"]
Apr 16 14:55:30.089543 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:30.089521 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-8ksgx\""
Apr 16 14:55:30.097551 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:30.097525 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-vq42l"]
Apr 16 14:55:30.118135 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:30.118111 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m9k7\" (UniqueName: \"kubernetes.io/projected/d2f3e8a7-9701-48e6-bdb2-7748ac1be0d1-kube-api-access-2m9k7\") pod \"network-check-source-7b678d77c7-vq42l\" (UID: \"d2f3e8a7-9701-48e6-bdb2-7748ac1be0d1\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-vq42l"
Apr 16 14:55:30.172255 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:30.172233 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-cpkz7"]
Apr 16 14:55:30.175132 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:30.175114 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-cpkz7"
Apr 16 14:55:30.177578 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:30.177560 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 16 14:55:30.177668 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:30.177597 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-qwsmw\""
Apr 16 14:55:30.185286 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:30.185268 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-cpkz7"]
Apr 16 14:55:30.218969 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:30.218946 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2m9k7\" (UniqueName: \"kubernetes.io/projected/d2f3e8a7-9701-48e6-bdb2-7748ac1be0d1-kube-api-access-2m9k7\") pod \"network-check-source-7b678d77c7-vq42l\" (UID: \"d2f3e8a7-9701-48e6-bdb2-7748ac1be0d1\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-vq42l"
Apr 16 14:55:30.219071 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:30.218982 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/bbfb3dcf-c75d-40a8-8f0c-d34ee3c47a06-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-cpkz7\" (UID: \"bbfb3dcf-c75d-40a8-8f0c-d34ee3c47a06\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-cpkz7"
Apr 16 14:55:30.227985 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:30.227961 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m9k7\" (UniqueName: \"kubernetes.io/projected/d2f3e8a7-9701-48e6-bdb2-7748ac1be0d1-kube-api-access-2m9k7\") pod \"network-check-source-7b678d77c7-vq42l\" (UID: \"d2f3e8a7-9701-48e6-bdb2-7748ac1be0d1\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-vq42l"
Apr 16 14:55:30.319983 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:30.319962 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/bbfb3dcf-c75d-40a8-8f0c-d34ee3c47a06-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-cpkz7\" (UID: \"bbfb3dcf-c75d-40a8-8f0c-d34ee3c47a06\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-cpkz7"
Apr 16 14:55:30.322040 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:30.322016 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/bbfb3dcf-c75d-40a8-8f0c-d34ee3c47a06-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-cpkz7\" (UID: \"bbfb3dcf-c75d-40a8-8f0c-d34ee3c47a06\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-cpkz7"
Apr 16 14:55:30.398407 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:30.398336 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-vq42l"
Apr 16 14:55:30.483251 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:30.483231 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-cpkz7"
Apr 16 14:55:30.511853 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:30.511831 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-vq42l"]
Apr 16 14:55:30.516376 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:55:30.516350 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2f3e8a7_9701_48e6_bdb2_7748ac1be0d1.slice/crio-42a4bed7ee6dd6cdb21df7467e327dff22507ef6d37f38614f4df18130056b6a WatchSource:0}: Error finding container 42a4bed7ee6dd6cdb21df7467e327dff22507ef6d37f38614f4df18130056b6a: Status 404 returned error can't find the container with id 42a4bed7ee6dd6cdb21df7467e327dff22507ef6d37f38614f4df18130056b6a
Apr 16 14:55:30.602589 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:30.602485 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-cpkz7"]
Apr 16 14:55:30.605023 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:55:30.605000 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbfb3dcf_c75d_40a8_8f0c_d34ee3c47a06.slice/crio-24e000fdbd66056e27bdc20a4f9688beb869d495587a3309211df447720dbf7b WatchSource:0}: Error finding container 24e000fdbd66056e27bdc20a4f9688beb869d495587a3309211df447720dbf7b: Status 404 returned error can't find the container with id 24e000fdbd66056e27bdc20a4f9688beb869d495587a3309211df447720dbf7b
Apr 16 14:55:30.859170 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:30.859149 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-twkfq"
Apr 16 14:55:31.317485 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:31.317407 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-cpkz7" event={"ID":"bbfb3dcf-c75d-40a8-8f0c-d34ee3c47a06","Type":"ContainerStarted","Data":"24e000fdbd66056e27bdc20a4f9688beb869d495587a3309211df447720dbf7b"}
Apr 16 14:55:31.318815 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:31.318789 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-vq42l" event={"ID":"d2f3e8a7-9701-48e6-bdb2-7748ac1be0d1","Type":"ContainerStarted","Data":"bd50b0f7948d87b92fd6cafa527fbd75da4926864ed01514bfef843378691850"}
Apr 16 14:55:31.318959 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:31.318818 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-vq42l" event={"ID":"d2f3e8a7-9701-48e6-bdb2-7748ac1be0d1","Type":"ContainerStarted","Data":"42a4bed7ee6dd6cdb21df7467e327dff22507ef6d37f38614f4df18130056b6a"}
Apr 16 14:55:31.337921 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:31.337871 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-vq42l" podStartSLOduration=1.337860035 podStartE2EDuration="1.337860035s" podCreationTimestamp="2026-04-16 14:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:55:31.336500256 +0000 UTC m=+170.100452366" watchObservedRunningTime="2026-04-16 14:55:31.337860035 +0000 UTC m=+170.101812146"
Apr 16 14:55:32.322939 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:32.322902 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-cpkz7" event={"ID":"bbfb3dcf-c75d-40a8-8f0c-d34ee3c47a06","Type":"ContainerStarted","Data":"7965e5f20a85a687af87d06b4e0813fa0e3859c888789eb649e608768adaae64"}
Apr 16 14:55:32.339871 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:32.339826 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-cpkz7" podStartSLOduration=1.323199547 podStartE2EDuration="2.339812042s" podCreationTimestamp="2026-04-16 14:55:30 +0000 UTC" firstStartedPulling="2026-04-16 14:55:30.607076844 +0000 UTC m=+169.371028933" lastFinishedPulling="2026-04-16 14:55:31.623689334 +0000 UTC m=+170.387641428" observedRunningTime="2026-04-16 14:55:32.337837848 +0000 UTC m=+171.101789959" watchObservedRunningTime="2026-04-16 14:55:32.339812042 +0000 UTC m=+171.103764153"
Apr 16 14:55:33.327434 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:33.327395 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-cpkz7"
Apr 16 14:55:33.332437 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:33.332410 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-cpkz7"
Apr 16 14:55:35.958697 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:35.958670 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-68b858fb9c-w6zpx"]
Apr 16 14:55:35.961764 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:35.961744 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68b858fb9c-w6zpx"
Apr 16 14:55:35.965198 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:35.965167 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 16 14:55:35.965198 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:35.965191 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 16 14:55:35.965416 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:35.965205 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 16 14:55:35.965416 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:35.965191 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 16 14:55:35.965416 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:35.965191 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 16 14:55:35.965416 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:35.965174 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 16 14:55:35.965416 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:35.965408 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 16 14:55:35.965634 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:35.965428 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-lrkl7\""
Apr 16 14:55:35.974716 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:35.974695 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68b858fb9c-w6zpx"]
Apr 16 14:55:36.060928 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:36.060887 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a6ce2ed2-9325-417e-aacb-678b806ca490-console-config\") pod \"console-68b858fb9c-w6zpx\" (UID: \"a6ce2ed2-9325-417e-aacb-678b806ca490\") " pod="openshift-console/console-68b858fb9c-w6zpx"
Apr 16 14:55:36.061018 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:36.060931 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a6ce2ed2-9325-417e-aacb-678b806ca490-oauth-serving-cert\") pod \"console-68b858fb9c-w6zpx\" (UID: \"a6ce2ed2-9325-417e-aacb-678b806ca490\") " pod="openshift-console/console-68b858fb9c-w6zpx"
Apr 16 14:55:36.061018 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:36.060949 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a6ce2ed2-9325-417e-aacb-678b806ca490-service-ca\") pod \"console-68b858fb9c-w6zpx\" (UID: \"a6ce2ed2-9325-417e-aacb-678b806ca490\") " pod="openshift-console/console-68b858fb9c-w6zpx"
Apr 16 14:55:36.061018 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:36.060975 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6ce2ed2-9325-417e-aacb-678b806ca490-console-serving-cert\") pod \"console-68b858fb9c-w6zpx\" (UID: \"a6ce2ed2-9325-417e-aacb-678b806ca490\") " pod="openshift-console/console-68b858fb9c-w6zpx"
Apr 16 14:55:36.061121 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:36.061029 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjf79\" (UniqueName: \"kubernetes.io/projected/a6ce2ed2-9325-417e-aacb-678b806ca490-kube-api-access-wjf79\") pod \"console-68b858fb9c-w6zpx\" (UID: \"a6ce2ed2-9325-417e-aacb-678b806ca490\") " pod="openshift-console/console-68b858fb9c-w6zpx"
Apr 16 14:55:36.061121 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:36.061068 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a6ce2ed2-9325-417e-aacb-678b806ca490-console-oauth-config\") pod \"console-68b858fb9c-w6zpx\" (UID: \"a6ce2ed2-9325-417e-aacb-678b806ca490\") " pod="openshift-console/console-68b858fb9c-w6zpx"
Apr 16 14:55:36.161583 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:36.161560 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wjf79\" (UniqueName: \"kubernetes.io/projected/a6ce2ed2-9325-417e-aacb-678b806ca490-kube-api-access-wjf79\") pod \"console-68b858fb9c-w6zpx\" (UID: \"a6ce2ed2-9325-417e-aacb-678b806ca490\") " pod="openshift-console/console-68b858fb9c-w6zpx"
Apr 16 14:55:36.161670 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:36.161596 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a6ce2ed2-9325-417e-aacb-678b806ca490-console-oauth-config\") pod \"console-68b858fb9c-w6zpx\" (UID: \"a6ce2ed2-9325-417e-aacb-678b806ca490\") " pod="openshift-console/console-68b858fb9c-w6zpx"
Apr 16 14:55:36.161670 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:36.161641 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a6ce2ed2-9325-417e-aacb-678b806ca490-console-config\") pod \"console-68b858fb9c-w6zpx\" (UID: \"a6ce2ed2-9325-417e-aacb-678b806ca490\") " pod="openshift-console/console-68b858fb9c-w6zpx"
Apr 16 14:55:36.161670 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:36.161665 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a6ce2ed2-9325-417e-aacb-678b806ca490-oauth-serving-cert\") pod \"console-68b858fb9c-w6zpx\" (UID: \"a6ce2ed2-9325-417e-aacb-678b806ca490\") " pod="openshift-console/console-68b858fb9c-w6zpx"
Apr 16 14:55:36.161778 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:36.161691 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a6ce2ed2-9325-417e-aacb-678b806ca490-service-ca\") pod \"console-68b858fb9c-w6zpx\" (UID: \"a6ce2ed2-9325-417e-aacb-678b806ca490\") " pod="openshift-console/console-68b858fb9c-w6zpx"
Apr 16 14:55:36.161778 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:36.161720 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6ce2ed2-9325-417e-aacb-678b806ca490-console-serving-cert\") pod \"console-68b858fb9c-w6zpx\" (UID: \"a6ce2ed2-9325-417e-aacb-678b806ca490\") " pod="openshift-console/console-68b858fb9c-w6zpx"
Apr 16 14:55:36.162353 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:36.162334 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a6ce2ed2-9325-417e-aacb-678b806ca490-oauth-serving-cert\") pod \"console-68b858fb9c-w6zpx\" (UID: \"a6ce2ed2-9325-417e-aacb-678b806ca490\") " pod="openshift-console/console-68b858fb9c-w6zpx"
Apr 16 14:55:36.162443 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:36.162417 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a6ce2ed2-9325-417e-aacb-678b806ca490-service-ca\") pod \"console-68b858fb9c-w6zpx\" (UID: \"a6ce2ed2-9325-417e-aacb-678b806ca490\") " pod="openshift-console/console-68b858fb9c-w6zpx"
Apr 16 14:55:36.162961 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:36.162941 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a6ce2ed2-9325-417e-aacb-678b806ca490-console-config\") pod \"console-68b858fb9c-w6zpx\" (UID: \"a6ce2ed2-9325-417e-aacb-678b806ca490\") " pod="openshift-console/console-68b858fb9c-w6zpx"
Apr 16 14:55:36.164178 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:36.164154 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a6ce2ed2-9325-417e-aacb-678b806ca490-console-oauth-config\") pod \"console-68b858fb9c-w6zpx\" (UID: \"a6ce2ed2-9325-417e-aacb-678b806ca490\") " pod="openshift-console/console-68b858fb9c-w6zpx"
Apr 16 14:55:36.164178 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:36.164167 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6ce2ed2-9325-417e-aacb-678b806ca490-console-serving-cert\") pod \"console-68b858fb9c-w6zpx\" (UID: \"a6ce2ed2-9325-417e-aacb-678b806ca490\") " pod="openshift-console/console-68b858fb9c-w6zpx"
Apr 16 14:55:36.169747 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:36.169728 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjf79\" (UniqueName: \"kubernetes.io/projected/a6ce2ed2-9325-417e-aacb-678b806ca490-kube-api-access-wjf79\") pod \"console-68b858fb9c-w6zpx\" (UID: \"a6ce2ed2-9325-417e-aacb-678b806ca490\") " pod="openshift-console/console-68b858fb9c-w6zpx"
Apr 16 14:55:36.271498 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:36.271446 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68b858fb9c-w6zpx"
Apr 16 14:55:36.308252 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:36.308229 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-r8fjc"
Apr 16 14:55:36.391710 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:36.391686 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68b858fb9c-w6zpx"]
Apr 16 14:55:36.395340 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:55:36.395312 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6ce2ed2_9325_417e_aacb_678b806ca490.slice/crio-f3f83c649bcac90e68f16f7739fb6922d7778e6f78cd564dbb829e2ad173e04f WatchSource:0}: Error finding container f3f83c649bcac90e68f16f7739fb6922d7778e6f78cd564dbb829e2ad173e04f: Status 404 returned error can't find the container with id f3f83c649bcac90e68f16f7739fb6922d7778e6f78cd564dbb829e2ad173e04f
Apr 16 14:55:37.340594 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:37.340562 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68b858fb9c-w6zpx" event={"ID":"a6ce2ed2-9325-417e-aacb-678b806ca490","Type":"ContainerStarted","Data":"f3f83c649bcac90e68f16f7739fb6922d7778e6f78cd564dbb829e2ad173e04f"}
Apr 16 14:55:39.347381 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:39.347348 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68b858fb9c-w6zpx" event={"ID":"a6ce2ed2-9325-417e-aacb-678b806ca490","Type":"ContainerStarted","Data":"17688ab8249cbed387fe1f1a7a3ae8d24231d0998967a21e058373f8b9f32e6f"}
Apr 16 14:55:39.365391 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:39.365350 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-68b858fb9c-w6zpx" podStartSLOduration=1.961505845 podStartE2EDuration="4.365337367s" podCreationTimestamp="2026-04-16 14:55:35 +0000 UTC" firstStartedPulling="2026-04-16 14:55:36.397086817 +0000 UTC m=+175.161038906" lastFinishedPulling="2026-04-16 14:55:38.80091833 +0000 UTC m=+177.564870428" observedRunningTime="2026-04-16 14:55:39.36403717 +0000 UTC m=+178.127989281" watchObservedRunningTime="2026-04-16 14:55:39.365337367 +0000 UTC m=+178.129289477"
Apr 16 14:55:39.881591 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:39.881563 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-cnx9g"]
Apr 16 14:55:39.884707 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:39.884692 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-cnx9g"
Apr 16 14:55:39.887732 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:39.887709 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 16 14:55:39.887837 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:39.887731 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 14:55:39.887837 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:39.887712 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 14:55:39.888701 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:39.888685 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 14:55:39.888782 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:39.888710 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 16 14:55:39.888782 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:39.888727 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-qvd8x\""
Apr 16 14:55:39.900987 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:39.900966 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-cnx9g"]
Apr 16 14:55:39.949271 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:39.949249 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-rt8hd"]
Apr 16 14:55:39.952260 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:39.952246 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-rt8hd"
Apr 16 14:55:39.954510 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:39.954495 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 14:55:39.954725 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:39.954704 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 14:55:39.954821 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:39.954782 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-48tx8\""
Apr 16 14:55:39.955019 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:39.955003 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 14:55:39.991536 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:39.991512 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx64p\" (UniqueName: \"kubernetes.io/projected/788fd48a-507a-4f66-8d27-4ea0596db9e1-kube-api-access-tx64p\") pod \"openshift-state-metrics-5669946b84-cnx9g\" (UID: \"788fd48a-507a-4f66-8d27-4ea0596db9e1\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-cnx9g"
Apr 16 14:55:39.991641 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:39.991542 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/788fd48a-507a-4f66-8d27-4ea0596db9e1-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-cnx9g\" (UID: \"788fd48a-507a-4f66-8d27-4ea0596db9e1\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-cnx9g"
Apr 16 14:55:39.991641 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:39.991562 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/788fd48a-507a-4f66-8d27-4ea0596db9e1-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-cnx9g\" (UID: \"788fd48a-507a-4f66-8d27-4ea0596db9e1\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-cnx9g"
Apr 16 14:55:39.991754 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:39.991659 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/788fd48a-507a-4f66-8d27-4ea0596db9e1-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-cnx9g\" (UID: \"788fd48a-507a-4f66-8d27-4ea0596db9e1\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-cnx9g"
Apr 16 14:55:40.092920 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.092875 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tx64p\" (UniqueName: \"kubernetes.io/projected/788fd48a-507a-4f66-8d27-4ea0596db9e1-kube-api-access-tx64p\") pod \"openshift-state-metrics-5669946b84-cnx9g\" (UID: \"788fd48a-507a-4f66-8d27-4ea0596db9e1\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-cnx9g"
Apr 16 14:55:40.093035 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.092939 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/788fd48a-507a-4f66-8d27-4ea0596db9e1-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-cnx9g\" (UID: \"788fd48a-507a-4f66-8d27-4ea0596db9e1\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-cnx9g"
Apr 16 14:55:40.093035 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.092963 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/788fd48a-507a-4f66-8d27-4ea0596db9e1-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-cnx9g\" (UID: \"788fd48a-507a-4f66-8d27-4ea0596db9e1\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-cnx9g"
Apr 16 14:55:40.093035 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.092983 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/78fd3b73-6d88-4569-acd8-b197add6650c-metrics-client-ca\") pod \"node-exporter-rt8hd\" (UID: \"78fd3b73-6d88-4569-acd8-b197add6650c\") " pod="openshift-monitoring/node-exporter-rt8hd"
Apr 16 14:55:40.093035 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.092999 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/78fd3b73-6d88-4569-acd8-b197add6650c-node-exporter-wtmp\") pod \"node-exporter-rt8hd\" (UID: \"78fd3b73-6d88-4569-acd8-b197add6650c\") " pod="openshift-monitoring/node-exporter-rt8hd"
Apr 16 14:55:40.093265 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.093040 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/788fd48a-507a-4f66-8d27-4ea0596db9e1-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-cnx9g\" (UID: \"788fd48a-507a-4f66-8d27-4ea0596db9e1\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-cnx9g"
Apr 16 14:55:40.093265 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.093065 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/78fd3b73-6d88-4569-acd8-b197add6650c-node-exporter-accelerators-collector-config\") pod \"node-exporter-rt8hd\" (UID: \"78fd3b73-6d88-4569-acd8-b197add6650c\") " pod="openshift-monitoring/node-exporter-rt8hd"
Apr 16 14:55:40.093265 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.093092 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqc59\" (UniqueName: \"kubernetes.io/projected/78fd3b73-6d88-4569-acd8-b197add6650c-kube-api-access-hqc59\") pod \"node-exporter-rt8hd\" (UID: \"78fd3b73-6d88-4569-acd8-b197add6650c\") " pod="openshift-monitoring/node-exporter-rt8hd"
Apr 16 14:55:40.093265 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.093133 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/78fd3b73-6d88-4569-acd8-b197add6650c-node-exporter-textfile\") pod \"node-exporter-rt8hd\" (UID: \"78fd3b73-6d88-4569-acd8-b197add6650c\") " pod="openshift-monitoring/node-exporter-rt8hd"
Apr 16 14:55:40.093265 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.093159 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/78fd3b73-6d88-4569-acd8-b197add6650c-sys\") pod \"node-exporter-rt8hd\" (UID: \"78fd3b73-6d88-4569-acd8-b197add6650c\") "
pod="openshift-monitoring/node-exporter-rt8hd" Apr 16 14:55:40.093265 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.093180 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/78fd3b73-6d88-4569-acd8-b197add6650c-root\") pod \"node-exporter-rt8hd\" (UID: \"78fd3b73-6d88-4569-acd8-b197add6650c\") " pod="openshift-monitoring/node-exporter-rt8hd" Apr 16 14:55:40.093265 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.093225 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/78fd3b73-6d88-4569-acd8-b197add6650c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rt8hd\" (UID: \"78fd3b73-6d88-4569-acd8-b197add6650c\") " pod="openshift-monitoring/node-exporter-rt8hd" Apr 16 14:55:40.093571 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.093271 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/78fd3b73-6d88-4569-acd8-b197add6650c-node-exporter-tls\") pod \"node-exporter-rt8hd\" (UID: \"78fd3b73-6d88-4569-acd8-b197add6650c\") " pod="openshift-monitoring/node-exporter-rt8hd" Apr 16 14:55:40.093746 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.093725 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/788fd48a-507a-4f66-8d27-4ea0596db9e1-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-cnx9g\" (UID: \"788fd48a-507a-4f66-8d27-4ea0596db9e1\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-cnx9g" Apr 16 14:55:40.094256 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.094237 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b" Apr 16 
14:55:40.095438 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.095419 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/788fd48a-507a-4f66-8d27-4ea0596db9e1-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-cnx9g\" (UID: \"788fd48a-507a-4f66-8d27-4ea0596db9e1\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-cnx9g" Apr 16 14:55:40.095508 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.095483 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/788fd48a-507a-4f66-8d27-4ea0596db9e1-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-cnx9g\" (UID: \"788fd48a-507a-4f66-8d27-4ea0596db9e1\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-cnx9g" Apr 16 14:55:40.100961 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.100941 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx64p\" (UniqueName: \"kubernetes.io/projected/788fd48a-507a-4f66-8d27-4ea0596db9e1-kube-api-access-tx64p\") pod \"openshift-state-metrics-5669946b84-cnx9g\" (UID: \"788fd48a-507a-4f66-8d27-4ea0596db9e1\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-cnx9g" Apr 16 14:55:40.193879 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.193819 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-cnx9g" Apr 16 14:55:40.193988 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.193964 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/78fd3b73-6d88-4569-acd8-b197add6650c-metrics-client-ca\") pod \"node-exporter-rt8hd\" (UID: \"78fd3b73-6d88-4569-acd8-b197add6650c\") " pod="openshift-monitoring/node-exporter-rt8hd" Apr 16 14:55:40.194031 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.193998 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/78fd3b73-6d88-4569-acd8-b197add6650c-node-exporter-wtmp\") pod \"node-exporter-rt8hd\" (UID: \"78fd3b73-6d88-4569-acd8-b197add6650c\") " pod="openshift-monitoring/node-exporter-rt8hd" Apr 16 14:55:40.194069 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.194034 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/78fd3b73-6d88-4569-acd8-b197add6650c-node-exporter-accelerators-collector-config\") pod \"node-exporter-rt8hd\" (UID: \"78fd3b73-6d88-4569-acd8-b197add6650c\") " pod="openshift-monitoring/node-exporter-rt8hd" Apr 16 14:55:40.194069 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.194054 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqc59\" (UniqueName: \"kubernetes.io/projected/78fd3b73-6d88-4569-acd8-b197add6650c-kube-api-access-hqc59\") pod \"node-exporter-rt8hd\" (UID: \"78fd3b73-6d88-4569-acd8-b197add6650c\") " pod="openshift-monitoring/node-exporter-rt8hd" Apr 16 14:55:40.194166 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.194092 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/78fd3b73-6d88-4569-acd8-b197add6650c-node-exporter-textfile\") pod \"node-exporter-rt8hd\" (UID: \"78fd3b73-6d88-4569-acd8-b197add6650c\") " pod="openshift-monitoring/node-exporter-rt8hd" Apr 16 14:55:40.194166 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.194131 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/78fd3b73-6d88-4569-acd8-b197add6650c-sys\") pod \"node-exporter-rt8hd\" (UID: \"78fd3b73-6d88-4569-acd8-b197add6650c\") " pod="openshift-monitoring/node-exporter-rt8hd" Apr 16 14:55:40.194166 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.194153 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/78fd3b73-6d88-4569-acd8-b197add6650c-root\") pod \"node-exporter-rt8hd\" (UID: \"78fd3b73-6d88-4569-acd8-b197add6650c\") " pod="openshift-monitoring/node-exporter-rt8hd" Apr 16 14:55:40.194307 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.194175 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/78fd3b73-6d88-4569-acd8-b197add6650c-node-exporter-wtmp\") pod \"node-exporter-rt8hd\" (UID: \"78fd3b73-6d88-4569-acd8-b197add6650c\") " pod="openshift-monitoring/node-exporter-rt8hd" Apr 16 14:55:40.194307 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.194182 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/78fd3b73-6d88-4569-acd8-b197add6650c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rt8hd\" (UID: \"78fd3b73-6d88-4569-acd8-b197add6650c\") " pod="openshift-monitoring/node-exporter-rt8hd" Apr 16 14:55:40.194307 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.194228 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" 
(UniqueName: \"kubernetes.io/secret/78fd3b73-6d88-4569-acd8-b197add6650c-node-exporter-tls\") pod \"node-exporter-rt8hd\" (UID: \"78fd3b73-6d88-4569-acd8-b197add6650c\") " pod="openshift-monitoring/node-exporter-rt8hd" Apr 16 14:55:40.194307 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.194238 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/78fd3b73-6d88-4569-acd8-b197add6650c-root\") pod \"node-exporter-rt8hd\" (UID: \"78fd3b73-6d88-4569-acd8-b197add6650c\") " pod="openshift-monitoring/node-exporter-rt8hd" Apr 16 14:55:40.194307 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.194247 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/78fd3b73-6d88-4569-acd8-b197add6650c-sys\") pod \"node-exporter-rt8hd\" (UID: \"78fd3b73-6d88-4569-acd8-b197add6650c\") " pod="openshift-monitoring/node-exporter-rt8hd" Apr 16 14:55:40.194578 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:55:40.194315 2568 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 14:55:40.194578 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:55:40.194356 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78fd3b73-6d88-4569-acd8-b197add6650c-node-exporter-tls podName:78fd3b73-6d88-4569-acd8-b197add6650c nodeName:}" failed. No retries permitted until 2026-04-16 14:55:40.694343007 +0000 UTC m=+179.458295096 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/78fd3b73-6d88-4569-acd8-b197add6650c-node-exporter-tls") pod "node-exporter-rt8hd" (UID: "78fd3b73-6d88-4569-acd8-b197add6650c") : secret "node-exporter-tls" not found Apr 16 14:55:40.194578 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.194380 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/78fd3b73-6d88-4569-acd8-b197add6650c-node-exporter-textfile\") pod \"node-exporter-rt8hd\" (UID: \"78fd3b73-6d88-4569-acd8-b197add6650c\") " pod="openshift-monitoring/node-exporter-rt8hd" Apr 16 14:55:40.194578 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.194505 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/78fd3b73-6d88-4569-acd8-b197add6650c-metrics-client-ca\") pod \"node-exporter-rt8hd\" (UID: \"78fd3b73-6d88-4569-acd8-b197add6650c\") " pod="openshift-monitoring/node-exporter-rt8hd" Apr 16 14:55:40.195622 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.195602 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/78fd3b73-6d88-4569-acd8-b197add6650c-node-exporter-accelerators-collector-config\") pod \"node-exporter-rt8hd\" (UID: \"78fd3b73-6d88-4569-acd8-b197add6650c\") " pod="openshift-monitoring/node-exporter-rt8hd" Apr 16 14:55:40.196241 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.196221 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/78fd3b73-6d88-4569-acd8-b197add6650c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rt8hd\" (UID: \"78fd3b73-6d88-4569-acd8-b197add6650c\") " pod="openshift-monitoring/node-exporter-rt8hd" Apr 16 14:55:40.202650 ip-10-0-140-83 
kubenswrapper[2568]: I0416 14:55:40.202628 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqc59\" (UniqueName: \"kubernetes.io/projected/78fd3b73-6d88-4569-acd8-b197add6650c-kube-api-access-hqc59\") pod \"node-exporter-rt8hd\" (UID: \"78fd3b73-6d88-4569-acd8-b197add6650c\") " pod="openshift-monitoring/node-exporter-rt8hd" Apr 16 14:55:40.315175 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.315133 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-cnx9g"] Apr 16 14:55:40.317663 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:55:40.317638 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod788fd48a_507a_4f66_8d27_4ea0596db9e1.slice/crio-b133e29cb9b14be4e727f6ba78d552d9f62d1b1f572a4f7a110ad18557875a80 WatchSource:0}: Error finding container b133e29cb9b14be4e727f6ba78d552d9f62d1b1f572a4f7a110ad18557875a80: Status 404 returned error can't find the container with id b133e29cb9b14be4e727f6ba78d552d9f62d1b1f572a4f7a110ad18557875a80 Apr 16 14:55:40.350344 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.350318 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-cnx9g" event={"ID":"788fd48a-507a-4f66-8d27-4ea0596db9e1","Type":"ContainerStarted","Data":"b133e29cb9b14be4e727f6ba78d552d9f62d1b1f572a4f7a110ad18557875a80"} Apr 16 14:55:40.698642 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.698574 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/78fd3b73-6d88-4569-acd8-b197add6650c-node-exporter-tls\") pod \"node-exporter-rt8hd\" (UID: \"78fd3b73-6d88-4569-acd8-b197add6650c\") " pod="openshift-monitoring/node-exporter-rt8hd" Apr 16 14:55:40.700571 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.700547 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/78fd3b73-6d88-4569-acd8-b197add6650c-node-exporter-tls\") pod \"node-exporter-rt8hd\" (UID: \"78fd3b73-6d88-4569-acd8-b197add6650c\") " pod="openshift-monitoring/node-exporter-rt8hd" Apr 16 14:55:40.860707 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.860688 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-rt8hd" Apr 16 14:55:40.868478 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:55:40.868457 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78fd3b73_6d88_4569_acd8_b197add6650c.slice/crio-f26861334a51be8525135ba30962cb8ebe05a64df9ddaca39a46ebdd6ad01d76 WatchSource:0}: Error finding container f26861334a51be8525135ba30962cb8ebe05a64df9ddaca39a46ebdd6ad01d76: Status 404 returned error can't find the container with id f26861334a51be8525135ba30962cb8ebe05a64df9ddaca39a46ebdd6ad01d76 Apr 16 14:55:40.965926 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.965847 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:55:40.969552 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.969535 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:40.971848 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.971829 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 14:55:40.971959 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.971874 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 14:55:40.971959 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.971876 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 14:55:40.972087 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.971983 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 14:55:40.972087 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.971876 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 14:55:40.972442 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.972423 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 14:55:40.972442 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.972434 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 14:55:40.972442 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.972439 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 14:55:40.972646 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.972439 2568 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 14:55:40.972646 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.972626 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-mppqf\"" Apr 16 14:55:40.992988 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:40.992965 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:55:41.102966 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.102935 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/271811e8-dfe5-4371-ae06-49ca4a0490fc-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:41.103095 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.102973 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52z92\" (UniqueName: \"kubernetes.io/projected/271811e8-dfe5-4371-ae06-49ca4a0490fc-kube-api-access-52z92\") pod \"alertmanager-main-0\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:41.103095 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.102994 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/271811e8-dfe5-4371-ae06-49ca4a0490fc-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:41.103095 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.103014 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"web-config\" (UniqueName: \"kubernetes.io/secret/271811e8-dfe5-4371-ae06-49ca4a0490fc-web-config\") pod \"alertmanager-main-0\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:41.103095 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.103063 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/271811e8-dfe5-4371-ae06-49ca4a0490fc-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:41.103262 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.103166 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/271811e8-dfe5-4371-ae06-49ca4a0490fc-config-volume\") pod \"alertmanager-main-0\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:41.103262 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.103200 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/271811e8-dfe5-4371-ae06-49ca4a0490fc-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:41.103262 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.103229 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/271811e8-dfe5-4371-ae06-49ca4a0490fc-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:41.103385 ip-10-0-140-83 kubenswrapper[2568]: 
I0416 14:55:41.103270 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/271811e8-dfe5-4371-ae06-49ca4a0490fc-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:41.103385 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.103319 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/271811e8-dfe5-4371-ae06-49ca4a0490fc-config-out\") pod \"alertmanager-main-0\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:41.103385 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.103343 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/271811e8-dfe5-4371-ae06-49ca4a0490fc-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:41.103518 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.103394 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/271811e8-dfe5-4371-ae06-49ca4a0490fc-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:41.103518 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.103471 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/271811e8-dfe5-4371-ae06-49ca4a0490fc-tls-assets\") pod \"alertmanager-main-0\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:41.204401 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.204373 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/271811e8-dfe5-4371-ae06-49ca4a0490fc-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:41.204556 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.204429 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/271811e8-dfe5-4371-ae06-49ca4a0490fc-config-out\") pod \"alertmanager-main-0\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:41.204621 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.204546 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/271811e8-dfe5-4371-ae06-49ca4a0490fc-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:41.204621 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.204595 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/271811e8-dfe5-4371-ae06-49ca4a0490fc-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:41.204723 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.204641 
2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/271811e8-dfe5-4371-ae06-49ca4a0490fc-tls-assets\") pod \"alertmanager-main-0\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:41.204723 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.204691 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/271811e8-dfe5-4371-ae06-49ca4a0490fc-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:41.206147 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.204816 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-52z92\" (UniqueName: \"kubernetes.io/projected/271811e8-dfe5-4371-ae06-49ca4a0490fc-kube-api-access-52z92\") pod \"alertmanager-main-0\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:41.206147 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.204934 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/271811e8-dfe5-4371-ae06-49ca4a0490fc-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:41.206147 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.204984 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/271811e8-dfe5-4371-ae06-49ca4a0490fc-web-config\") pod \"alertmanager-main-0\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:41.206147 
ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.205012 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/271811e8-dfe5-4371-ae06-49ca4a0490fc-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:41.206147 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.205048 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/271811e8-dfe5-4371-ae06-49ca4a0490fc-config-volume\") pod \"alertmanager-main-0\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:41.206147 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.205083 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/271811e8-dfe5-4371-ae06-49ca4a0490fc-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:41.206147 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.205116 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/271811e8-dfe5-4371-ae06-49ca4a0490fc-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:41.206147 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.205408 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/271811e8-dfe5-4371-ae06-49ca4a0490fc-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:41.206147 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.205524 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/271811e8-dfe5-4371-ae06-49ca4a0490fc-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:41.206147 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.205815 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/271811e8-dfe5-4371-ae06-49ca4a0490fc-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:41.208254 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.207787 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/271811e8-dfe5-4371-ae06-49ca4a0490fc-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:41.208254 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.207852 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/271811e8-dfe5-4371-ae06-49ca4a0490fc-config-out\") pod \"alertmanager-main-0\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:41.208254 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.208216 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/271811e8-dfe5-4371-ae06-49ca4a0490fc-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:41.208531 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.208511 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/271811e8-dfe5-4371-ae06-49ca4a0490fc-tls-assets\") pod \"alertmanager-main-0\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:41.208863 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.208730 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/271811e8-dfe5-4371-ae06-49ca4a0490fc-config-volume\") pod \"alertmanager-main-0\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:41.209136 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.209068 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/271811e8-dfe5-4371-ae06-49ca4a0490fc-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:41.209284 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.209252 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/271811e8-dfe5-4371-ae06-49ca4a0490fc-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:41.209582 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.209561 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"web-config\" (UniqueName: \"kubernetes.io/secret/271811e8-dfe5-4371-ae06-49ca4a0490fc-web-config\") pod \"alertmanager-main-0\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:41.210092 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.210077 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/271811e8-dfe5-4371-ae06-49ca4a0490fc-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:41.214757 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.214738 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-52z92\" (UniqueName: \"kubernetes.io/projected/271811e8-dfe5-4371-ae06-49ca4a0490fc-kube-api-access-52z92\") pod \"alertmanager-main-0\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:41.280780 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.280724 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:41.355353 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.355326 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-cnx9g" event={"ID":"788fd48a-507a-4f66-8d27-4ea0596db9e1","Type":"ContainerStarted","Data":"af097519bb5c6a0efdf373367a675e837624e367474e494fe09e8648d9969264"} Apr 16 14:55:41.355821 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.355366 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-cnx9g" event={"ID":"788fd48a-507a-4f66-8d27-4ea0596db9e1","Type":"ContainerStarted","Data":"66c21bd836935bc5f8323a3bf335c0354f108bc3a5f1867ce4013489e17cfda3"} Apr 16 14:55:41.357255 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.357220 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rt8hd" event={"ID":"78fd3b73-6d88-4569-acd8-b197add6650c","Type":"ContainerStarted","Data":"f26861334a51be8525135ba30962cb8ebe05a64df9ddaca39a46ebdd6ad01d76"} Apr 16 14:55:41.442435 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:41.442413 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:55:41.444294 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:55:41.444270 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod271811e8_dfe5_4371_ae06_49ca4a0490fc.slice/crio-72a02f18c82220b3823ec638d0911580d3c86e5c63db06d05169078b3c583927 WatchSource:0}: Error finding container 72a02f18c82220b3823ec638d0911580d3c86e5c63db06d05169078b3c583927: Status 404 returned error can't find the container with id 72a02f18c82220b3823ec638d0911580d3c86e5c63db06d05169078b3c583927 Apr 16 14:55:42.360782 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:42.360751 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"271811e8-dfe5-4371-ae06-49ca4a0490fc","Type":"ContainerStarted","Data":"72a02f18c82220b3823ec638d0911580d3c86e5c63db06d05169078b3c583927"} Apr 16 14:55:42.362297 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:42.362274 2568 generic.go:358] "Generic (PLEG): container finished" podID="78fd3b73-6d88-4569-acd8-b197add6650c" containerID="d3c6558dfc7a344b0d74570288872253ac22c709f0c407edcd39828bdddd8027" exitCode=0 Apr 16 14:55:42.362451 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:42.362354 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rt8hd" event={"ID":"78fd3b73-6d88-4569-acd8-b197add6650c","Type":"ContainerDied","Data":"d3c6558dfc7a344b0d74570288872253ac22c709f0c407edcd39828bdddd8027"} Apr 16 14:55:42.364006 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:42.363986 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-cnx9g" event={"ID":"788fd48a-507a-4f66-8d27-4ea0596db9e1","Type":"ContainerStarted","Data":"d3838b7aa63359dd7755c11f969b54d1da5e7dc8d9476b1ee90abb05c5cb5d56"} Apr 16 14:55:42.394952 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:42.394911 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-5669946b84-cnx9g" podStartSLOduration=2.4866482850000002 podStartE2EDuration="3.3948824s" podCreationTimestamp="2026-04-16 14:55:39 +0000 UTC" firstStartedPulling="2026-04-16 14:55:40.452709746 +0000 UTC m=+179.216661836" lastFinishedPulling="2026-04-16 14:55:41.360943848 +0000 UTC m=+180.124895951" observedRunningTime="2026-04-16 14:55:42.394821567 +0000 UTC m=+181.158773681" watchObservedRunningTime="2026-04-16 14:55:42.3948824 +0000 UTC m=+181.158834512" Apr 16 14:55:42.864356 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:42.864326 2568 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/thanos-querier-7f99fc98d-tlm2q"] Apr 16 14:55:42.867880 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:42.867863 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-7f99fc98d-tlm2q" Apr 16 14:55:42.870252 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:42.870212 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 16 14:55:42.870252 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:42.870211 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 16 14:55:42.870442 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:42.870310 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-ddfc7\"" Apr 16 14:55:42.870442 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:42.870331 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-v5sa0h708eln\"" Apr 16 14:55:42.870551 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:42.870465 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 16 14:55:42.870600 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:42.870579 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 16 14:55:42.870665 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:42.870651 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 16 14:55:42.878061 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:42.878038 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/thanos-querier-7f99fc98d-tlm2q"] Apr 16 14:55:43.020112 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:43.020089 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/6dd7fb5d-2333-471c-b3d2-1e65d5c362aa-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7f99fc98d-tlm2q\" (UID: \"6dd7fb5d-2333-471c-b3d2-1e65d5c362aa\") " pod="openshift-monitoring/thanos-querier-7f99fc98d-tlm2q" Apr 16 14:55:43.020196 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:43.020119 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6dd7fb5d-2333-471c-b3d2-1e65d5c362aa-metrics-client-ca\") pod \"thanos-querier-7f99fc98d-tlm2q\" (UID: \"6dd7fb5d-2333-471c-b3d2-1e65d5c362aa\") " pod="openshift-monitoring/thanos-querier-7f99fc98d-tlm2q" Apr 16 14:55:43.020196 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:43.020154 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6dd7fb5d-2333-471c-b3d2-1e65d5c362aa-secret-grpc-tls\") pod \"thanos-querier-7f99fc98d-tlm2q\" (UID: \"6dd7fb5d-2333-471c-b3d2-1e65d5c362aa\") " pod="openshift-monitoring/thanos-querier-7f99fc98d-tlm2q" Apr 16 14:55:43.020196 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:43.020182 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6dd7fb5d-2333-471c-b3d2-1e65d5c362aa-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7f99fc98d-tlm2q\" (UID: \"6dd7fb5d-2333-471c-b3d2-1e65d5c362aa\") " pod="openshift-monitoring/thanos-querier-7f99fc98d-tlm2q" Apr 16 14:55:43.020307 ip-10-0-140-83 kubenswrapper[2568]: 
I0416 14:55:43.020223 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6dd7fb5d-2333-471c-b3d2-1e65d5c362aa-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7f99fc98d-tlm2q\" (UID: \"6dd7fb5d-2333-471c-b3d2-1e65d5c362aa\") " pod="openshift-monitoring/thanos-querier-7f99fc98d-tlm2q" Apr 16 14:55:43.020307 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:43.020268 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/6dd7fb5d-2333-471c-b3d2-1e65d5c362aa-secret-thanos-querier-tls\") pod \"thanos-querier-7f99fc98d-tlm2q\" (UID: \"6dd7fb5d-2333-471c-b3d2-1e65d5c362aa\") " pod="openshift-monitoring/thanos-querier-7f99fc98d-tlm2q" Apr 16 14:55:43.020307 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:43.020287 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/6dd7fb5d-2333-471c-b3d2-1e65d5c362aa-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7f99fc98d-tlm2q\" (UID: \"6dd7fb5d-2333-471c-b3d2-1e65d5c362aa\") " pod="openshift-monitoring/thanos-querier-7f99fc98d-tlm2q" Apr 16 14:55:43.020401 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:43.020308 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt6pw\" (UniqueName: \"kubernetes.io/projected/6dd7fb5d-2333-471c-b3d2-1e65d5c362aa-kube-api-access-xt6pw\") pod \"thanos-querier-7f99fc98d-tlm2q\" (UID: \"6dd7fb5d-2333-471c-b3d2-1e65d5c362aa\") " pod="openshift-monitoring/thanos-querier-7f99fc98d-tlm2q" Apr 16 14:55:43.121399 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:43.121355 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6dd7fb5d-2333-471c-b3d2-1e65d5c362aa-secret-grpc-tls\") pod \"thanos-querier-7f99fc98d-tlm2q\" (UID: \"6dd7fb5d-2333-471c-b3d2-1e65d5c362aa\") " pod="openshift-monitoring/thanos-querier-7f99fc98d-tlm2q" Apr 16 14:55:43.121399 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:43.121383 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6dd7fb5d-2333-471c-b3d2-1e65d5c362aa-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7f99fc98d-tlm2q\" (UID: \"6dd7fb5d-2333-471c-b3d2-1e65d5c362aa\") " pod="openshift-monitoring/thanos-querier-7f99fc98d-tlm2q" Apr 16 14:55:43.121531 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:43.121411 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6dd7fb5d-2333-471c-b3d2-1e65d5c362aa-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7f99fc98d-tlm2q\" (UID: \"6dd7fb5d-2333-471c-b3d2-1e65d5c362aa\") " pod="openshift-monitoring/thanos-querier-7f99fc98d-tlm2q" Apr 16 14:55:43.121531 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:43.121450 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/6dd7fb5d-2333-471c-b3d2-1e65d5c362aa-secret-thanos-querier-tls\") pod \"thanos-querier-7f99fc98d-tlm2q\" (UID: \"6dd7fb5d-2333-471c-b3d2-1e65d5c362aa\") " pod="openshift-monitoring/thanos-querier-7f99fc98d-tlm2q" Apr 16 14:55:43.121531 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:43.121476 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/6dd7fb5d-2333-471c-b3d2-1e65d5c362aa-secret-thanos-querier-kube-rbac-proxy-rules\") pod 
\"thanos-querier-7f99fc98d-tlm2q\" (UID: \"6dd7fb5d-2333-471c-b3d2-1e65d5c362aa\") " pod="openshift-monitoring/thanos-querier-7f99fc98d-tlm2q" Apr 16 14:55:43.121531 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:43.121502 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xt6pw\" (UniqueName: \"kubernetes.io/projected/6dd7fb5d-2333-471c-b3d2-1e65d5c362aa-kube-api-access-xt6pw\") pod \"thanos-querier-7f99fc98d-tlm2q\" (UID: \"6dd7fb5d-2333-471c-b3d2-1e65d5c362aa\") " pod="openshift-monitoring/thanos-querier-7f99fc98d-tlm2q" Apr 16 14:55:43.122034 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:43.121740 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/6dd7fb5d-2333-471c-b3d2-1e65d5c362aa-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7f99fc98d-tlm2q\" (UID: \"6dd7fb5d-2333-471c-b3d2-1e65d5c362aa\") " pod="openshift-monitoring/thanos-querier-7f99fc98d-tlm2q" Apr 16 14:55:43.122034 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:43.121782 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6dd7fb5d-2333-471c-b3d2-1e65d5c362aa-metrics-client-ca\") pod \"thanos-querier-7f99fc98d-tlm2q\" (UID: \"6dd7fb5d-2333-471c-b3d2-1e65d5c362aa\") " pod="openshift-monitoring/thanos-querier-7f99fc98d-tlm2q" Apr 16 14:55:43.122609 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:43.122578 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6dd7fb5d-2333-471c-b3d2-1e65d5c362aa-metrics-client-ca\") pod \"thanos-querier-7f99fc98d-tlm2q\" (UID: \"6dd7fb5d-2333-471c-b3d2-1e65d5c362aa\") " pod="openshift-monitoring/thanos-querier-7f99fc98d-tlm2q" Apr 16 14:55:43.124086 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:43.124062 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6dd7fb5d-2333-471c-b3d2-1e65d5c362aa-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7f99fc98d-tlm2q\" (UID: \"6dd7fb5d-2333-471c-b3d2-1e65d5c362aa\") " pod="openshift-monitoring/thanos-querier-7f99fc98d-tlm2q" Apr 16 14:55:43.124359 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:43.124333 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6dd7fb5d-2333-471c-b3d2-1e65d5c362aa-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7f99fc98d-tlm2q\" (UID: \"6dd7fb5d-2333-471c-b3d2-1e65d5c362aa\") " pod="openshift-monitoring/thanos-querier-7f99fc98d-tlm2q" Apr 16 14:55:43.124569 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:43.124549 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/6dd7fb5d-2333-471c-b3d2-1e65d5c362aa-secret-thanos-querier-tls\") pod \"thanos-querier-7f99fc98d-tlm2q\" (UID: \"6dd7fb5d-2333-471c-b3d2-1e65d5c362aa\") " pod="openshift-monitoring/thanos-querier-7f99fc98d-tlm2q" Apr 16 14:55:43.124652 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:43.124629 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/6dd7fb5d-2333-471c-b3d2-1e65d5c362aa-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7f99fc98d-tlm2q\" (UID: \"6dd7fb5d-2333-471c-b3d2-1e65d5c362aa\") " pod="openshift-monitoring/thanos-querier-7f99fc98d-tlm2q" Apr 16 14:55:43.124738 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:43.124716 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: 
\"kubernetes.io/secret/6dd7fb5d-2333-471c-b3d2-1e65d5c362aa-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7f99fc98d-tlm2q\" (UID: \"6dd7fb5d-2333-471c-b3d2-1e65d5c362aa\") " pod="openshift-monitoring/thanos-querier-7f99fc98d-tlm2q" Apr 16 14:55:43.124837 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:43.124820 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6dd7fb5d-2333-471c-b3d2-1e65d5c362aa-secret-grpc-tls\") pod \"thanos-querier-7f99fc98d-tlm2q\" (UID: \"6dd7fb5d-2333-471c-b3d2-1e65d5c362aa\") " pod="openshift-monitoring/thanos-querier-7f99fc98d-tlm2q" Apr 16 14:55:43.129818 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:43.129790 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt6pw\" (UniqueName: \"kubernetes.io/projected/6dd7fb5d-2333-471c-b3d2-1e65d5c362aa-kube-api-access-xt6pw\") pod \"thanos-querier-7f99fc98d-tlm2q\" (UID: \"6dd7fb5d-2333-471c-b3d2-1e65d5c362aa\") " pod="openshift-monitoring/thanos-querier-7f99fc98d-tlm2q" Apr 16 14:55:43.177026 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:43.177000 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-7f99fc98d-tlm2q" Apr 16 14:55:43.291058 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:43.291029 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7f99fc98d-tlm2q"] Apr 16 14:55:43.294074 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:55:43.294041 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6dd7fb5d_2333_471c_b3d2_1e65d5c362aa.slice/crio-49a48e4e22bc672c668e6304b11c3e42b0867f76d6cfd0b3b57564611975bd51 WatchSource:0}: Error finding container 49a48e4e22bc672c668e6304b11c3e42b0867f76d6cfd0b3b57564611975bd51: Status 404 returned error can't find the container with id 49a48e4e22bc672c668e6304b11c3e42b0867f76d6cfd0b3b57564611975bd51 Apr 16 14:55:43.367639 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:43.367611 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f99fc98d-tlm2q" event={"ID":"6dd7fb5d-2333-471c-b3d2-1e65d5c362aa","Type":"ContainerStarted","Data":"49a48e4e22bc672c668e6304b11c3e42b0867f76d6cfd0b3b57564611975bd51"} Apr 16 14:55:43.369097 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:43.369077 2568 generic.go:358] "Generic (PLEG): container finished" podID="271811e8-dfe5-4371-ae06-49ca4a0490fc" containerID="1ccc56b3590e6dfc5f3a94f6884cead16cc9a50817d9827f7af4a8de8d847480" exitCode=0 Apr 16 14:55:43.369185 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:43.369140 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"271811e8-dfe5-4371-ae06-49ca4a0490fc","Type":"ContainerDied","Data":"1ccc56b3590e6dfc5f3a94f6884cead16cc9a50817d9827f7af4a8de8d847480"} Apr 16 14:55:43.371472 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:43.371419 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rt8hd" 
event={"ID":"78fd3b73-6d88-4569-acd8-b197add6650c","Type":"ContainerStarted","Data":"b5421b838bb98fab34c30053f80be5b8c3c27f0d2f3712990e48bc656f2ac9d0"} Apr 16 14:55:43.371472 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:43.371454 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rt8hd" event={"ID":"78fd3b73-6d88-4569-acd8-b197add6650c","Type":"ContainerStarted","Data":"a62634979e230fdeabc4855de91b09993be4efbe040d6e18c6bcd40629a5854e"} Apr 16 14:55:43.413483 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:43.413438 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-rt8hd" podStartSLOduration=3.550048615 podStartE2EDuration="4.41342647s" podCreationTimestamp="2026-04-16 14:55:39 +0000 UTC" firstStartedPulling="2026-04-16 14:55:40.8700598 +0000 UTC m=+179.634011889" lastFinishedPulling="2026-04-16 14:55:41.733437646 +0000 UTC m=+180.497389744" observedRunningTime="2026-04-16 14:55:43.412142591 +0000 UTC m=+182.176094701" watchObservedRunningTime="2026-04-16 14:55:43.41342647 +0000 UTC m=+182.177378629" Apr 16 14:55:44.300591 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:44.300551 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-56dd544798-bkd2p"] Apr 16 14:55:44.303950 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:44.303929 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-56dd544798-bkd2p" Apr 16 14:55:44.306393 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:44.306376 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 16 14:55:44.306514 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:44.306396 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 14:55:44.306514 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:44.306376 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 16 14:55:44.307347 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:44.307325 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-hx8fz\"" Apr 16 14:55:44.307442 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:44.307354 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-f8vmasbklsv3i\"" Apr 16 14:55:44.307442 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:44.307365 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 16 14:55:44.314364 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:44.314345 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-56dd544798-bkd2p"] Apr 16 14:55:44.433656 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:44.433626 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/aa3d4a56-7da8-4220-bf7c-17327a756284-metrics-server-audit-profiles\") pod \"metrics-server-56dd544798-bkd2p\" (UID: \"aa3d4a56-7da8-4220-bf7c-17327a756284\") " 
pod="openshift-monitoring/metrics-server-56dd544798-bkd2p" Apr 16 14:55:44.433656 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:44.433658 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/aa3d4a56-7da8-4220-bf7c-17327a756284-audit-log\") pod \"metrics-server-56dd544798-bkd2p\" (UID: \"aa3d4a56-7da8-4220-bf7c-17327a756284\") " pod="openshift-monitoring/metrics-server-56dd544798-bkd2p" Apr 16 14:55:44.434144 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:44.433739 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/aa3d4a56-7da8-4220-bf7c-17327a756284-secret-metrics-server-client-certs\") pod \"metrics-server-56dd544798-bkd2p\" (UID: \"aa3d4a56-7da8-4220-bf7c-17327a756284\") " pod="openshift-monitoring/metrics-server-56dd544798-bkd2p" Apr 16 14:55:44.434144 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:44.433802 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/aa3d4a56-7da8-4220-bf7c-17327a756284-secret-metrics-server-tls\") pod \"metrics-server-56dd544798-bkd2p\" (UID: \"aa3d4a56-7da8-4220-bf7c-17327a756284\") " pod="openshift-monitoring/metrics-server-56dd544798-bkd2p" Apr 16 14:55:44.434144 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:44.433926 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5t9d\" (UniqueName: \"kubernetes.io/projected/aa3d4a56-7da8-4220-bf7c-17327a756284-kube-api-access-b5t9d\") pod \"metrics-server-56dd544798-bkd2p\" (UID: \"aa3d4a56-7da8-4220-bf7c-17327a756284\") " pod="openshift-monitoring/metrics-server-56dd544798-bkd2p" Apr 16 14:55:44.434144 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:44.433966 2568 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa3d4a56-7da8-4220-bf7c-17327a756284-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-56dd544798-bkd2p\" (UID: \"aa3d4a56-7da8-4220-bf7c-17327a756284\") " pod="openshift-monitoring/metrics-server-56dd544798-bkd2p" Apr 16 14:55:44.434144 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:44.434137 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa3d4a56-7da8-4220-bf7c-17327a756284-client-ca-bundle\") pod \"metrics-server-56dd544798-bkd2p\" (UID: \"aa3d4a56-7da8-4220-bf7c-17327a756284\") " pod="openshift-monitoring/metrics-server-56dd544798-bkd2p" Apr 16 14:55:44.535134 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:44.535100 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/aa3d4a56-7da8-4220-bf7c-17327a756284-secret-metrics-server-client-certs\") pod \"metrics-server-56dd544798-bkd2p\" (UID: \"aa3d4a56-7da8-4220-bf7c-17327a756284\") " pod="openshift-monitoring/metrics-server-56dd544798-bkd2p" Apr 16 14:55:44.535296 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:44.535148 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/aa3d4a56-7da8-4220-bf7c-17327a756284-secret-metrics-server-tls\") pod \"metrics-server-56dd544798-bkd2p\" (UID: \"aa3d4a56-7da8-4220-bf7c-17327a756284\") " pod="openshift-monitoring/metrics-server-56dd544798-bkd2p" Apr 16 14:55:44.535296 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:44.535203 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b5t9d\" (UniqueName: 
\"kubernetes.io/projected/aa3d4a56-7da8-4220-bf7c-17327a756284-kube-api-access-b5t9d\") pod \"metrics-server-56dd544798-bkd2p\" (UID: \"aa3d4a56-7da8-4220-bf7c-17327a756284\") " pod="openshift-monitoring/metrics-server-56dd544798-bkd2p"
Apr 16 14:55:44.535296 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:44.535231 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa3d4a56-7da8-4220-bf7c-17327a756284-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-56dd544798-bkd2p\" (UID: \"aa3d4a56-7da8-4220-bf7c-17327a756284\") " pod="openshift-monitoring/metrics-server-56dd544798-bkd2p"
Apr 16 14:55:44.535296 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:44.535273 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa3d4a56-7da8-4220-bf7c-17327a756284-client-ca-bundle\") pod \"metrics-server-56dd544798-bkd2p\" (UID: \"aa3d4a56-7da8-4220-bf7c-17327a756284\") " pod="openshift-monitoring/metrics-server-56dd544798-bkd2p"
Apr 16 14:55:44.535488 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:44.535304 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/aa3d4a56-7da8-4220-bf7c-17327a756284-metrics-server-audit-profiles\") pod \"metrics-server-56dd544798-bkd2p\" (UID: \"aa3d4a56-7da8-4220-bf7c-17327a756284\") " pod="openshift-monitoring/metrics-server-56dd544798-bkd2p"
Apr 16 14:55:44.535488 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:44.535333 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/aa3d4a56-7da8-4220-bf7c-17327a756284-audit-log\") pod \"metrics-server-56dd544798-bkd2p\" (UID: \"aa3d4a56-7da8-4220-bf7c-17327a756284\") " pod="openshift-monitoring/metrics-server-56dd544798-bkd2p"
Apr 16 14:55:44.536019 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:44.535970 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/aa3d4a56-7da8-4220-bf7c-17327a756284-audit-log\") pod \"metrics-server-56dd544798-bkd2p\" (UID: \"aa3d4a56-7da8-4220-bf7c-17327a756284\") " pod="openshift-monitoring/metrics-server-56dd544798-bkd2p"
Apr 16 14:55:44.536149 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:44.535998 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa3d4a56-7da8-4220-bf7c-17327a756284-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-56dd544798-bkd2p\" (UID: \"aa3d4a56-7da8-4220-bf7c-17327a756284\") " pod="openshift-monitoring/metrics-server-56dd544798-bkd2p"
Apr 16 14:55:44.536457 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:44.536434 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/aa3d4a56-7da8-4220-bf7c-17327a756284-metrics-server-audit-profiles\") pod \"metrics-server-56dd544798-bkd2p\" (UID: \"aa3d4a56-7da8-4220-bf7c-17327a756284\") " pod="openshift-monitoring/metrics-server-56dd544798-bkd2p"
Apr 16 14:55:44.538098 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:44.538075 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa3d4a56-7da8-4220-bf7c-17327a756284-client-ca-bundle\") pod \"metrics-server-56dd544798-bkd2p\" (UID: \"aa3d4a56-7da8-4220-bf7c-17327a756284\") " pod="openshift-monitoring/metrics-server-56dd544798-bkd2p"
Apr 16 14:55:44.538207 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:44.538187 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/aa3d4a56-7da8-4220-bf7c-17327a756284-secret-metrics-server-tls\") pod \"metrics-server-56dd544798-bkd2p\" (UID: \"aa3d4a56-7da8-4220-bf7c-17327a756284\") " pod="openshift-monitoring/metrics-server-56dd544798-bkd2p"
Apr 16 14:55:44.538273 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:44.538241 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/aa3d4a56-7da8-4220-bf7c-17327a756284-secret-metrics-server-client-certs\") pod \"metrics-server-56dd544798-bkd2p\" (UID: \"aa3d4a56-7da8-4220-bf7c-17327a756284\") " pod="openshift-monitoring/metrics-server-56dd544798-bkd2p"
Apr 16 14:55:44.543348 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:44.543330 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5t9d\" (UniqueName: \"kubernetes.io/projected/aa3d4a56-7da8-4220-bf7c-17327a756284-kube-api-access-b5t9d\") pod \"metrics-server-56dd544798-bkd2p\" (UID: \"aa3d4a56-7da8-4220-bf7c-17327a756284\") " pod="openshift-monitoring/metrics-server-56dd544798-bkd2p"
Apr 16 14:55:44.615235 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:44.615203 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-56dd544798-bkd2p"
Apr 16 14:55:44.664773 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:44.661791 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-vgksx"]
Apr 16 14:55:44.666099 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:44.666076 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-vgksx"
Apr 16 14:55:44.668695 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:44.668670 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 16 14:55:44.668798 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:44.668759 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-p9tkk\""
Apr 16 14:55:44.672948 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:44.672909 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-vgksx"]
Apr 16 14:55:44.738041 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:44.738015 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/da5dbec6-5b51-4f5a-85cf-f08dd3b4ff45-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-vgksx\" (UID: \"da5dbec6-5b51-4f5a-85cf-f08dd3b4ff45\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-vgksx"
Apr 16 14:55:44.838745 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:44.838722 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/da5dbec6-5b51-4f5a-85cf-f08dd3b4ff45-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-vgksx\" (UID: \"da5dbec6-5b51-4f5a-85cf-f08dd3b4ff45\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-vgksx"
Apr 16 14:55:44.838929 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:55:44.838910 2568 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found
Apr 16 14:55:44.839028 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:55:44.839010 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da5dbec6-5b51-4f5a-85cf-f08dd3b4ff45-monitoring-plugin-cert podName:da5dbec6-5b51-4f5a-85cf-f08dd3b4ff45 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:45.338993728 +0000 UTC m=+184.102945818 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/da5dbec6-5b51-4f5a-85cf-f08dd3b4ff45-monitoring-plugin-cert") pod "monitoring-plugin-5876b4bbc7-vgksx" (UID: "da5dbec6-5b51-4f5a-85cf-f08dd3b4ff45") : secret "monitoring-plugin-cert" not found
Apr 16 14:55:45.344013 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:45.343800 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/da5dbec6-5b51-4f5a-85cf-f08dd3b4ff45-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-vgksx\" (UID: \"da5dbec6-5b51-4f5a-85cf-f08dd3b4ff45\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-vgksx"
Apr 16 14:55:45.346420 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:45.346396 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/da5dbec6-5b51-4f5a-85cf-f08dd3b4ff45-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-vgksx\" (UID: \"da5dbec6-5b51-4f5a-85cf-f08dd3b4ff45\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-vgksx"
Apr 16 14:55:45.368194 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:45.368169 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-56dd544798-bkd2p"]
Apr 16 14:55:45.371386 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:55:45.371343 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa3d4a56_7da8_4220_bf7c_17327a756284.slice/crio-c720f39a44b78b26fd43fdac89e1e2d317df705b820f05785e234caa49c623ee WatchSource:0}: Error finding container c720f39a44b78b26fd43fdac89e1e2d317df705b820f05785e234caa49c623ee: Status 404 returned error can't find the container with id c720f39a44b78b26fd43fdac89e1e2d317df705b820f05785e234caa49c623ee
Apr 16 14:55:45.381339 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:45.381300 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"271811e8-dfe5-4371-ae06-49ca4a0490fc","Type":"ContainerStarted","Data":"8c6910f32197bfc5024806bbd416eed1c2fdee7741c42c751c453cc9f58015ac"}
Apr 16 14:55:45.382799 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:45.382770 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f99fc98d-tlm2q" event={"ID":"6dd7fb5d-2333-471c-b3d2-1e65d5c362aa","Type":"ContainerStarted","Data":"08286840e92dd3d745b833042939959926e3770ae0f8b0728d92b81f85fd1842"}
Apr 16 14:55:45.384383 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:45.384359 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-56dd544798-bkd2p" event={"ID":"aa3d4a56-7da8-4220-bf7c-17327a756284","Type":"ContainerStarted","Data":"c720f39a44b78b26fd43fdac89e1e2d317df705b820f05785e234caa49c623ee"}
Apr 16 14:55:45.577458 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:45.577331 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-vgksx"
Apr 16 14:55:45.706674 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:45.706488 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-vgksx"]
Apr 16 14:55:45.708842 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:55:45.708818 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda5dbec6_5b51_4f5a_85cf_f08dd3b4ff45.slice/crio-d8f681689a3384d6111a9e6b8051e28db9510010b55c2afd74d6c55f6e03ed4b WatchSource:0}: Error finding container d8f681689a3384d6111a9e6b8051e28db9510010b55c2afd74d6c55f6e03ed4b: Status 404 returned error can't find the container with id d8f681689a3384d6111a9e6b8051e28db9510010b55c2afd74d6c55f6e03ed4b
Apr 16 14:55:46.271819 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:46.271780 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-68b858fb9c-w6zpx"
Apr 16 14:55:46.271819 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:46.271826 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-68b858fb9c-w6zpx"
Apr 16 14:55:46.276863 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:46.276840 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-68b858fb9c-w6zpx"
Apr 16 14:55:46.366734 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:46.366710 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7bbb454cc-k9vzf"]
Apr 16 14:55:46.369967 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:46.369944 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7bbb454cc-k9vzf"
Apr 16 14:55:46.378709 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:46.377962 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 16 14:55:46.380552 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:46.380526 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7bbb454cc-k9vzf"]
Apr 16 14:55:46.396701 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:46.396646 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"271811e8-dfe5-4371-ae06-49ca4a0490fc","Type":"ContainerStarted","Data":"9d27ae32ff2581587a376dc48db2be54746ecda8c584522fdbecb9acd34224b3"}
Apr 16 14:55:46.396825 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:46.396754 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"271811e8-dfe5-4371-ae06-49ca4a0490fc","Type":"ContainerStarted","Data":"cef4cc52e14de23c66a82adcc2179bdf56b3e50946067c7b8bda8d525c813eb2"}
Apr 16 14:55:46.396825 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:46.396811 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"271811e8-dfe5-4371-ae06-49ca4a0490fc","Type":"ContainerStarted","Data":"5742604d71c9783b23fbff65401867f71ca8d7eb22ebb94e150aa1fcd876c3e7"}
Apr 16 14:55:46.396825 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:46.396823 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"271811e8-dfe5-4371-ae06-49ca4a0490fc","Type":"ContainerStarted","Data":"aa9d02ccd057c2d464e60390ca535a53541d73f19f569257c816e2b6af0b0b61"}
Apr 16 14:55:46.397844 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:46.397819 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-vgksx" event={"ID":"da5dbec6-5b51-4f5a-85cf-f08dd3b4ff45","Type":"ContainerStarted","Data":"d8f681689a3384d6111a9e6b8051e28db9510010b55c2afd74d6c55f6e03ed4b"}
Apr 16 14:55:46.399833 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:46.399805 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f99fc98d-tlm2q" event={"ID":"6dd7fb5d-2333-471c-b3d2-1e65d5c362aa","Type":"ContainerStarted","Data":"dab65236221dba67de48a457f7166e6c9b085c51a5d6611574b8a4d37cbbeb63"}
Apr 16 14:55:46.399952 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:46.399836 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f99fc98d-tlm2q" event={"ID":"6dd7fb5d-2333-471c-b3d2-1e65d5c362aa","Type":"ContainerStarted","Data":"9ef8d4c72dff2209bdefdc037a2870c93c0a772b6f177dc694abfb25b6879198"}
Apr 16 14:55:46.404348 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:46.404325 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-68b858fb9c-w6zpx"
Apr 16 14:55:46.454194 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:46.453914 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f91cc132-1c05-4c85-a902-2b8622573dd1-console-config\") pod \"console-7bbb454cc-k9vzf\" (UID: \"f91cc132-1c05-4c85-a902-2b8622573dd1\") " pod="openshift-console/console-7bbb454cc-k9vzf"
Apr 16 14:55:46.454194 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:46.453959 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjqmv\" (UniqueName: \"kubernetes.io/projected/f91cc132-1c05-4c85-a902-2b8622573dd1-kube-api-access-tjqmv\") pod \"console-7bbb454cc-k9vzf\" (UID: \"f91cc132-1c05-4c85-a902-2b8622573dd1\") " pod="openshift-console/console-7bbb454cc-k9vzf"
Apr 16 14:55:46.454194 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:46.454037 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f91cc132-1c05-4c85-a902-2b8622573dd1-console-oauth-config\") pod \"console-7bbb454cc-k9vzf\" (UID: \"f91cc132-1c05-4c85-a902-2b8622573dd1\") " pod="openshift-console/console-7bbb454cc-k9vzf"
Apr 16 14:55:46.454194 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:46.454080 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f91cc132-1c05-4c85-a902-2b8622573dd1-service-ca\") pod \"console-7bbb454cc-k9vzf\" (UID: \"f91cc132-1c05-4c85-a902-2b8622573dd1\") " pod="openshift-console/console-7bbb454cc-k9vzf"
Apr 16 14:55:46.454194 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:46.454135 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f91cc132-1c05-4c85-a902-2b8622573dd1-console-serving-cert\") pod \"console-7bbb454cc-k9vzf\" (UID: \"f91cc132-1c05-4c85-a902-2b8622573dd1\") " pod="openshift-console/console-7bbb454cc-k9vzf"
Apr 16 14:55:46.454194 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:46.454158 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f91cc132-1c05-4c85-a902-2b8622573dd1-oauth-serving-cert\") pod \"console-7bbb454cc-k9vzf\" (UID: \"f91cc132-1c05-4c85-a902-2b8622573dd1\") " pod="openshift-console/console-7bbb454cc-k9vzf"
Apr 16 14:55:46.454549 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:46.454252 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f91cc132-1c05-4c85-a902-2b8622573dd1-trusted-ca-bundle\") pod \"console-7bbb454cc-k9vzf\" (UID: \"f91cc132-1c05-4c85-a902-2b8622573dd1\") " pod="openshift-console/console-7bbb454cc-k9vzf"
Apr 16 14:55:46.556413 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:46.555469 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f91cc132-1c05-4c85-a902-2b8622573dd1-console-serving-cert\") pod \"console-7bbb454cc-k9vzf\" (UID: \"f91cc132-1c05-4c85-a902-2b8622573dd1\") " pod="openshift-console/console-7bbb454cc-k9vzf"
Apr 16 14:55:46.556413 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:46.555509 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f91cc132-1c05-4c85-a902-2b8622573dd1-oauth-serving-cert\") pod \"console-7bbb454cc-k9vzf\" (UID: \"f91cc132-1c05-4c85-a902-2b8622573dd1\") " pod="openshift-console/console-7bbb454cc-k9vzf"
Apr 16 14:55:46.556413 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:46.555776 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f91cc132-1c05-4c85-a902-2b8622573dd1-trusted-ca-bundle\") pod \"console-7bbb454cc-k9vzf\" (UID: \"f91cc132-1c05-4c85-a902-2b8622573dd1\") " pod="openshift-console/console-7bbb454cc-k9vzf"
Apr 16 14:55:46.556413 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:46.555846 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f91cc132-1c05-4c85-a902-2b8622573dd1-console-config\") pod \"console-7bbb454cc-k9vzf\" (UID: \"f91cc132-1c05-4c85-a902-2b8622573dd1\") " pod="openshift-console/console-7bbb454cc-k9vzf"
Apr 16 14:55:46.556413 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:46.555918 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tjqmv\" (UniqueName: \"kubernetes.io/projected/f91cc132-1c05-4c85-a902-2b8622573dd1-kube-api-access-tjqmv\") pod \"console-7bbb454cc-k9vzf\" (UID: \"f91cc132-1c05-4c85-a902-2b8622573dd1\") " pod="openshift-console/console-7bbb454cc-k9vzf"
Apr 16 14:55:46.556413 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:46.555996 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f91cc132-1c05-4c85-a902-2b8622573dd1-console-oauth-config\") pod \"console-7bbb454cc-k9vzf\" (UID: \"f91cc132-1c05-4c85-a902-2b8622573dd1\") " pod="openshift-console/console-7bbb454cc-k9vzf"
Apr 16 14:55:46.556413 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:46.556050 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f91cc132-1c05-4c85-a902-2b8622573dd1-service-ca\") pod \"console-7bbb454cc-k9vzf\" (UID: \"f91cc132-1c05-4c85-a902-2b8622573dd1\") " pod="openshift-console/console-7bbb454cc-k9vzf"
Apr 16 14:55:46.556413 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:46.556210 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f91cc132-1c05-4c85-a902-2b8622573dd1-oauth-serving-cert\") pod \"console-7bbb454cc-k9vzf\" (UID: \"f91cc132-1c05-4c85-a902-2b8622573dd1\") " pod="openshift-console/console-7bbb454cc-k9vzf"
Apr 16 14:55:46.556941 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:46.556538 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f91cc132-1c05-4c85-a902-2b8622573dd1-trusted-ca-bundle\") pod \"console-7bbb454cc-k9vzf\" (UID: \"f91cc132-1c05-4c85-a902-2b8622573dd1\") " pod="openshift-console/console-7bbb454cc-k9vzf"
Apr 16 14:55:46.556941 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:46.556814 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f91cc132-1c05-4c85-a902-2b8622573dd1-console-config\") pod \"console-7bbb454cc-k9vzf\" (UID: \"f91cc132-1c05-4c85-a902-2b8622573dd1\") " pod="openshift-console/console-7bbb454cc-k9vzf"
Apr 16 14:55:46.556941 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:46.556855 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f91cc132-1c05-4c85-a902-2b8622573dd1-service-ca\") pod \"console-7bbb454cc-k9vzf\" (UID: \"f91cc132-1c05-4c85-a902-2b8622573dd1\") " pod="openshift-console/console-7bbb454cc-k9vzf"
Apr 16 14:55:46.558930 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:46.558869 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f91cc132-1c05-4c85-a902-2b8622573dd1-console-serving-cert\") pod \"console-7bbb454cc-k9vzf\" (UID: \"f91cc132-1c05-4c85-a902-2b8622573dd1\") " pod="openshift-console/console-7bbb454cc-k9vzf"
Apr 16 14:55:46.559049 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:46.559008 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f91cc132-1c05-4c85-a902-2b8622573dd1-console-oauth-config\") pod \"console-7bbb454cc-k9vzf\" (UID: \"f91cc132-1c05-4c85-a902-2b8622573dd1\") " pod="openshift-console/console-7bbb454cc-k9vzf"
Apr 16 14:55:46.567598 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:46.567557 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjqmv\" (UniqueName: \"kubernetes.io/projected/f91cc132-1c05-4c85-a902-2b8622573dd1-kube-api-access-tjqmv\") pod \"console-7bbb454cc-k9vzf\" (UID: \"f91cc132-1c05-4c85-a902-2b8622573dd1\") " pod="openshift-console/console-7bbb454cc-k9vzf"
Apr 16 14:55:46.682066 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:46.682044 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7bbb454cc-k9vzf"
Apr 16 14:55:46.839067 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:46.839021 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7bbb454cc-k9vzf"]
Apr 16 14:55:47.153547 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:55:47.153516 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf91cc132_1c05_4c85_a902_2b8622573dd1.slice/crio-9fb283b08261136b98b30e9e4f935c7e9c390a4b1af3d2e8315d54e5ec7df137 WatchSource:0}: Error finding container 9fb283b08261136b98b30e9e4f935c7e9c390a4b1af3d2e8315d54e5ec7df137: Status 404 returned error can't find the container with id 9fb283b08261136b98b30e9e4f935c7e9c390a4b1af3d2e8315d54e5ec7df137
Apr 16 14:55:47.404844 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:47.404782 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f99fc98d-tlm2q" event={"ID":"6dd7fb5d-2333-471c-b3d2-1e65d5c362aa","Type":"ContainerStarted","Data":"5f5d25516ecbc9a005836048caf74f8dfc4bf6e783d4ec555e399dd79e027395"}
Apr 16 14:55:47.404844 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:47.404814 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f99fc98d-tlm2q" event={"ID":"6dd7fb5d-2333-471c-b3d2-1e65d5c362aa","Type":"ContainerStarted","Data":"1d29312e3a82f3a733ad0b0fa8acf167203ff9b054f096a1092ba692e1d6f40c"}
Apr 16 14:55:47.404844 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:47.404825 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f99fc98d-tlm2q" event={"ID":"6dd7fb5d-2333-471c-b3d2-1e65d5c362aa","Type":"ContainerStarted","Data":"29c5aab129cf212cab18e0ad2dc0b2f300824f73af7bf72b878f366c76126067"}
Apr 16 14:55:47.405056 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:47.404962 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-7f99fc98d-tlm2q"
Apr 16 14:55:47.406217 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:47.406196 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-56dd544798-bkd2p" event={"ID":"aa3d4a56-7da8-4220-bf7c-17327a756284","Type":"ContainerStarted","Data":"301e93c6f1841582673299f26c9ec796df91301da6da15c3febb82393354b0e8"}
Apr 16 14:55:47.407530 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:47.407512 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-vgksx" event={"ID":"da5dbec6-5b51-4f5a-85cf-f08dd3b4ff45","Type":"ContainerStarted","Data":"2dbc16aa092ef34cdce1279fe05a2017ff8e375fa5b33e2d5e329736e0735041"}
Apr 16 14:55:47.407743 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:47.407717 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-vgksx"
Apr 16 14:55:47.408944 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:47.408921 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bbb454cc-k9vzf" event={"ID":"f91cc132-1c05-4c85-a902-2b8622573dd1","Type":"ContainerStarted","Data":"73d0e2b85bc614be6f669464c519c8f2f2092bfa6504378fd209cfe9abb72001"}
Apr 16 14:55:47.409039 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:47.408951 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bbb454cc-k9vzf" event={"ID":"f91cc132-1c05-4c85-a902-2b8622573dd1","Type":"ContainerStarted","Data":"9fb283b08261136b98b30e9e4f935c7e9c390a4b1af3d2e8315d54e5ec7df137"}
Apr 16 14:55:47.412115 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:47.412090 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"271811e8-dfe5-4371-ae06-49ca4a0490fc","Type":"ContainerStarted","Data":"0f9eadadcc2e61f071a50bf27ad1bf00c816358bce6c78be686ed2e8d08c8d03"}
Apr 16 14:55:47.413210 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:47.413191 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-vgksx"
Apr 16 14:55:47.429761 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:47.429723 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-7f99fc98d-tlm2q" podStartSLOduration=2.036379264 podStartE2EDuration="5.429712014s" podCreationTimestamp="2026-04-16 14:55:42 +0000 UTC" firstStartedPulling="2026-04-16 14:55:43.295851404 +0000 UTC m=+182.059803496" lastFinishedPulling="2026-04-16 14:55:46.689184151 +0000 UTC m=+185.453136246" observedRunningTime="2026-04-16 14:55:47.428350406 +0000 UTC m=+186.192302528" watchObservedRunningTime="2026-04-16 14:55:47.429712014 +0000 UTC m=+186.193664125"
Apr 16 14:55:47.447516 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:47.447483 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7bbb454cc-k9vzf" podStartSLOduration=1.447472296 podStartE2EDuration="1.447472296s" podCreationTimestamp="2026-04-16 14:55:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:55:47.44670629 +0000 UTC m=+186.210658403" watchObservedRunningTime="2026-04-16 14:55:47.447472296 +0000 UTC m=+186.211424408"
Apr 16 14:55:47.475331 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:47.475290 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.231098151 podStartE2EDuration="7.475280584s" podCreationTimestamp="2026-04-16 14:55:40 +0000 UTC" firstStartedPulling="2026-04-16 14:55:41.44678621 +0000 UTC m=+180.210738302" lastFinishedPulling="2026-04-16 14:55:46.690968631 +0000 UTC m=+185.454920735" observedRunningTime="2026-04-16 14:55:47.474002208 +0000 UTC m=+186.237954333" watchObservedRunningTime="2026-04-16 14:55:47.475280584 +0000 UTC m=+186.239232695"
Apr 16 14:55:47.501100 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:47.501050 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-vgksx" podStartSLOduration=2.004298423 podStartE2EDuration="3.501035042s" podCreationTimestamp="2026-04-16 14:55:44 +0000 UTC" firstStartedPulling="2026-04-16 14:55:45.710682823 +0000 UTC m=+184.474634915" lastFinishedPulling="2026-04-16 14:55:47.207419442 +0000 UTC m=+185.971371534" observedRunningTime="2026-04-16 14:55:47.499430348 +0000 UTC m=+186.263382461" watchObservedRunningTime="2026-04-16 14:55:47.501035042 +0000 UTC m=+186.264987154"
Apr 16 14:55:47.529732 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:47.529683 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-56dd544798-bkd2p" podStartSLOduration=2.214251031 podStartE2EDuration="3.529672484s" podCreationTimestamp="2026-04-16 14:55:44 +0000 UTC" firstStartedPulling="2026-04-16 14:55:45.373662806 +0000 UTC m=+184.137614902" lastFinishedPulling="2026-04-16 14:55:46.689084257 +0000 UTC m=+185.453036355" observedRunningTime="2026-04-16 14:55:47.528314945 +0000 UTC m=+186.292267060" watchObservedRunningTime="2026-04-16 14:55:47.529672484 +0000 UTC m=+186.293624596"
Apr 16 14:55:53.422640 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:53.422610 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-7f99fc98d-tlm2q"
Apr 16 14:55:55.112811 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:55.112760 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b" podUID="4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac" containerName="registry" containerID="cri-o://bd9a4429c319f6625197cec25827994548aa7f7511a00280d949b537c36574b5" gracePeriod=30
Apr 16 14:55:55.351592 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:55.351570 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b"
Apr 16 14:55:55.437412 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:55.437343 2568 generic.go:358] "Generic (PLEG): container finished" podID="4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac" containerID="bd9a4429c319f6625197cec25827994548aa7f7511a00280d949b537c36574b5" exitCode=0
Apr 16 14:55:55.437412 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:55.437402 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b"
Apr 16 14:55:55.437585 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:55.437420 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b" event={"ID":"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac","Type":"ContainerDied","Data":"bd9a4429c319f6625197cec25827994548aa7f7511a00280d949b537c36574b5"}
Apr 16 14:55:55.437585 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:55.437452 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-58b5c4dfdf-8xt9b" event={"ID":"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac","Type":"ContainerDied","Data":"7851d44dc2f90ef19463d2394b7caac24ac08537b139804f0c78dc8f55701fe3"}
Apr 16 14:55:55.437585 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:55.437467 2568 scope.go:117] "RemoveContainer" containerID="bd9a4429c319f6625197cec25827994548aa7f7511a00280d949b537c36574b5"
Apr 16 14:55:55.446763 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:55.446746 2568 scope.go:117] "RemoveContainer" containerID="bd9a4429c319f6625197cec25827994548aa7f7511a00280d949b537c36574b5"
Apr 16 14:55:55.447023 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:55:55.447000 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd9a4429c319f6625197cec25827994548aa7f7511a00280d949b537c36574b5\": container with ID starting with bd9a4429c319f6625197cec25827994548aa7f7511a00280d949b537c36574b5 not found: ID does not exist" containerID="bd9a4429c319f6625197cec25827994548aa7f7511a00280d949b537c36574b5"
Apr 16 14:55:55.447083 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:55.447032 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd9a4429c319f6625197cec25827994548aa7f7511a00280d949b537c36574b5"} err="failed to get container status \"bd9a4429c319f6625197cec25827994548aa7f7511a00280d949b537c36574b5\": rpc error: code = NotFound desc = could not find container \"bd9a4429c319f6625197cec25827994548aa7f7511a00280d949b537c36574b5\": container with ID starting with bd9a4429c319f6625197cec25827994548aa7f7511a00280d949b537c36574b5 not found: ID does not exist"
Apr 16 14:55:55.525695 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:55.525672 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-registry-tls\") pod \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\" (UID: \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\") "
Apr 16 14:55:55.525789 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:55.525708 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-ca-trust-extracted\") pod \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\" (UID: \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\") "
Apr 16 14:55:55.525789 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:55.525745 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-bound-sa-token\") pod \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\" (UID: \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\") "
Apr 16 14:55:55.525789 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:55.525778 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hcc4\" (UniqueName: \"kubernetes.io/projected/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-kube-api-access-5hcc4\") pod \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\" (UID: \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\") "
Apr 16 14:55:55.525960 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:55.525803 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-registry-certificates\") pod \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\" (UID: \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\") "
Apr 16 14:55:55.525960 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:55.525827 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-trusted-ca\") pod \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\" (UID: \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\") "
Apr 16 14:55:55.525960 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:55.525854 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-image-registry-private-configuration\") pod \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\" (UID: \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\") "
Apr 16 14:55:55.526250 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:55.526179 2568 reconciler_common.go:162]
"operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-installation-pull-secrets\") pod \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\" (UID: \"4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac\") " Apr 16 14:55:55.526359 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:55.526319 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac" (UID: "4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:55:55.526359 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:55.526326 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac" (UID: "4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:55:55.526548 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:55.526484 2568 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-registry-certificates\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 14:55:55.526548 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:55.526509 2568 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-trusted-ca\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 14:55:55.528292 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:55.528265 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac" (UID: "4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:55:55.528444 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:55.528273 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-kube-api-access-5hcc4" (OuterVolumeSpecName: "kube-api-access-5hcc4") pod "4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac" (UID: "4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac"). InnerVolumeSpecName "kube-api-access-5hcc4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:55:55.528625 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:55.528593 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac" (UID: "4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:55:55.528723 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:55.528659 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac" (UID: "4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:55:55.528768 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:55.528731 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac" (UID: "4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:55:55.534199 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:55.534177 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac" (UID: "4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:55:55.627824 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:55.627802 2568 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-ca-trust-extracted\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 14:55:55.627824 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:55.627822 2568 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-bound-sa-token\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 14:55:55.627957 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:55.627832 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5hcc4\" (UniqueName: \"kubernetes.io/projected/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-kube-api-access-5hcc4\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 14:55:55.627957 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:55.627843 2568 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-image-registry-private-configuration\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 14:55:55.627957 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:55.627853 2568 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-installation-pull-secrets\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 14:55:55.627957 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:55.627862 2568 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac-registry-tls\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 
16 14:55:55.758897 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:55.758871 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-58b5c4dfdf-8xt9b"] Apr 16 14:55:55.763305 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:55.763283 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-58b5c4dfdf-8xt9b"] Apr 16 14:55:55.862501 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:55.862475 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac" path="/var/lib/kubelet/pods/4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac/volumes" Apr 16 14:55:56.682702 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:56.682671 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7bbb454cc-k9vzf" Apr 16 14:55:56.683123 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:56.682713 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7bbb454cc-k9vzf" Apr 16 14:55:56.687823 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:56.687801 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7bbb454cc-k9vzf" Apr 16 14:55:57.447067 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:57.447042 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7bbb454cc-k9vzf" Apr 16 14:55:57.497816 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:55:57.497791 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68b858fb9c-w6zpx"] Apr 16 14:56:04.615920 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:04.615848 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-56dd544798-bkd2p" Apr 16 14:56:04.615920 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:04.615934 2568 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-56dd544798-bkd2p" Apr 16 14:56:22.517422 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:22.517366 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-68b858fb9c-w6zpx" podUID="a6ce2ed2-9325-417e-aacb-678b806ca490" containerName="console" containerID="cri-o://17688ab8249cbed387fe1f1a7a3ae8d24231d0998967a21e058373f8b9f32e6f" gracePeriod=15 Apr 16 14:56:22.752403 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:22.752385 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68b858fb9c-w6zpx_a6ce2ed2-9325-417e-aacb-678b806ca490/console/0.log" Apr 16 14:56:22.752499 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:22.752451 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68b858fb9c-w6zpx" Apr 16 14:56:22.824192 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:22.824174 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a6ce2ed2-9325-417e-aacb-678b806ca490-service-ca\") pod \"a6ce2ed2-9325-417e-aacb-678b806ca490\" (UID: \"a6ce2ed2-9325-417e-aacb-678b806ca490\") " Apr 16 14:56:22.824323 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:22.824224 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a6ce2ed2-9325-417e-aacb-678b806ca490-console-oauth-config\") pod \"a6ce2ed2-9325-417e-aacb-678b806ca490\" (UID: \"a6ce2ed2-9325-417e-aacb-678b806ca490\") " Apr 16 14:56:22.824323 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:22.824259 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a6ce2ed2-9325-417e-aacb-678b806ca490-console-config\") pod 
\"a6ce2ed2-9325-417e-aacb-678b806ca490\" (UID: \"a6ce2ed2-9325-417e-aacb-678b806ca490\") " Apr 16 14:56:22.824323 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:22.824308 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjf79\" (UniqueName: \"kubernetes.io/projected/a6ce2ed2-9325-417e-aacb-678b806ca490-kube-api-access-wjf79\") pod \"a6ce2ed2-9325-417e-aacb-678b806ca490\" (UID: \"a6ce2ed2-9325-417e-aacb-678b806ca490\") " Apr 16 14:56:22.824475 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:22.824335 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a6ce2ed2-9325-417e-aacb-678b806ca490-oauth-serving-cert\") pod \"a6ce2ed2-9325-417e-aacb-678b806ca490\" (UID: \"a6ce2ed2-9325-417e-aacb-678b806ca490\") " Apr 16 14:56:22.824475 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:22.824363 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6ce2ed2-9325-417e-aacb-678b806ca490-console-serving-cert\") pod \"a6ce2ed2-9325-417e-aacb-678b806ca490\" (UID: \"a6ce2ed2-9325-417e-aacb-678b806ca490\") " Apr 16 14:56:22.824686 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:22.824648 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6ce2ed2-9325-417e-aacb-678b806ca490-service-ca" (OuterVolumeSpecName: "service-ca") pod "a6ce2ed2-9325-417e-aacb-678b806ca490" (UID: "a6ce2ed2-9325-417e-aacb-678b806ca490"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:56:22.824854 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:22.824814 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6ce2ed2-9325-417e-aacb-678b806ca490-console-config" (OuterVolumeSpecName: "console-config") pod "a6ce2ed2-9325-417e-aacb-678b806ca490" (UID: "a6ce2ed2-9325-417e-aacb-678b806ca490"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:56:22.827397 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:22.825600 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6ce2ed2-9325-417e-aacb-678b806ca490-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a6ce2ed2-9325-417e-aacb-678b806ca490" (UID: "a6ce2ed2-9325-417e-aacb-678b806ca490"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:56:22.829980 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:22.829011 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6ce2ed2-9325-417e-aacb-678b806ca490-kube-api-access-wjf79" (OuterVolumeSpecName: "kube-api-access-wjf79") pod "a6ce2ed2-9325-417e-aacb-678b806ca490" (UID: "a6ce2ed2-9325-417e-aacb-678b806ca490"). InnerVolumeSpecName "kube-api-access-wjf79". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:56:22.831054 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:22.831021 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6ce2ed2-9325-417e-aacb-678b806ca490-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a6ce2ed2-9325-417e-aacb-678b806ca490" (UID: "a6ce2ed2-9325-417e-aacb-678b806ca490"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:56:22.831054 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:22.831044 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6ce2ed2-9325-417e-aacb-678b806ca490-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a6ce2ed2-9325-417e-aacb-678b806ca490" (UID: "a6ce2ed2-9325-417e-aacb-678b806ca490"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:56:22.924924 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:22.924881 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wjf79\" (UniqueName: \"kubernetes.io/projected/a6ce2ed2-9325-417e-aacb-678b806ca490-kube-api-access-wjf79\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 14:56:22.924924 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:22.924921 2568 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a6ce2ed2-9325-417e-aacb-678b806ca490-oauth-serving-cert\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 14:56:22.925038 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:22.924931 2568 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6ce2ed2-9325-417e-aacb-678b806ca490-console-serving-cert\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 14:56:22.925038 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:22.924940 2568 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a6ce2ed2-9325-417e-aacb-678b806ca490-service-ca\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 14:56:22.925038 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:22.924949 2568 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/a6ce2ed2-9325-417e-aacb-678b806ca490-console-oauth-config\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 14:56:22.925038 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:22.924958 2568 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a6ce2ed2-9325-417e-aacb-678b806ca490-console-config\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 14:56:23.528990 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:23.528966 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68b858fb9c-w6zpx_a6ce2ed2-9325-417e-aacb-678b806ca490/console/0.log" Apr 16 14:56:23.529325 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:23.529005 2568 generic.go:358] "Generic (PLEG): container finished" podID="a6ce2ed2-9325-417e-aacb-678b806ca490" containerID="17688ab8249cbed387fe1f1a7a3ae8d24231d0998967a21e058373f8b9f32e6f" exitCode=2 Apr 16 14:56:23.529325 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:23.529041 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68b858fb9c-w6zpx" event={"ID":"a6ce2ed2-9325-417e-aacb-678b806ca490","Type":"ContainerDied","Data":"17688ab8249cbed387fe1f1a7a3ae8d24231d0998967a21e058373f8b9f32e6f"} Apr 16 14:56:23.529325 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:23.529079 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68b858fb9c-w6zpx" event={"ID":"a6ce2ed2-9325-417e-aacb-678b806ca490","Type":"ContainerDied","Data":"f3f83c649bcac90e68f16f7739fb6922d7778e6f78cd564dbb829e2ad173e04f"} Apr 16 14:56:23.529325 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:23.529096 2568 scope.go:117] "RemoveContainer" containerID="17688ab8249cbed387fe1f1a7a3ae8d24231d0998967a21e058373f8b9f32e6f" Apr 16 14:56:23.529325 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:23.529103 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-68b858fb9c-w6zpx" Apr 16 14:56:23.538546 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:23.538527 2568 scope.go:117] "RemoveContainer" containerID="17688ab8249cbed387fe1f1a7a3ae8d24231d0998967a21e058373f8b9f32e6f" Apr 16 14:56:23.538808 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:56:23.538789 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17688ab8249cbed387fe1f1a7a3ae8d24231d0998967a21e058373f8b9f32e6f\": container with ID starting with 17688ab8249cbed387fe1f1a7a3ae8d24231d0998967a21e058373f8b9f32e6f not found: ID does not exist" containerID="17688ab8249cbed387fe1f1a7a3ae8d24231d0998967a21e058373f8b9f32e6f" Apr 16 14:56:23.538872 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:23.538820 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17688ab8249cbed387fe1f1a7a3ae8d24231d0998967a21e058373f8b9f32e6f"} err="failed to get container status \"17688ab8249cbed387fe1f1a7a3ae8d24231d0998967a21e058373f8b9f32e6f\": rpc error: code = NotFound desc = could not find container \"17688ab8249cbed387fe1f1a7a3ae8d24231d0998967a21e058373f8b9f32e6f\": container with ID starting with 17688ab8249cbed387fe1f1a7a3ae8d24231d0998967a21e058373f8b9f32e6f not found: ID does not exist" Apr 16 14:56:23.549244 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:23.549221 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68b858fb9c-w6zpx"] Apr 16 14:56:23.554459 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:23.554433 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-68b858fb9c-w6zpx"] Apr 16 14:56:23.863052 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:23.863023 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6ce2ed2-9325-417e-aacb-678b806ca490" 
path="/var/lib/kubelet/pods/a6ce2ed2-9325-417e-aacb-678b806ca490/volumes" Apr 16 14:56:24.621420 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:24.621394 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-56dd544798-bkd2p" Apr 16 14:56:24.625354 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:24.625329 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-56dd544798-bkd2p" Apr 16 14:56:53.556385 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:53.556349 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b932e53d-9993-47b8-a2cb-940fc759370d-metrics-certs\") pod \"network-metrics-daemon-twkfq\" (UID: \"b932e53d-9993-47b8-a2cb-940fc759370d\") " pod="openshift-multus/network-metrics-daemon-twkfq" Apr 16 14:56:53.559003 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:53.558975 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b932e53d-9993-47b8-a2cb-940fc759370d-metrics-certs\") pod \"network-metrics-daemon-twkfq\" (UID: \"b932e53d-9993-47b8-a2cb-940fc759370d\") " pod="openshift-multus/network-metrics-daemon-twkfq" Apr 16 14:56:53.662567 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:53.662541 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lnt9d\"" Apr 16 14:56:53.670499 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:53.670480 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-twkfq" Apr 16 14:56:53.789028 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:53.789006 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-twkfq"] Apr 16 14:56:53.791089 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:56:53.791064 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb932e53d_9993_47b8_a2cb_940fc759370d.slice/crio-64a7f4aab8a1fc4698034cab75eae73f034eeb2d919cb40919184d15334a9497 WatchSource:0}: Error finding container 64a7f4aab8a1fc4698034cab75eae73f034eeb2d919cb40919184d15334a9497: Status 404 returned error can't find the container with id 64a7f4aab8a1fc4698034cab75eae73f034eeb2d919cb40919184d15334a9497 Apr 16 14:56:54.624107 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:54.624066 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-twkfq" event={"ID":"b932e53d-9993-47b8-a2cb-940fc759370d","Type":"ContainerStarted","Data":"64a7f4aab8a1fc4698034cab75eae73f034eeb2d919cb40919184d15334a9497"} Apr 16 14:56:55.629467 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:55.629434 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-twkfq" event={"ID":"b932e53d-9993-47b8-a2cb-940fc759370d","Type":"ContainerStarted","Data":"c4ca1e2dbedfa4037a79688b250abad913352f03f68698c0eca5dff7bee6d05c"} Apr 16 14:56:55.629467 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:55.629473 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-twkfq" event={"ID":"b932e53d-9993-47b8-a2cb-940fc759370d","Type":"ContainerStarted","Data":"e77125938ee9a149579eae8e783a5b7e00dc1796ee7d800a325f157e4f978cd6"} Apr 16 14:56:55.644875 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:56:55.644828 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-twkfq" podStartSLOduration=253.738852282 podStartE2EDuration="4m14.64481431s" podCreationTimestamp="2026-04-16 14:52:41 +0000 UTC" firstStartedPulling="2026-04-16 14:56:53.793162445 +0000 UTC m=+252.557114533" lastFinishedPulling="2026-04-16 14:56:54.699124472 +0000 UTC m=+253.463076561" observedRunningTime="2026-04-16 14:56:55.643631723 +0000 UTC m=+254.407583858" watchObservedRunningTime="2026-04-16 14:56:55.64481431 +0000 UTC m=+254.408766495" Apr 16 14:57:00.170976 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:00.170947 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:57:00.171396 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:00.171334 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="271811e8-dfe5-4371-ae06-49ca4a0490fc" containerName="alertmanager" containerID="cri-o://8c6910f32197bfc5024806bbd416eed1c2fdee7741c42c751c453cc9f58015ac" gracePeriod=120 Apr 16 14:57:00.171464 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:00.171382 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="271811e8-dfe5-4371-ae06-49ca4a0490fc" containerName="kube-rbac-proxy-metric" containerID="cri-o://9d27ae32ff2581587a376dc48db2be54746ecda8c584522fdbecb9acd34224b3" gracePeriod=120 Apr 16 14:57:00.171464 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:00.171431 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="271811e8-dfe5-4371-ae06-49ca4a0490fc" containerName="prom-label-proxy" containerID="cri-o://0f9eadadcc2e61f071a50bf27ad1bf00c816358bce6c78be686ed2e8d08c8d03" gracePeriod=120 Apr 16 14:57:00.171566 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:00.171406 2568 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-monitoring/alertmanager-main-0" podUID="271811e8-dfe5-4371-ae06-49ca4a0490fc" containerName="kube-rbac-proxy-web" containerID="cri-o://5742604d71c9783b23fbff65401867f71ca8d7eb22ebb94e150aa1fcd876c3e7" gracePeriod=120 Apr 16 14:57:00.171566 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:00.171460 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="271811e8-dfe5-4371-ae06-49ca4a0490fc" containerName="config-reloader" containerID="cri-o://aa9d02ccd057c2d464e60390ca535a53541d73f19f569257c816e2b6af0b0b61" gracePeriod=120 Apr 16 14:57:00.171663 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:00.171630 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="271811e8-dfe5-4371-ae06-49ca4a0490fc" containerName="kube-rbac-proxy" containerID="cri-o://cef4cc52e14de23c66a82adcc2179bdf56b3e50946067c7b8bda8d525c813eb2" gracePeriod=120 Apr 16 14:57:00.654411 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:00.654374 2568 generic.go:358] "Generic (PLEG): container finished" podID="271811e8-dfe5-4371-ae06-49ca4a0490fc" containerID="0f9eadadcc2e61f071a50bf27ad1bf00c816358bce6c78be686ed2e8d08c8d03" exitCode=0 Apr 16 14:57:00.654411 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:00.654402 2568 generic.go:358] "Generic (PLEG): container finished" podID="271811e8-dfe5-4371-ae06-49ca4a0490fc" containerID="cef4cc52e14de23c66a82adcc2179bdf56b3e50946067c7b8bda8d525c813eb2" exitCode=0 Apr 16 14:57:00.654411 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:00.654410 2568 generic.go:358] "Generic (PLEG): container finished" podID="271811e8-dfe5-4371-ae06-49ca4a0490fc" containerID="aa9d02ccd057c2d464e60390ca535a53541d73f19f569257c816e2b6af0b0b61" exitCode=0 Apr 16 14:57:00.654611 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:00.654419 2568 generic.go:358] "Generic (PLEG): container finished" 
podID="271811e8-dfe5-4371-ae06-49ca4a0490fc" containerID="8c6910f32197bfc5024806bbd416eed1c2fdee7741c42c751c453cc9f58015ac" exitCode=0 Apr 16 14:57:00.654611 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:00.654442 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"271811e8-dfe5-4371-ae06-49ca4a0490fc","Type":"ContainerDied","Data":"0f9eadadcc2e61f071a50bf27ad1bf00c816358bce6c78be686ed2e8d08c8d03"} Apr 16 14:57:00.654611 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:00.654474 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"271811e8-dfe5-4371-ae06-49ca4a0490fc","Type":"ContainerDied","Data":"cef4cc52e14de23c66a82adcc2179bdf56b3e50946067c7b8bda8d525c813eb2"} Apr 16 14:57:00.654611 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:00.654486 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"271811e8-dfe5-4371-ae06-49ca4a0490fc","Type":"ContainerDied","Data":"aa9d02ccd057c2d464e60390ca535a53541d73f19f569257c816e2b6af0b0b61"} Apr 16 14:57:00.654611 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:00.654495 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"271811e8-dfe5-4371-ae06-49ca4a0490fc","Type":"ContainerDied","Data":"8c6910f32197bfc5024806bbd416eed1c2fdee7741c42c751c453cc9f58015ac"} Apr 16 14:57:01.417581 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.417563 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:57:01.514050 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.513988 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/271811e8-dfe5-4371-ae06-49ca4a0490fc-secret-alertmanager-kube-rbac-proxy-web\") pod \"271811e8-dfe5-4371-ae06-49ca4a0490fc\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " Apr 16 14:57:01.514050 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.514017 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/271811e8-dfe5-4371-ae06-49ca4a0490fc-alertmanager-trusted-ca-bundle\") pod \"271811e8-dfe5-4371-ae06-49ca4a0490fc\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " Apr 16 14:57:01.514050 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.514040 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/271811e8-dfe5-4371-ae06-49ca4a0490fc-cluster-tls-config\") pod \"271811e8-dfe5-4371-ae06-49ca4a0490fc\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " Apr 16 14:57:01.514276 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.514065 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/271811e8-dfe5-4371-ae06-49ca4a0490fc-metrics-client-ca\") pod \"271811e8-dfe5-4371-ae06-49ca4a0490fc\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " Apr 16 14:57:01.514276 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.514194 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/271811e8-dfe5-4371-ae06-49ca4a0490fc-alertmanager-main-db\") pod \"271811e8-dfe5-4371-ae06-49ca4a0490fc\" (UID: 
\"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " Apr 16 14:57:01.514276 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.514262 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/271811e8-dfe5-4371-ae06-49ca4a0490fc-secret-alertmanager-main-tls\") pod \"271811e8-dfe5-4371-ae06-49ca4a0490fc\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " Apr 16 14:57:01.514425 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.514302 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52z92\" (UniqueName: \"kubernetes.io/projected/271811e8-dfe5-4371-ae06-49ca4a0490fc-kube-api-access-52z92\") pod \"271811e8-dfe5-4371-ae06-49ca4a0490fc\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " Apr 16 14:57:01.514425 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.514335 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/271811e8-dfe5-4371-ae06-49ca4a0490fc-config-volume\") pod \"271811e8-dfe5-4371-ae06-49ca4a0490fc\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " Apr 16 14:57:01.514425 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.514390 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/271811e8-dfe5-4371-ae06-49ca4a0490fc-config-out\") pod \"271811e8-dfe5-4371-ae06-49ca4a0490fc\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " Apr 16 14:57:01.514567 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.514421 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/271811e8-dfe5-4371-ae06-49ca4a0490fc-secret-alertmanager-kube-rbac-proxy-metric\") pod \"271811e8-dfe5-4371-ae06-49ca4a0490fc\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " Apr 16 
14:57:01.514567 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.514455 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/271811e8-dfe5-4371-ae06-49ca4a0490fc-secret-alertmanager-kube-rbac-proxy\") pod \"271811e8-dfe5-4371-ae06-49ca4a0490fc\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " Apr 16 14:57:01.514567 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.514470 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/271811e8-dfe5-4371-ae06-49ca4a0490fc-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "271811e8-dfe5-4371-ae06-49ca4a0490fc" (UID: "271811e8-dfe5-4371-ae06-49ca4a0490fc"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:57:01.514567 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.514482 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/271811e8-dfe5-4371-ae06-49ca4a0490fc-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "271811e8-dfe5-4371-ae06-49ca4a0490fc" (UID: "271811e8-dfe5-4371-ae06-49ca4a0490fc"). InnerVolumeSpecName "alertmanager-main-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:57:01.514567 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.514501 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/271811e8-dfe5-4371-ae06-49ca4a0490fc-web-config\") pod \"271811e8-dfe5-4371-ae06-49ca4a0490fc\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " Apr 16 14:57:01.514828 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.514564 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/271811e8-dfe5-4371-ae06-49ca4a0490fc-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "271811e8-dfe5-4371-ae06-49ca4a0490fc" (UID: "271811e8-dfe5-4371-ae06-49ca4a0490fc"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:57:01.515183 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.515160 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/271811e8-dfe5-4371-ae06-49ca4a0490fc-tls-assets\") pod \"271811e8-dfe5-4371-ae06-49ca4a0490fc\" (UID: \"271811e8-dfe5-4371-ae06-49ca4a0490fc\") " Apr 16 14:57:01.515436 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.515414 2568 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/271811e8-dfe5-4371-ae06-49ca4a0490fc-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 14:57:01.515503 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.515443 2568 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/271811e8-dfe5-4371-ae06-49ca4a0490fc-metrics-client-ca\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 14:57:01.515503 ip-10-0-140-83 
kubenswrapper[2568]: I0416 14:57:01.515460 2568 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/271811e8-dfe5-4371-ae06-49ca4a0490fc-alertmanager-main-db\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 14:57:01.517430 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.517267 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/271811e8-dfe5-4371-ae06-49ca4a0490fc-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "271811e8-dfe5-4371-ae06-49ca4a0490fc" (UID: "271811e8-dfe5-4371-ae06-49ca4a0490fc"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:57:01.517430 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.517268 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/271811e8-dfe5-4371-ae06-49ca4a0490fc-kube-api-access-52z92" (OuterVolumeSpecName: "kube-api-access-52z92") pod "271811e8-dfe5-4371-ae06-49ca4a0490fc" (UID: "271811e8-dfe5-4371-ae06-49ca4a0490fc"). InnerVolumeSpecName "kube-api-access-52z92". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:57:01.517596 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.517543 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/271811e8-dfe5-4371-ae06-49ca4a0490fc-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "271811e8-dfe5-4371-ae06-49ca4a0490fc" (UID: "271811e8-dfe5-4371-ae06-49ca4a0490fc"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:57:01.517596 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.517562 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/271811e8-dfe5-4371-ae06-49ca4a0490fc-config-volume" (OuterVolumeSpecName: "config-volume") pod "271811e8-dfe5-4371-ae06-49ca4a0490fc" (UID: "271811e8-dfe5-4371-ae06-49ca4a0490fc"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:57:01.517710 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.517654 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/271811e8-dfe5-4371-ae06-49ca4a0490fc-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "271811e8-dfe5-4371-ae06-49ca4a0490fc" (UID: "271811e8-dfe5-4371-ae06-49ca4a0490fc"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:57:01.517789 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.517764 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/271811e8-dfe5-4371-ae06-49ca4a0490fc-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "271811e8-dfe5-4371-ae06-49ca4a0490fc" (UID: "271811e8-dfe5-4371-ae06-49ca4a0490fc"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:57:01.518533 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.518510 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/271811e8-dfe5-4371-ae06-49ca4a0490fc-config-out" (OuterVolumeSpecName: "config-out") pod "271811e8-dfe5-4371-ae06-49ca4a0490fc" (UID: "271811e8-dfe5-4371-ae06-49ca4a0490fc"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:57:01.518961 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.518939 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/271811e8-dfe5-4371-ae06-49ca4a0490fc-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "271811e8-dfe5-4371-ae06-49ca4a0490fc" (UID: "271811e8-dfe5-4371-ae06-49ca4a0490fc"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:57:01.521487 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.521372 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/271811e8-dfe5-4371-ae06-49ca4a0490fc-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "271811e8-dfe5-4371-ae06-49ca4a0490fc" (UID: "271811e8-dfe5-4371-ae06-49ca4a0490fc"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:57:01.528022 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.527996 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/271811e8-dfe5-4371-ae06-49ca4a0490fc-web-config" (OuterVolumeSpecName: "web-config") pod "271811e8-dfe5-4371-ae06-49ca4a0490fc" (UID: "271811e8-dfe5-4371-ae06-49ca4a0490fc"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:57:01.616287 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.616268 2568 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/271811e8-dfe5-4371-ae06-49ca4a0490fc-web-config\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 14:57:01.616287 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.616286 2568 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/271811e8-dfe5-4371-ae06-49ca4a0490fc-tls-assets\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 14:57:01.616387 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.616295 2568 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/271811e8-dfe5-4371-ae06-49ca4a0490fc-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 14:57:01.616387 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.616304 2568 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/271811e8-dfe5-4371-ae06-49ca4a0490fc-cluster-tls-config\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 14:57:01.616387 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.616314 2568 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/271811e8-dfe5-4371-ae06-49ca4a0490fc-secret-alertmanager-main-tls\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 14:57:01.616387 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.616323 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-52z92\" (UniqueName: \"kubernetes.io/projected/271811e8-dfe5-4371-ae06-49ca4a0490fc-kube-api-access-52z92\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 
14:57:01.616387 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.616332 2568 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/271811e8-dfe5-4371-ae06-49ca4a0490fc-config-volume\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 14:57:01.616387 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.616341 2568 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/271811e8-dfe5-4371-ae06-49ca4a0490fc-config-out\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 14:57:01.616387 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.616349 2568 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/271811e8-dfe5-4371-ae06-49ca4a0490fc-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 14:57:01.616387 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.616358 2568 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/271811e8-dfe5-4371-ae06-49ca4a0490fc-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 14:57:01.660203 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.660180 2568 generic.go:358] "Generic (PLEG): container finished" podID="271811e8-dfe5-4371-ae06-49ca4a0490fc" containerID="9d27ae32ff2581587a376dc48db2be54746ecda8c584522fdbecb9acd34224b3" exitCode=0 Apr 16 14:57:01.660203 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.660201 2568 generic.go:358] "Generic (PLEG): container finished" podID="271811e8-dfe5-4371-ae06-49ca4a0490fc" containerID="5742604d71c9783b23fbff65401867f71ca8d7eb22ebb94e150aa1fcd876c3e7" exitCode=0 Apr 16 14:57:01.660334 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.660249 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"271811e8-dfe5-4371-ae06-49ca4a0490fc","Type":"ContainerDied","Data":"9d27ae32ff2581587a376dc48db2be54746ecda8c584522fdbecb9acd34224b3"} Apr 16 14:57:01.660334 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.660277 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"271811e8-dfe5-4371-ae06-49ca4a0490fc","Type":"ContainerDied","Data":"5742604d71c9783b23fbff65401867f71ca8d7eb22ebb94e150aa1fcd876c3e7"} Apr 16 14:57:01.660334 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.660287 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"271811e8-dfe5-4371-ae06-49ca4a0490fc","Type":"ContainerDied","Data":"72a02f18c82220b3823ec638d0911580d3c86e5c63db06d05169078b3c583927"} Apr 16 14:57:01.660334 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.660289 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:57:01.660334 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.660303 2568 scope.go:117] "RemoveContainer" containerID="0f9eadadcc2e61f071a50bf27ad1bf00c816358bce6c78be686ed2e8d08c8d03" Apr 16 14:57:01.668827 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.668756 2568 scope.go:117] "RemoveContainer" containerID="9d27ae32ff2581587a376dc48db2be54746ecda8c584522fdbecb9acd34224b3" Apr 16 14:57:01.676828 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.676808 2568 scope.go:117] "RemoveContainer" containerID="cef4cc52e14de23c66a82adcc2179bdf56b3e50946067c7b8bda8d525c813eb2" Apr 16 14:57:01.683649 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.683635 2568 scope.go:117] "RemoveContainer" containerID="5742604d71c9783b23fbff65401867f71ca8d7eb22ebb94e150aa1fcd876c3e7" Apr 16 14:57:01.684011 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.683987 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:57:01.690334 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.690311 2568 scope.go:117] "RemoveContainer" containerID="aa9d02ccd057c2d464e60390ca535a53541d73f19f569257c816e2b6af0b0b61" Apr 16 14:57:01.691673 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.691655 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:57:01.696746 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.696728 2568 scope.go:117] "RemoveContainer" containerID="8c6910f32197bfc5024806bbd416eed1c2fdee7741c42c751c453cc9f58015ac" Apr 16 14:57:01.702869 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.702853 2568 scope.go:117] "RemoveContainer" containerID="1ccc56b3590e6dfc5f3a94f6884cead16cc9a50817d9827f7af4a8de8d847480" Apr 16 14:57:01.708964 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.708934 2568 scope.go:117] "RemoveContainer" containerID="0f9eadadcc2e61f071a50bf27ad1bf00c816358bce6c78be686ed2e8d08c8d03" Apr 16 14:57:01.709208 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:57:01.709190 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f9eadadcc2e61f071a50bf27ad1bf00c816358bce6c78be686ed2e8d08c8d03\": container with ID starting with 0f9eadadcc2e61f071a50bf27ad1bf00c816358bce6c78be686ed2e8d08c8d03 not found: ID does not exist" containerID="0f9eadadcc2e61f071a50bf27ad1bf00c816358bce6c78be686ed2e8d08c8d03" Apr 16 14:57:01.709266 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.709220 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f9eadadcc2e61f071a50bf27ad1bf00c816358bce6c78be686ed2e8d08c8d03"} err="failed to get container status \"0f9eadadcc2e61f071a50bf27ad1bf00c816358bce6c78be686ed2e8d08c8d03\": rpc error: code = NotFound desc = could not find container \"0f9eadadcc2e61f071a50bf27ad1bf00c816358bce6c78be686ed2e8d08c8d03\": container 
with ID starting with 0f9eadadcc2e61f071a50bf27ad1bf00c816358bce6c78be686ed2e8d08c8d03 not found: ID does not exist" Apr 16 14:57:01.709266 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.709238 2568 scope.go:117] "RemoveContainer" containerID="9d27ae32ff2581587a376dc48db2be54746ecda8c584522fdbecb9acd34224b3" Apr 16 14:57:01.709451 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:57:01.709437 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d27ae32ff2581587a376dc48db2be54746ecda8c584522fdbecb9acd34224b3\": container with ID starting with 9d27ae32ff2581587a376dc48db2be54746ecda8c584522fdbecb9acd34224b3 not found: ID does not exist" containerID="9d27ae32ff2581587a376dc48db2be54746ecda8c584522fdbecb9acd34224b3" Apr 16 14:57:01.709495 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.709456 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d27ae32ff2581587a376dc48db2be54746ecda8c584522fdbecb9acd34224b3"} err="failed to get container status \"9d27ae32ff2581587a376dc48db2be54746ecda8c584522fdbecb9acd34224b3\": rpc error: code = NotFound desc = could not find container \"9d27ae32ff2581587a376dc48db2be54746ecda8c584522fdbecb9acd34224b3\": container with ID starting with 9d27ae32ff2581587a376dc48db2be54746ecda8c584522fdbecb9acd34224b3 not found: ID does not exist" Apr 16 14:57:01.709495 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.709469 2568 scope.go:117] "RemoveContainer" containerID="cef4cc52e14de23c66a82adcc2179bdf56b3e50946067c7b8bda8d525c813eb2" Apr 16 14:57:01.709652 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:57:01.709638 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cef4cc52e14de23c66a82adcc2179bdf56b3e50946067c7b8bda8d525c813eb2\": container with ID starting with cef4cc52e14de23c66a82adcc2179bdf56b3e50946067c7b8bda8d525c813eb2 not found: ID does 
not exist" containerID="cef4cc52e14de23c66a82adcc2179bdf56b3e50946067c7b8bda8d525c813eb2" Apr 16 14:57:01.709691 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.709654 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cef4cc52e14de23c66a82adcc2179bdf56b3e50946067c7b8bda8d525c813eb2"} err="failed to get container status \"cef4cc52e14de23c66a82adcc2179bdf56b3e50946067c7b8bda8d525c813eb2\": rpc error: code = NotFound desc = could not find container \"cef4cc52e14de23c66a82adcc2179bdf56b3e50946067c7b8bda8d525c813eb2\": container with ID starting with cef4cc52e14de23c66a82adcc2179bdf56b3e50946067c7b8bda8d525c813eb2 not found: ID does not exist" Apr 16 14:57:01.709691 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.709665 2568 scope.go:117] "RemoveContainer" containerID="5742604d71c9783b23fbff65401867f71ca8d7eb22ebb94e150aa1fcd876c3e7" Apr 16 14:57:01.709880 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:57:01.709866 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5742604d71c9783b23fbff65401867f71ca8d7eb22ebb94e150aa1fcd876c3e7\": container with ID starting with 5742604d71c9783b23fbff65401867f71ca8d7eb22ebb94e150aa1fcd876c3e7 not found: ID does not exist" containerID="5742604d71c9783b23fbff65401867f71ca8d7eb22ebb94e150aa1fcd876c3e7" Apr 16 14:57:01.709988 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.709883 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5742604d71c9783b23fbff65401867f71ca8d7eb22ebb94e150aa1fcd876c3e7"} err="failed to get container status \"5742604d71c9783b23fbff65401867f71ca8d7eb22ebb94e150aa1fcd876c3e7\": rpc error: code = NotFound desc = could not find container \"5742604d71c9783b23fbff65401867f71ca8d7eb22ebb94e150aa1fcd876c3e7\": container with ID starting with 5742604d71c9783b23fbff65401867f71ca8d7eb22ebb94e150aa1fcd876c3e7 not found: ID does not exist" Apr 16 
14:57:01.709988 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.709911 2568 scope.go:117] "RemoveContainer" containerID="aa9d02ccd057c2d464e60390ca535a53541d73f19f569257c816e2b6af0b0b61" Apr 16 14:57:01.710103 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:57:01.710089 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa9d02ccd057c2d464e60390ca535a53541d73f19f569257c816e2b6af0b0b61\": container with ID starting with aa9d02ccd057c2d464e60390ca535a53541d73f19f569257c816e2b6af0b0b61 not found: ID does not exist" containerID="aa9d02ccd057c2d464e60390ca535a53541d73f19f569257c816e2b6af0b0b61" Apr 16 14:57:01.710154 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.710105 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa9d02ccd057c2d464e60390ca535a53541d73f19f569257c816e2b6af0b0b61"} err="failed to get container status \"aa9d02ccd057c2d464e60390ca535a53541d73f19f569257c816e2b6af0b0b61\": rpc error: code = NotFound desc = could not find container \"aa9d02ccd057c2d464e60390ca535a53541d73f19f569257c816e2b6af0b0b61\": container with ID starting with aa9d02ccd057c2d464e60390ca535a53541d73f19f569257c816e2b6af0b0b61 not found: ID does not exist" Apr 16 14:57:01.710154 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.710116 2568 scope.go:117] "RemoveContainer" containerID="8c6910f32197bfc5024806bbd416eed1c2fdee7741c42c751c453cc9f58015ac" Apr 16 14:57:01.710310 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:57:01.710296 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c6910f32197bfc5024806bbd416eed1c2fdee7741c42c751c453cc9f58015ac\": container with ID starting with 8c6910f32197bfc5024806bbd416eed1c2fdee7741c42c751c453cc9f58015ac not found: ID does not exist" containerID="8c6910f32197bfc5024806bbd416eed1c2fdee7741c42c751c453cc9f58015ac" Apr 16 14:57:01.710342 
ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.710314 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c6910f32197bfc5024806bbd416eed1c2fdee7741c42c751c453cc9f58015ac"} err="failed to get container status \"8c6910f32197bfc5024806bbd416eed1c2fdee7741c42c751c453cc9f58015ac\": rpc error: code = NotFound desc = could not find container \"8c6910f32197bfc5024806bbd416eed1c2fdee7741c42c751c453cc9f58015ac\": container with ID starting with 8c6910f32197bfc5024806bbd416eed1c2fdee7741c42c751c453cc9f58015ac not found: ID does not exist" Apr 16 14:57:01.710342 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.710325 2568 scope.go:117] "RemoveContainer" containerID="1ccc56b3590e6dfc5f3a94f6884cead16cc9a50817d9827f7af4a8de8d847480" Apr 16 14:57:01.710512 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:57:01.710498 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ccc56b3590e6dfc5f3a94f6884cead16cc9a50817d9827f7af4a8de8d847480\": container with ID starting with 1ccc56b3590e6dfc5f3a94f6884cead16cc9a50817d9827f7af4a8de8d847480 not found: ID does not exist" containerID="1ccc56b3590e6dfc5f3a94f6884cead16cc9a50817d9827f7af4a8de8d847480" Apr 16 14:57:01.710555 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.710514 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ccc56b3590e6dfc5f3a94f6884cead16cc9a50817d9827f7af4a8de8d847480"} err="failed to get container status \"1ccc56b3590e6dfc5f3a94f6884cead16cc9a50817d9827f7af4a8de8d847480\": rpc error: code = NotFound desc = could not find container \"1ccc56b3590e6dfc5f3a94f6884cead16cc9a50817d9827f7af4a8de8d847480\": container with ID starting with 1ccc56b3590e6dfc5f3a94f6884cead16cc9a50817d9827f7af4a8de8d847480 not found: ID does not exist" Apr 16 14:57:01.710555 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.710526 2568 scope.go:117] "RemoveContainer" 
containerID="0f9eadadcc2e61f071a50bf27ad1bf00c816358bce6c78be686ed2e8d08c8d03" Apr 16 14:57:01.710716 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.710701 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f9eadadcc2e61f071a50bf27ad1bf00c816358bce6c78be686ed2e8d08c8d03"} err="failed to get container status \"0f9eadadcc2e61f071a50bf27ad1bf00c816358bce6c78be686ed2e8d08c8d03\": rpc error: code = NotFound desc = could not find container \"0f9eadadcc2e61f071a50bf27ad1bf00c816358bce6c78be686ed2e8d08c8d03\": container with ID starting with 0f9eadadcc2e61f071a50bf27ad1bf00c816358bce6c78be686ed2e8d08c8d03 not found: ID does not exist" Apr 16 14:57:01.710754 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.710716 2568 scope.go:117] "RemoveContainer" containerID="9d27ae32ff2581587a376dc48db2be54746ecda8c584522fdbecb9acd34224b3" Apr 16 14:57:01.710949 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.710929 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d27ae32ff2581587a376dc48db2be54746ecda8c584522fdbecb9acd34224b3"} err="failed to get container status \"9d27ae32ff2581587a376dc48db2be54746ecda8c584522fdbecb9acd34224b3\": rpc error: code = NotFound desc = could not find container \"9d27ae32ff2581587a376dc48db2be54746ecda8c584522fdbecb9acd34224b3\": container with ID starting with 9d27ae32ff2581587a376dc48db2be54746ecda8c584522fdbecb9acd34224b3 not found: ID does not exist" Apr 16 14:57:01.711023 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.710951 2568 scope.go:117] "RemoveContainer" containerID="cef4cc52e14de23c66a82adcc2179bdf56b3e50946067c7b8bda8d525c813eb2" Apr 16 14:57:01.711167 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.711148 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cef4cc52e14de23c66a82adcc2179bdf56b3e50946067c7b8bda8d525c813eb2"} err="failed to get container status 
\"cef4cc52e14de23c66a82adcc2179bdf56b3e50946067c7b8bda8d525c813eb2\": rpc error: code = NotFound desc = could not find container \"cef4cc52e14de23c66a82adcc2179bdf56b3e50946067c7b8bda8d525c813eb2\": container with ID starting with cef4cc52e14de23c66a82adcc2179bdf56b3e50946067c7b8bda8d525c813eb2 not found: ID does not exist" Apr 16 14:57:01.711231 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.711168 2568 scope.go:117] "RemoveContainer" containerID="5742604d71c9783b23fbff65401867f71ca8d7eb22ebb94e150aa1fcd876c3e7" Apr 16 14:57:01.711377 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.711359 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5742604d71c9783b23fbff65401867f71ca8d7eb22ebb94e150aa1fcd876c3e7"} err="failed to get container status \"5742604d71c9783b23fbff65401867f71ca8d7eb22ebb94e150aa1fcd876c3e7\": rpc error: code = NotFound desc = could not find container \"5742604d71c9783b23fbff65401867f71ca8d7eb22ebb94e150aa1fcd876c3e7\": container with ID starting with 5742604d71c9783b23fbff65401867f71ca8d7eb22ebb94e150aa1fcd876c3e7 not found: ID does not exist" Apr 16 14:57:01.711426 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.711378 2568 scope.go:117] "RemoveContainer" containerID="aa9d02ccd057c2d464e60390ca535a53541d73f19f569257c816e2b6af0b0b61" Apr 16 14:57:01.711555 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.711540 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa9d02ccd057c2d464e60390ca535a53541d73f19f569257c816e2b6af0b0b61"} err="failed to get container status \"aa9d02ccd057c2d464e60390ca535a53541d73f19f569257c816e2b6af0b0b61\": rpc error: code = NotFound desc = could not find container \"aa9d02ccd057c2d464e60390ca535a53541d73f19f569257c816e2b6af0b0b61\": container with ID starting with aa9d02ccd057c2d464e60390ca535a53541d73f19f569257c816e2b6af0b0b61 not found: ID does not exist" Apr 16 14:57:01.711555 ip-10-0-140-83 
kubenswrapper[2568]: I0416 14:57:01.711555 2568 scope.go:117] "RemoveContainer" containerID="8c6910f32197bfc5024806bbd416eed1c2fdee7741c42c751c453cc9f58015ac" Apr 16 14:57:01.711712 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.711694 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c6910f32197bfc5024806bbd416eed1c2fdee7741c42c751c453cc9f58015ac"} err="failed to get container status \"8c6910f32197bfc5024806bbd416eed1c2fdee7741c42c751c453cc9f58015ac\": rpc error: code = NotFound desc = could not find container \"8c6910f32197bfc5024806bbd416eed1c2fdee7741c42c751c453cc9f58015ac\": container with ID starting with 8c6910f32197bfc5024806bbd416eed1c2fdee7741c42c751c453cc9f58015ac not found: ID does not exist" Apr 16 14:57:01.711768 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.711722 2568 scope.go:117] "RemoveContainer" containerID="1ccc56b3590e6dfc5f3a94f6884cead16cc9a50817d9827f7af4a8de8d847480" Apr 16 14:57:01.711930 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.711910 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ccc56b3590e6dfc5f3a94f6884cead16cc9a50817d9827f7af4a8de8d847480"} err="failed to get container status \"1ccc56b3590e6dfc5f3a94f6884cead16cc9a50817d9827f7af4a8de8d847480\": rpc error: code = NotFound desc = could not find container \"1ccc56b3590e6dfc5f3a94f6884cead16cc9a50817d9827f7af4a8de8d847480\": container with ID starting with 1ccc56b3590e6dfc5f3a94f6884cead16cc9a50817d9827f7af4a8de8d847480 not found: ID does not exist" Apr 16 14:57:01.714204 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.714186 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:57:01.714495 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.714482 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="271811e8-dfe5-4371-ae06-49ca4a0490fc" 
containerName="kube-rbac-proxy-metric" Apr 16 14:57:01.714603 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.714498 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="271811e8-dfe5-4371-ae06-49ca4a0490fc" containerName="kube-rbac-proxy-metric" Apr 16 14:57:01.714603 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.714511 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="271811e8-dfe5-4371-ae06-49ca4a0490fc" containerName="config-reloader" Apr 16 14:57:01.714603 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.714517 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="271811e8-dfe5-4371-ae06-49ca4a0490fc" containerName="config-reloader" Apr 16 14:57:01.714603 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.714528 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac" containerName="registry" Apr 16 14:57:01.714603 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.714533 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac" containerName="registry" Apr 16 14:57:01.714603 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.714540 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="271811e8-dfe5-4371-ae06-49ca4a0490fc" containerName="kube-rbac-proxy-web" Apr 16 14:57:01.714603 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.714545 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="271811e8-dfe5-4371-ae06-49ca4a0490fc" containerName="kube-rbac-proxy-web" Apr 16 14:57:01.714603 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.714555 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="271811e8-dfe5-4371-ae06-49ca4a0490fc" containerName="alertmanager" Apr 16 14:57:01.714603 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.714560 2568 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="271811e8-dfe5-4371-ae06-49ca4a0490fc" containerName="alertmanager" Apr 16 14:57:01.714603 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.714575 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a6ce2ed2-9325-417e-aacb-678b806ca490" containerName="console" Apr 16 14:57:01.714603 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.714583 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ce2ed2-9325-417e-aacb-678b806ca490" containerName="console" Apr 16 14:57:01.714603 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.714590 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="271811e8-dfe5-4371-ae06-49ca4a0490fc" containerName="init-config-reloader" Apr 16 14:57:01.714603 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.714595 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="271811e8-dfe5-4371-ae06-49ca4a0490fc" containerName="init-config-reloader" Apr 16 14:57:01.714603 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.714600 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="271811e8-dfe5-4371-ae06-49ca4a0490fc" containerName="kube-rbac-proxy" Apr 16 14:57:01.714603 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.714605 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="271811e8-dfe5-4371-ae06-49ca4a0490fc" containerName="kube-rbac-proxy" Apr 16 14:57:01.714603 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.714612 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="271811e8-dfe5-4371-ae06-49ca4a0490fc" containerName="prom-label-proxy" Apr 16 14:57:01.715351 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.714617 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="271811e8-dfe5-4371-ae06-49ca4a0490fc" containerName="prom-label-proxy" Apr 16 14:57:01.715351 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.714666 2568 memory_manager.go:356] "RemoveStaleState removing 
state" podUID="a6ce2ed2-9325-417e-aacb-678b806ca490" containerName="console" Apr 16 14:57:01.715351 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.714676 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="4e28e16c-5b14-4b9f-8bf3-e89602d7b7ac" containerName="registry" Apr 16 14:57:01.715351 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.714682 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="271811e8-dfe5-4371-ae06-49ca4a0490fc" containerName="alertmanager" Apr 16 14:57:01.715351 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.714688 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="271811e8-dfe5-4371-ae06-49ca4a0490fc" containerName="kube-rbac-proxy-metric" Apr 16 14:57:01.715351 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.714694 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="271811e8-dfe5-4371-ae06-49ca4a0490fc" containerName="kube-rbac-proxy-web" Apr 16 14:57:01.715351 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.714700 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="271811e8-dfe5-4371-ae06-49ca4a0490fc" containerName="config-reloader" Apr 16 14:57:01.715351 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.714706 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="271811e8-dfe5-4371-ae06-49ca4a0490fc" containerName="prom-label-proxy" Apr 16 14:57:01.715351 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.714712 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="271811e8-dfe5-4371-ae06-49ca4a0490fc" containerName="kube-rbac-proxy" Apr 16 14:57:01.718522 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.718506 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:57:01.722237 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.722220 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 14:57:01.722365 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.722238 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 14:57:01.722433 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.722382 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 14:57:01.722618 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.722598 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 14:57:01.722678 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.722616 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 14:57:01.722678 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.722627 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 14:57:01.722678 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.722673 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 14:57:01.722842 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.722601 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-mppqf\"" Apr 16 14:57:01.722842 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.722781 2568 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 14:57:01.727497 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.727478 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 14:57:01.733373 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.733357 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:57:01.817386 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.817335 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9c73095c-fd5b-46c8-9ad4-b0323de86a2b-config-volume\") pod \"alertmanager-main-0\" (UID: \"9c73095c-fd5b-46c8-9ad4-b0323de86a2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:57:01.817386 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.817368 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9c73095c-fd5b-46c8-9ad4-b0323de86a2b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"9c73095c-fd5b-46c8-9ad4-b0323de86a2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:57:01.817536 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.817417 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/9c73095c-fd5b-46c8-9ad4-b0323de86a2b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"9c73095c-fd5b-46c8-9ad4-b0323de86a2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:57:01.817536 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.817455 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c73095c-fd5b-46c8-9ad4-b0323de86a2b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"9c73095c-fd5b-46c8-9ad4-b0323de86a2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:57:01.817536 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.817521 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9c73095c-fd5b-46c8-9ad4-b0323de86a2b-config-out\") pod \"alertmanager-main-0\" (UID: \"9c73095c-fd5b-46c8-9ad4-b0323de86a2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:57:01.817679 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.817546 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9c73095c-fd5b-46c8-9ad4-b0323de86a2b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"9c73095c-fd5b-46c8-9ad4-b0323de86a2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:57:01.817679 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.817571 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdm8d\" (UniqueName: \"kubernetes.io/projected/9c73095c-fd5b-46c8-9ad4-b0323de86a2b-kube-api-access-pdm8d\") pod \"alertmanager-main-0\" (UID: \"9c73095c-fd5b-46c8-9ad4-b0323de86a2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:57:01.817679 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.817600 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/9c73095c-fd5b-46c8-9ad4-b0323de86a2b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"9c73095c-fd5b-46c8-9ad4-b0323de86a2b\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:57:01.817679 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.817662 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9c73095c-fd5b-46c8-9ad4-b0323de86a2b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"9c73095c-fd5b-46c8-9ad4-b0323de86a2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:57:01.817831 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.817691 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9c73095c-fd5b-46c8-9ad4-b0323de86a2b-web-config\") pod \"alertmanager-main-0\" (UID: \"9c73095c-fd5b-46c8-9ad4-b0323de86a2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:57:01.817831 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.817723 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/9c73095c-fd5b-46c8-9ad4-b0323de86a2b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"9c73095c-fd5b-46c8-9ad4-b0323de86a2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:57:01.817831 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.817748 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9c73095c-fd5b-46c8-9ad4-b0323de86a2b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"9c73095c-fd5b-46c8-9ad4-b0323de86a2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:57:01.817831 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.817791 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/9c73095c-fd5b-46c8-9ad4-b0323de86a2b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"9c73095c-fd5b-46c8-9ad4-b0323de86a2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:57:01.862295 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.862273 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="271811e8-dfe5-4371-ae06-49ca4a0490fc" path="/var/lib/kubelet/pods/271811e8-dfe5-4371-ae06-49ca4a0490fc/volumes" Apr 16 14:57:01.919112 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.919094 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c73095c-fd5b-46c8-9ad4-b0323de86a2b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"9c73095c-fd5b-46c8-9ad4-b0323de86a2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:57:01.919217 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.919128 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9c73095c-fd5b-46c8-9ad4-b0323de86a2b-config-out\") pod \"alertmanager-main-0\" (UID: \"9c73095c-fd5b-46c8-9ad4-b0323de86a2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:57:01.919217 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.919149 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9c73095c-fd5b-46c8-9ad4-b0323de86a2b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"9c73095c-fd5b-46c8-9ad4-b0323de86a2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:57:01.919217 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.919191 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pdm8d\" (UniqueName: 
\"kubernetes.io/projected/9c73095c-fd5b-46c8-9ad4-b0323de86a2b-kube-api-access-pdm8d\") pod \"alertmanager-main-0\" (UID: \"9c73095c-fd5b-46c8-9ad4-b0323de86a2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:57:01.919375 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.919218 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/9c73095c-fd5b-46c8-9ad4-b0323de86a2b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"9c73095c-fd5b-46c8-9ad4-b0323de86a2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:57:01.919375 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.919244 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9c73095c-fd5b-46c8-9ad4-b0323de86a2b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"9c73095c-fd5b-46c8-9ad4-b0323de86a2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:57:01.919375 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.919265 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9c73095c-fd5b-46c8-9ad4-b0323de86a2b-web-config\") pod \"alertmanager-main-0\" (UID: \"9c73095c-fd5b-46c8-9ad4-b0323de86a2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:57:01.919653 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.919625 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/9c73095c-fd5b-46c8-9ad4-b0323de86a2b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"9c73095c-fd5b-46c8-9ad4-b0323de86a2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:57:01.919721 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.919684 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9c73095c-fd5b-46c8-9ad4-b0323de86a2b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"9c73095c-fd5b-46c8-9ad4-b0323de86a2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:57:01.919778 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.919727 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9c73095c-fd5b-46c8-9ad4-b0323de86a2b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"9c73095c-fd5b-46c8-9ad4-b0323de86a2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:57:01.919778 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.919758 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9c73095c-fd5b-46c8-9ad4-b0323de86a2b-config-volume\") pod \"alertmanager-main-0\" (UID: \"9c73095c-fd5b-46c8-9ad4-b0323de86a2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:57:01.919778 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.919765 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/9c73095c-fd5b-46c8-9ad4-b0323de86a2b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"9c73095c-fd5b-46c8-9ad4-b0323de86a2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:57:01.919947 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.919787 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9c73095c-fd5b-46c8-9ad4-b0323de86a2b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"9c73095c-fd5b-46c8-9ad4-b0323de86a2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:57:01.919947 ip-10-0-140-83 
kubenswrapper[2568]: I0416 14:57:01.919842 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/9c73095c-fd5b-46c8-9ad4-b0323de86a2b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"9c73095c-fd5b-46c8-9ad4-b0323de86a2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:57:01.920822 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.920466 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c73095c-fd5b-46c8-9ad4-b0323de86a2b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"9c73095c-fd5b-46c8-9ad4-b0323de86a2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:57:01.921418 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.921389 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9c73095c-fd5b-46c8-9ad4-b0323de86a2b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"9c73095c-fd5b-46c8-9ad4-b0323de86a2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:57:01.922100 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.922060 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9c73095c-fd5b-46c8-9ad4-b0323de86a2b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"9c73095c-fd5b-46c8-9ad4-b0323de86a2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:57:01.922493 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.922446 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9c73095c-fd5b-46c8-9ad4-b0323de86a2b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"9c73095c-fd5b-46c8-9ad4-b0323de86a2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 
14:57:01.922642 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.922616 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9c73095c-fd5b-46c8-9ad4-b0323de86a2b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"9c73095c-fd5b-46c8-9ad4-b0323de86a2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:57:01.922715 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.922702 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9c73095c-fd5b-46c8-9ad4-b0323de86a2b-config-out\") pod \"alertmanager-main-0\" (UID: \"9c73095c-fd5b-46c8-9ad4-b0323de86a2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:57:01.922798 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.922781 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/9c73095c-fd5b-46c8-9ad4-b0323de86a2b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"9c73095c-fd5b-46c8-9ad4-b0323de86a2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:57:01.923122 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.923100 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9c73095c-fd5b-46c8-9ad4-b0323de86a2b-web-config\") pod \"alertmanager-main-0\" (UID: \"9c73095c-fd5b-46c8-9ad4-b0323de86a2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:57:01.923323 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.923307 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/9c73095c-fd5b-46c8-9ad4-b0323de86a2b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: 
\"9c73095c-fd5b-46c8-9ad4-b0323de86a2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:57:01.923675 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.923657 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9c73095c-fd5b-46c8-9ad4-b0323de86a2b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"9c73095c-fd5b-46c8-9ad4-b0323de86a2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:57:01.924167 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.924151 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9c73095c-fd5b-46c8-9ad4-b0323de86a2b-config-volume\") pod \"alertmanager-main-0\" (UID: \"9c73095c-fd5b-46c8-9ad4-b0323de86a2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:57:01.927427 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:01.927392 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdm8d\" (UniqueName: \"kubernetes.io/projected/9c73095c-fd5b-46c8-9ad4-b0323de86a2b-kube-api-access-pdm8d\") pod \"alertmanager-main-0\" (UID: \"9c73095c-fd5b-46c8-9ad4-b0323de86a2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:57:02.027946 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:02.027925 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:57:02.151551 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:02.151525 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:57:02.154471 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:57:02.154444 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c73095c_fd5b_46c8_9ad4_b0323de86a2b.slice/crio-75ea17df6dbbaf4749b565af1317fd70e1cf148307072cecb20b3d304c71fb7f WatchSource:0}: Error finding container 75ea17df6dbbaf4749b565af1317fd70e1cf148307072cecb20b3d304c71fb7f: Status 404 returned error can't find the container with id 75ea17df6dbbaf4749b565af1317fd70e1cf148307072cecb20b3d304c71fb7f Apr 16 14:57:02.665951 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:02.665867 2568 generic.go:358] "Generic (PLEG): container finished" podID="9c73095c-fd5b-46c8-9ad4-b0323de86a2b" containerID="1f505e166c962c8ebe40e210938e3b15c5e69602b10d9d44cb2ade5bbb339b5c" exitCode=0 Apr 16 14:57:02.666311 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:02.665930 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9c73095c-fd5b-46c8-9ad4-b0323de86a2b","Type":"ContainerDied","Data":"1f505e166c962c8ebe40e210938e3b15c5e69602b10d9d44cb2ade5bbb339b5c"} Apr 16 14:57:02.666311 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:02.666006 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9c73095c-fd5b-46c8-9ad4-b0323de86a2b","Type":"ContainerStarted","Data":"75ea17df6dbbaf4749b565af1317fd70e1cf148307072cecb20b3d304c71fb7f"} Apr 16 14:57:03.673139 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:03.673102 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"9c73095c-fd5b-46c8-9ad4-b0323de86a2b","Type":"ContainerStarted","Data":"94c1afae23e16d8844f757382619b1496a0c55500e52336dabdedb4d0a170eba"} Apr 16 14:57:03.673139 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:03.673141 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9c73095c-fd5b-46c8-9ad4-b0323de86a2b","Type":"ContainerStarted","Data":"4337258b30a441ff00ed6101f8db96ea63c0f5c360b249054039a938a803de04"} Apr 16 14:57:03.673564 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:03.673158 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9c73095c-fd5b-46c8-9ad4-b0323de86a2b","Type":"ContainerStarted","Data":"18f14fa8609d296efc05f2384bd33f832339941f9d3d1ad90d9c02ce2c5cfc91"} Apr 16 14:57:03.673564 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:03.673169 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9c73095c-fd5b-46c8-9ad4-b0323de86a2b","Type":"ContainerStarted","Data":"82f508d0a87e63a7d66a095b4b762d1e2da9ee7f45a71962218e6d47871c7437"} Apr 16 14:57:03.673564 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:03.673181 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9c73095c-fd5b-46c8-9ad4-b0323de86a2b","Type":"ContainerStarted","Data":"224f70b08c210ea1d744aae7cf556060aa7adeb31159ffce9554256deb25970a"} Apr 16 14:57:03.673564 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:03.673194 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9c73095c-fd5b-46c8-9ad4-b0323de86a2b","Type":"ContainerStarted","Data":"7cc74ff14b83f872f40120ba38b07da5ac0c77057742a30edb741d4a6845a6db"} Apr 16 14:57:03.701041 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:03.700997 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.700981905 podStartE2EDuration="2.700981905s" podCreationTimestamp="2026-04-16 14:57:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:57:03.698386235 +0000 UTC m=+262.462338345" watchObservedRunningTime="2026-04-16 14:57:03.700981905 +0000 UTC m=+262.464934015" Apr 16 14:57:04.196231 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:04.196195 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-5d77499596-rfjtq"] Apr 16 14:57:04.200751 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:04.200719 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5d77499596-rfjtq" Apr 16 14:57:04.204347 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:04.204296 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 16 14:57:04.204499 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:04.204330 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-zzdzg\"" Apr 16 14:57:04.204499 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:04.204485 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 16 14:57:04.204499 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:04.204358 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 16 14:57:04.204678 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:04.204384 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 16 14:57:04.204678 ip-10-0-140-83 
kubenswrapper[2568]: I0416 14:57:04.204397 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 16 14:57:04.208850 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:04.208823 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 16 14:57:04.213061 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:04.213038 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5d77499596-rfjtq"] Apr 16 14:57:04.339672 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:04.339637 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/834dd624-de28-453c-bec0-8f6bf779e2c4-federate-client-tls\") pod \"telemeter-client-5d77499596-rfjtq\" (UID: \"834dd624-de28-453c-bec0-8f6bf779e2c4\") " pod="openshift-monitoring/telemeter-client-5d77499596-rfjtq" Apr 16 14:57:04.339786 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:04.339682 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/834dd624-de28-453c-bec0-8f6bf779e2c4-secret-telemeter-client\") pod \"telemeter-client-5d77499596-rfjtq\" (UID: \"834dd624-de28-453c-bec0-8f6bf779e2c4\") " pod="openshift-monitoring/telemeter-client-5d77499596-rfjtq" Apr 16 14:57:04.339786 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:04.339701 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/834dd624-de28-453c-bec0-8f6bf779e2c4-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5d77499596-rfjtq\" (UID: \"834dd624-de28-453c-bec0-8f6bf779e2c4\") " pod="openshift-monitoring/telemeter-client-5d77499596-rfjtq" Apr 
16 14:57:04.339786 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:04.339721 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/834dd624-de28-453c-bec0-8f6bf779e2c4-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5d77499596-rfjtq\" (UID: \"834dd624-de28-453c-bec0-8f6bf779e2c4\") " pod="openshift-monitoring/telemeter-client-5d77499596-rfjtq" Apr 16 14:57:04.339786 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:04.339774 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/834dd624-de28-453c-bec0-8f6bf779e2c4-metrics-client-ca\") pod \"telemeter-client-5d77499596-rfjtq\" (UID: \"834dd624-de28-453c-bec0-8f6bf779e2c4\") " pod="openshift-monitoring/telemeter-client-5d77499596-rfjtq" Apr 16 14:57:04.339963 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:04.339817 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm6s2\" (UniqueName: \"kubernetes.io/projected/834dd624-de28-453c-bec0-8f6bf779e2c4-kube-api-access-jm6s2\") pod \"telemeter-client-5d77499596-rfjtq\" (UID: \"834dd624-de28-453c-bec0-8f6bf779e2c4\") " pod="openshift-monitoring/telemeter-client-5d77499596-rfjtq" Apr 16 14:57:04.339963 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:04.339872 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/834dd624-de28-453c-bec0-8f6bf779e2c4-serving-certs-ca-bundle\") pod \"telemeter-client-5d77499596-rfjtq\" (UID: \"834dd624-de28-453c-bec0-8f6bf779e2c4\") " pod="openshift-monitoring/telemeter-client-5d77499596-rfjtq" Apr 16 14:57:04.339963 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:04.339927 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/834dd624-de28-453c-bec0-8f6bf779e2c4-telemeter-client-tls\") pod \"telemeter-client-5d77499596-rfjtq\" (UID: \"834dd624-de28-453c-bec0-8f6bf779e2c4\") " pod="openshift-monitoring/telemeter-client-5d77499596-rfjtq" Apr 16 14:57:04.440537 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:04.440503 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/834dd624-de28-453c-bec0-8f6bf779e2c4-federate-client-tls\") pod \"telemeter-client-5d77499596-rfjtq\" (UID: \"834dd624-de28-453c-bec0-8f6bf779e2c4\") " pod="openshift-monitoring/telemeter-client-5d77499596-rfjtq" Apr 16 14:57:04.440660 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:04.440551 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/834dd624-de28-453c-bec0-8f6bf779e2c4-secret-telemeter-client\") pod \"telemeter-client-5d77499596-rfjtq\" (UID: \"834dd624-de28-453c-bec0-8f6bf779e2c4\") " pod="openshift-monitoring/telemeter-client-5d77499596-rfjtq" Apr 16 14:57:04.440660 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:04.440576 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/834dd624-de28-453c-bec0-8f6bf779e2c4-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5d77499596-rfjtq\" (UID: \"834dd624-de28-453c-bec0-8f6bf779e2c4\") " pod="openshift-monitoring/telemeter-client-5d77499596-rfjtq" Apr 16 14:57:04.440660 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:04.440606 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/834dd624-de28-453c-bec0-8f6bf779e2c4-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5d77499596-rfjtq\" (UID: \"834dd624-de28-453c-bec0-8f6bf779e2c4\") " pod="openshift-monitoring/telemeter-client-5d77499596-rfjtq" Apr 16 14:57:04.440660 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:04.440650 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/834dd624-de28-453c-bec0-8f6bf779e2c4-metrics-client-ca\") pod \"telemeter-client-5d77499596-rfjtq\" (UID: \"834dd624-de28-453c-bec0-8f6bf779e2c4\") " pod="openshift-monitoring/telemeter-client-5d77499596-rfjtq" Apr 16 14:57:04.440839 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:04.440709 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jm6s2\" (UniqueName: \"kubernetes.io/projected/834dd624-de28-453c-bec0-8f6bf779e2c4-kube-api-access-jm6s2\") pod \"telemeter-client-5d77499596-rfjtq\" (UID: \"834dd624-de28-453c-bec0-8f6bf779e2c4\") " pod="openshift-monitoring/telemeter-client-5d77499596-rfjtq" Apr 16 14:57:04.440839 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:04.440795 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/834dd624-de28-453c-bec0-8f6bf779e2c4-serving-certs-ca-bundle\") pod \"telemeter-client-5d77499596-rfjtq\" (UID: \"834dd624-de28-453c-bec0-8f6bf779e2c4\") " pod="openshift-monitoring/telemeter-client-5d77499596-rfjtq" Apr 16 14:57:04.441070 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:04.441043 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/834dd624-de28-453c-bec0-8f6bf779e2c4-telemeter-client-tls\") pod \"telemeter-client-5d77499596-rfjtq\" (UID: \"834dd624-de28-453c-bec0-8f6bf779e2c4\") " 
pod="openshift-monitoring/telemeter-client-5d77499596-rfjtq" Apr 16 14:57:04.441598 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:04.441559 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/834dd624-de28-453c-bec0-8f6bf779e2c4-serving-certs-ca-bundle\") pod \"telemeter-client-5d77499596-rfjtq\" (UID: \"834dd624-de28-453c-bec0-8f6bf779e2c4\") " pod="openshift-monitoring/telemeter-client-5d77499596-rfjtq" Apr 16 14:57:04.441696 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:04.441578 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/834dd624-de28-453c-bec0-8f6bf779e2c4-metrics-client-ca\") pod \"telemeter-client-5d77499596-rfjtq\" (UID: \"834dd624-de28-453c-bec0-8f6bf779e2c4\") " pod="openshift-monitoring/telemeter-client-5d77499596-rfjtq" Apr 16 14:57:04.441696 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:04.441629 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/834dd624-de28-453c-bec0-8f6bf779e2c4-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5d77499596-rfjtq\" (UID: \"834dd624-de28-453c-bec0-8f6bf779e2c4\") " pod="openshift-monitoring/telemeter-client-5d77499596-rfjtq" Apr 16 14:57:04.443970 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:04.443948 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/834dd624-de28-453c-bec0-8f6bf779e2c4-secret-telemeter-client\") pod \"telemeter-client-5d77499596-rfjtq\" (UID: \"834dd624-de28-453c-bec0-8f6bf779e2c4\") " pod="openshift-monitoring/telemeter-client-5d77499596-rfjtq" Apr 16 14:57:04.444094 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:04.444072 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: 
\"kubernetes.io/secret/834dd624-de28-453c-bec0-8f6bf779e2c4-federate-client-tls\") pod \"telemeter-client-5d77499596-rfjtq\" (UID: \"834dd624-de28-453c-bec0-8f6bf779e2c4\") " pod="openshift-monitoring/telemeter-client-5d77499596-rfjtq" Apr 16 14:57:04.444169 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:04.444144 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/834dd624-de28-453c-bec0-8f6bf779e2c4-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5d77499596-rfjtq\" (UID: \"834dd624-de28-453c-bec0-8f6bf779e2c4\") " pod="openshift-monitoring/telemeter-client-5d77499596-rfjtq" Apr 16 14:57:04.444217 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:04.444172 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/834dd624-de28-453c-bec0-8f6bf779e2c4-telemeter-client-tls\") pod \"telemeter-client-5d77499596-rfjtq\" (UID: \"834dd624-de28-453c-bec0-8f6bf779e2c4\") " pod="openshift-monitoring/telemeter-client-5d77499596-rfjtq" Apr 16 14:57:04.448861 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:04.448795 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm6s2\" (UniqueName: \"kubernetes.io/projected/834dd624-de28-453c-bec0-8f6bf779e2c4-kube-api-access-jm6s2\") pod \"telemeter-client-5d77499596-rfjtq\" (UID: \"834dd624-de28-453c-bec0-8f6bf779e2c4\") " pod="openshift-monitoring/telemeter-client-5d77499596-rfjtq" Apr 16 14:57:04.514035 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:04.514006 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-5d77499596-rfjtq" Apr 16 14:57:04.633839 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:04.633767 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5d77499596-rfjtq"] Apr 16 14:57:04.637070 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:57:04.637042 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod834dd624_de28_453c_bec0_8f6bf779e2c4.slice/crio-ee69c1120d4a2dd66422461d40d75811a8e0f3beeaec8d452372513d66ac5a4f WatchSource:0}: Error finding container ee69c1120d4a2dd66422461d40d75811a8e0f3beeaec8d452372513d66ac5a4f: Status 404 returned error can't find the container with id ee69c1120d4a2dd66422461d40d75811a8e0f3beeaec8d452372513d66ac5a4f Apr 16 14:57:04.676961 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:04.676930 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5d77499596-rfjtq" event={"ID":"834dd624-de28-453c-bec0-8f6bf779e2c4","Type":"ContainerStarted","Data":"ee69c1120d4a2dd66422461d40d75811a8e0f3beeaec8d452372513d66ac5a4f"} Apr 16 14:57:06.685263 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:06.685224 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5d77499596-rfjtq" event={"ID":"834dd624-de28-453c-bec0-8f6bf779e2c4","Type":"ContainerStarted","Data":"f63ceb2445397e56599827ce91292dd800b391e5f01fc99ec5e0913765e747cd"} Apr 16 14:57:06.685675 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:06.685268 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5d77499596-rfjtq" event={"ID":"834dd624-de28-453c-bec0-8f6bf779e2c4","Type":"ContainerStarted","Data":"2d2f132dda56a485365b22f7f4d030f4e7305e6d913c5d92058ab93cdeb5639a"} Apr 16 14:57:06.685675 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:06.685285 2568 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5d77499596-rfjtq" event={"ID":"834dd624-de28-453c-bec0-8f6bf779e2c4","Type":"ContainerStarted","Data":"ae80bf96c8ae2904204bc41477001b90fea65c574f7b2b588678c03cde48df34"} Apr 16 14:57:06.708048 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:06.707991 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-5d77499596-rfjtq" podStartSLOduration=1.379904946 podStartE2EDuration="2.707971495s" podCreationTimestamp="2026-04-16 14:57:04 +0000 UTC" firstStartedPulling="2026-04-16 14:57:04.638882285 +0000 UTC m=+263.402834374" lastFinishedPulling="2026-04-16 14:57:05.966948834 +0000 UTC m=+264.730900923" observedRunningTime="2026-04-16 14:57:06.707300075 +0000 UTC m=+265.471252242" watchObservedRunningTime="2026-04-16 14:57:06.707971495 +0000 UTC m=+265.471923607" Apr 16 14:57:07.752956 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:07.752925 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-76c9cc77c7-mghd6"] Apr 16 14:57:07.755387 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:07.755366 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-76c9cc77c7-mghd6" Apr 16 14:57:07.772801 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:07.772776 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76c9cc77c7-mghd6"] Apr 16 14:57:07.871712 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:07.871692 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e529091f-ecab-4e5f-bc3f-7432720ba8e4-trusted-ca-bundle\") pod \"console-76c9cc77c7-mghd6\" (UID: \"e529091f-ecab-4e5f-bc3f-7432720ba8e4\") " pod="openshift-console/console-76c9cc77c7-mghd6" Apr 16 14:57:07.871801 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:07.871728 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e529091f-ecab-4e5f-bc3f-7432720ba8e4-console-serving-cert\") pod \"console-76c9cc77c7-mghd6\" (UID: \"e529091f-ecab-4e5f-bc3f-7432720ba8e4\") " pod="openshift-console/console-76c9cc77c7-mghd6" Apr 16 14:57:07.871801 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:07.871746 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55rlx\" (UniqueName: \"kubernetes.io/projected/e529091f-ecab-4e5f-bc3f-7432720ba8e4-kube-api-access-55rlx\") pod \"console-76c9cc77c7-mghd6\" (UID: \"e529091f-ecab-4e5f-bc3f-7432720ba8e4\") " pod="openshift-console/console-76c9cc77c7-mghd6" Apr 16 14:57:07.871801 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:07.871768 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e529091f-ecab-4e5f-bc3f-7432720ba8e4-service-ca\") pod \"console-76c9cc77c7-mghd6\" (UID: \"e529091f-ecab-4e5f-bc3f-7432720ba8e4\") " pod="openshift-console/console-76c9cc77c7-mghd6" Apr 16 
14:57:07.871906 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:07.871813 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e529091f-ecab-4e5f-bc3f-7432720ba8e4-console-config\") pod \"console-76c9cc77c7-mghd6\" (UID: \"e529091f-ecab-4e5f-bc3f-7432720ba8e4\") " pod="openshift-console/console-76c9cc77c7-mghd6" Apr 16 14:57:07.871906 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:07.871828 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e529091f-ecab-4e5f-bc3f-7432720ba8e4-oauth-serving-cert\") pod \"console-76c9cc77c7-mghd6\" (UID: \"e529091f-ecab-4e5f-bc3f-7432720ba8e4\") " pod="openshift-console/console-76c9cc77c7-mghd6" Apr 16 14:57:07.871906 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:07.871847 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e529091f-ecab-4e5f-bc3f-7432720ba8e4-console-oauth-config\") pod \"console-76c9cc77c7-mghd6\" (UID: \"e529091f-ecab-4e5f-bc3f-7432720ba8e4\") " pod="openshift-console/console-76c9cc77c7-mghd6" Apr 16 14:57:07.972657 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:07.972622 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e529091f-ecab-4e5f-bc3f-7432720ba8e4-console-serving-cert\") pod \"console-76c9cc77c7-mghd6\" (UID: \"e529091f-ecab-4e5f-bc3f-7432720ba8e4\") " pod="openshift-console/console-76c9cc77c7-mghd6" Apr 16 14:57:07.972657 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:07.972659 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-55rlx\" (UniqueName: \"kubernetes.io/projected/e529091f-ecab-4e5f-bc3f-7432720ba8e4-kube-api-access-55rlx\") pod 
\"console-76c9cc77c7-mghd6\" (UID: \"e529091f-ecab-4e5f-bc3f-7432720ba8e4\") " pod="openshift-console/console-76c9cc77c7-mghd6" Apr 16 14:57:07.972789 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:07.972684 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e529091f-ecab-4e5f-bc3f-7432720ba8e4-service-ca\") pod \"console-76c9cc77c7-mghd6\" (UID: \"e529091f-ecab-4e5f-bc3f-7432720ba8e4\") " pod="openshift-console/console-76c9cc77c7-mghd6" Apr 16 14:57:07.972822 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:07.972805 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e529091f-ecab-4e5f-bc3f-7432720ba8e4-console-config\") pod \"console-76c9cc77c7-mghd6\" (UID: \"e529091f-ecab-4e5f-bc3f-7432720ba8e4\") " pod="openshift-console/console-76c9cc77c7-mghd6" Apr 16 14:57:07.972856 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:07.972844 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e529091f-ecab-4e5f-bc3f-7432720ba8e4-oauth-serving-cert\") pod \"console-76c9cc77c7-mghd6\" (UID: \"e529091f-ecab-4e5f-bc3f-7432720ba8e4\") " pod="openshift-console/console-76c9cc77c7-mghd6" Apr 16 14:57:07.972940 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:07.972879 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e529091f-ecab-4e5f-bc3f-7432720ba8e4-console-oauth-config\") pod \"console-76c9cc77c7-mghd6\" (UID: \"e529091f-ecab-4e5f-bc3f-7432720ba8e4\") " pod="openshift-console/console-76c9cc77c7-mghd6" Apr 16 14:57:07.973002 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:07.972951 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/e529091f-ecab-4e5f-bc3f-7432720ba8e4-trusted-ca-bundle\") pod \"console-76c9cc77c7-mghd6\" (UID: \"e529091f-ecab-4e5f-bc3f-7432720ba8e4\") " pod="openshift-console/console-76c9cc77c7-mghd6" Apr 16 14:57:07.973401 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:07.973378 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e529091f-ecab-4e5f-bc3f-7432720ba8e4-service-ca\") pod \"console-76c9cc77c7-mghd6\" (UID: \"e529091f-ecab-4e5f-bc3f-7432720ba8e4\") " pod="openshift-console/console-76c9cc77c7-mghd6" Apr 16 14:57:07.973491 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:07.973453 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e529091f-ecab-4e5f-bc3f-7432720ba8e4-console-config\") pod \"console-76c9cc77c7-mghd6\" (UID: \"e529091f-ecab-4e5f-bc3f-7432720ba8e4\") " pod="openshift-console/console-76c9cc77c7-mghd6" Apr 16 14:57:07.973671 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:07.973647 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e529091f-ecab-4e5f-bc3f-7432720ba8e4-oauth-serving-cert\") pod \"console-76c9cc77c7-mghd6\" (UID: \"e529091f-ecab-4e5f-bc3f-7432720ba8e4\") " pod="openshift-console/console-76c9cc77c7-mghd6" Apr 16 14:57:07.973734 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:07.973719 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e529091f-ecab-4e5f-bc3f-7432720ba8e4-trusted-ca-bundle\") pod \"console-76c9cc77c7-mghd6\" (UID: \"e529091f-ecab-4e5f-bc3f-7432720ba8e4\") " pod="openshift-console/console-76c9cc77c7-mghd6" Apr 16 14:57:07.975230 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:07.975204 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/e529091f-ecab-4e5f-bc3f-7432720ba8e4-console-oauth-config\") pod \"console-76c9cc77c7-mghd6\" (UID: \"e529091f-ecab-4e5f-bc3f-7432720ba8e4\") " pod="openshift-console/console-76c9cc77c7-mghd6" Apr 16 14:57:07.975295 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:07.975281 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e529091f-ecab-4e5f-bc3f-7432720ba8e4-console-serving-cert\") pod \"console-76c9cc77c7-mghd6\" (UID: \"e529091f-ecab-4e5f-bc3f-7432720ba8e4\") " pod="openshift-console/console-76c9cc77c7-mghd6" Apr 16 14:57:07.984121 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:07.984096 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-55rlx\" (UniqueName: \"kubernetes.io/projected/e529091f-ecab-4e5f-bc3f-7432720ba8e4-kube-api-access-55rlx\") pod \"console-76c9cc77c7-mghd6\" (UID: \"e529091f-ecab-4e5f-bc3f-7432720ba8e4\") " pod="openshift-console/console-76c9cc77c7-mghd6" Apr 16 14:57:08.064826 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:08.064778 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-76c9cc77c7-mghd6" Apr 16 14:57:08.185744 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:08.185720 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76c9cc77c7-mghd6"] Apr 16 14:57:08.188680 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:57:08.188657 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode529091f_ecab_4e5f_bc3f_7432720ba8e4.slice/crio-6bee6b28b066c322ae3ab6b1340784325b9c333d2f4d174ff43c83bd91a4cfc6 WatchSource:0}: Error finding container 6bee6b28b066c322ae3ab6b1340784325b9c333d2f4d174ff43c83bd91a4cfc6: Status 404 returned error can't find the container with id 6bee6b28b066c322ae3ab6b1340784325b9c333d2f4d174ff43c83bd91a4cfc6 Apr 16 14:57:08.694972 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:08.694928 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76c9cc77c7-mghd6" event={"ID":"e529091f-ecab-4e5f-bc3f-7432720ba8e4","Type":"ContainerStarted","Data":"42eefe754f330ea9d6d86e5c93d92a47f15ad9df8d1b1244399298f9b4e61c52"} Apr 16 14:57:08.694972 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:08.694975 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76c9cc77c7-mghd6" event={"ID":"e529091f-ecab-4e5f-bc3f-7432720ba8e4","Type":"ContainerStarted","Data":"6bee6b28b066c322ae3ab6b1340784325b9c333d2f4d174ff43c83bd91a4cfc6"} Apr 16 14:57:08.711729 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:08.711685 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-76c9cc77c7-mghd6" podStartSLOduration=1.711671139 podStartE2EDuration="1.711671139s" podCreationTimestamp="2026-04-16 14:57:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:57:08.7092964 +0000 UTC m=+267.473248511" 
watchObservedRunningTime="2026-04-16 14:57:08.711671139 +0000 UTC m=+267.475623251" Apr 16 14:57:18.065510 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:18.065470 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-76c9cc77c7-mghd6" Apr 16 14:57:18.065510 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:18.065510 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-76c9cc77c7-mghd6" Apr 16 14:57:18.070201 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:18.070179 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-76c9cc77c7-mghd6" Apr 16 14:57:18.731926 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:18.731843 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-76c9cc77c7-mghd6" Apr 16 14:57:18.771838 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:18.771811 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7bbb454cc-k9vzf"] Apr 16 14:57:41.744201 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:41.744176 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bff4l_4e495dc8-7e20-4024-b33a-c9b6c6c1291f/ovn-acl-logging/0.log" Apr 16 14:57:41.744201 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:41.744190 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bff4l_4e495dc8-7e20-4024-b33a-c9b6c6c1291f/ovn-acl-logging/0.log" Apr 16 14:57:41.747800 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:41.747782 2568 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 14:57:43.791021 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:43.790953 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7bbb454cc-k9vzf" podUID="f91cc132-1c05-4c85-a902-2b8622573dd1" 
containerName="console" containerID="cri-o://73d0e2b85bc614be6f669464c519c8f2f2092bfa6504378fd209cfe9abb72001" gracePeriod=15 Apr 16 14:57:44.030106 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:44.030085 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7bbb454cc-k9vzf_f91cc132-1c05-4c85-a902-2b8622573dd1/console/0.log" Apr 16 14:57:44.030196 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:44.030143 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7bbb454cc-k9vzf" Apr 16 14:57:44.225201 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:44.225180 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f91cc132-1c05-4c85-a902-2b8622573dd1-oauth-serving-cert\") pod \"f91cc132-1c05-4c85-a902-2b8622573dd1\" (UID: \"f91cc132-1c05-4c85-a902-2b8622573dd1\") " Apr 16 14:57:44.225340 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:44.225217 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f91cc132-1c05-4c85-a902-2b8622573dd1-console-serving-cert\") pod \"f91cc132-1c05-4c85-a902-2b8622573dd1\" (UID: \"f91cc132-1c05-4c85-a902-2b8622573dd1\") " Apr 16 14:57:44.225340 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:44.225252 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f91cc132-1c05-4c85-a902-2b8622573dd1-service-ca\") pod \"f91cc132-1c05-4c85-a902-2b8622573dd1\" (UID: \"f91cc132-1c05-4c85-a902-2b8622573dd1\") " Apr 16 14:57:44.225340 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:44.225273 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f91cc132-1c05-4c85-a902-2b8622573dd1-trusted-ca-bundle\") pod 
\"f91cc132-1c05-4c85-a902-2b8622573dd1\" (UID: \"f91cc132-1c05-4c85-a902-2b8622573dd1\") " Apr 16 14:57:44.225496 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:44.225351 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjqmv\" (UniqueName: \"kubernetes.io/projected/f91cc132-1c05-4c85-a902-2b8622573dd1-kube-api-access-tjqmv\") pod \"f91cc132-1c05-4c85-a902-2b8622573dd1\" (UID: \"f91cc132-1c05-4c85-a902-2b8622573dd1\") " Apr 16 14:57:44.225496 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:44.225404 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f91cc132-1c05-4c85-a902-2b8622573dd1-console-oauth-config\") pod \"f91cc132-1c05-4c85-a902-2b8622573dd1\" (UID: \"f91cc132-1c05-4c85-a902-2b8622573dd1\") " Apr 16 14:57:44.225599 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:44.225515 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f91cc132-1c05-4c85-a902-2b8622573dd1-console-config\") pod \"f91cc132-1c05-4c85-a902-2b8622573dd1\" (UID: \"f91cc132-1c05-4c85-a902-2b8622573dd1\") " Apr 16 14:57:44.225686 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:44.225659 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f91cc132-1c05-4c85-a902-2b8622573dd1-service-ca" (OuterVolumeSpecName: "service-ca") pod "f91cc132-1c05-4c85-a902-2b8622573dd1" (UID: "f91cc132-1c05-4c85-a902-2b8622573dd1"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:57:44.225747 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:44.225655 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f91cc132-1c05-4c85-a902-2b8622573dd1-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f91cc132-1c05-4c85-a902-2b8622573dd1" (UID: "f91cc132-1c05-4c85-a902-2b8622573dd1"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:57:44.225795 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:44.225735 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f91cc132-1c05-4c85-a902-2b8622573dd1-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f91cc132-1c05-4c85-a902-2b8622573dd1" (UID: "f91cc132-1c05-4c85-a902-2b8622573dd1"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:57:44.226048 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:44.226025 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f91cc132-1c05-4c85-a902-2b8622573dd1-console-config" (OuterVolumeSpecName: "console-config") pod "f91cc132-1c05-4c85-a902-2b8622573dd1" (UID: "f91cc132-1c05-4c85-a902-2b8622573dd1"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:57:44.226104 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:44.226057 2568 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f91cc132-1c05-4c85-a902-2b8622573dd1-service-ca\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 14:57:44.226104 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:44.226093 2568 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f91cc132-1c05-4c85-a902-2b8622573dd1-trusted-ca-bundle\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 14:57:44.226175 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:44.226109 2568 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f91cc132-1c05-4c85-a902-2b8622573dd1-oauth-serving-cert\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 14:57:44.227851 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:44.227828 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f91cc132-1c05-4c85-a902-2b8622573dd1-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f91cc132-1c05-4c85-a902-2b8622573dd1" (UID: "f91cc132-1c05-4c85-a902-2b8622573dd1"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:57:44.227988 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:44.227927 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f91cc132-1c05-4c85-a902-2b8622573dd1-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f91cc132-1c05-4c85-a902-2b8622573dd1" (UID: "f91cc132-1c05-4c85-a902-2b8622573dd1"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:57:44.228134 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:44.228113 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f91cc132-1c05-4c85-a902-2b8622573dd1-kube-api-access-tjqmv" (OuterVolumeSpecName: "kube-api-access-tjqmv") pod "f91cc132-1c05-4c85-a902-2b8622573dd1" (UID: "f91cc132-1c05-4c85-a902-2b8622573dd1"). InnerVolumeSpecName "kube-api-access-tjqmv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:57:44.326871 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:44.326847 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tjqmv\" (UniqueName: \"kubernetes.io/projected/f91cc132-1c05-4c85-a902-2b8622573dd1-kube-api-access-tjqmv\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 14:57:44.326871 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:44.326870 2568 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f91cc132-1c05-4c85-a902-2b8622573dd1-console-oauth-config\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 14:57:44.327015 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:44.326882 2568 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f91cc132-1c05-4c85-a902-2b8622573dd1-console-config\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 14:57:44.327015 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:44.326911 2568 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f91cc132-1c05-4c85-a902-2b8622573dd1-console-serving-cert\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 14:57:44.810998 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:44.810976 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-7bbb454cc-k9vzf_f91cc132-1c05-4c85-a902-2b8622573dd1/console/0.log" Apr 16 14:57:44.811405 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:44.811016 2568 generic.go:358] "Generic (PLEG): container finished" podID="f91cc132-1c05-4c85-a902-2b8622573dd1" containerID="73d0e2b85bc614be6f669464c519c8f2f2092bfa6504378fd209cfe9abb72001" exitCode=2 Apr 16 14:57:44.811405 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:44.811108 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bbb454cc-k9vzf" event={"ID":"f91cc132-1c05-4c85-a902-2b8622573dd1","Type":"ContainerDied","Data":"73d0e2b85bc614be6f669464c519c8f2f2092bfa6504378fd209cfe9abb72001"} Apr 16 14:57:44.811405 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:44.811129 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7bbb454cc-k9vzf" Apr 16 14:57:44.811405 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:44.811167 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bbb454cc-k9vzf" event={"ID":"f91cc132-1c05-4c85-a902-2b8622573dd1","Type":"ContainerDied","Data":"9fb283b08261136b98b30e9e4f935c7e9c390a4b1af3d2e8315d54e5ec7df137"} Apr 16 14:57:44.811405 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:44.811190 2568 scope.go:117] "RemoveContainer" containerID="73d0e2b85bc614be6f669464c519c8f2f2092bfa6504378fd209cfe9abb72001" Apr 16 14:57:44.820843 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:44.820816 2568 scope.go:117] "RemoveContainer" containerID="73d0e2b85bc614be6f669464c519c8f2f2092bfa6504378fd209cfe9abb72001" Apr 16 14:57:44.821133 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:57:44.821113 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73d0e2b85bc614be6f669464c519c8f2f2092bfa6504378fd209cfe9abb72001\": container with ID starting with 
73d0e2b85bc614be6f669464c519c8f2f2092bfa6504378fd209cfe9abb72001 not found: ID does not exist" containerID="73d0e2b85bc614be6f669464c519c8f2f2092bfa6504378fd209cfe9abb72001" Apr 16 14:57:44.821194 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:44.821141 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73d0e2b85bc614be6f669464c519c8f2f2092bfa6504378fd209cfe9abb72001"} err="failed to get container status \"73d0e2b85bc614be6f669464c519c8f2f2092bfa6504378fd209cfe9abb72001\": rpc error: code = NotFound desc = could not find container \"73d0e2b85bc614be6f669464c519c8f2f2092bfa6504378fd209cfe9abb72001\": container with ID starting with 73d0e2b85bc614be6f669464c519c8f2f2092bfa6504378fd209cfe9abb72001 not found: ID does not exist" Apr 16 14:57:44.832352 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:44.832294 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7bbb454cc-k9vzf"] Apr 16 14:57:44.834674 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:44.834649 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7bbb454cc-k9vzf"] Apr 16 14:57:45.867807 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:57:45.867778 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f91cc132-1c05-4c85-a902-2b8622573dd1" path="/var/lib/kubelet/pods/f91cc132-1c05-4c85-a902-2b8622573dd1/volumes" Apr 16 14:58:25.242577 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:58:25.242547 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-84769dbb7f-vqdpj"] Apr 16 14:58:25.243007 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:58:25.242861 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f91cc132-1c05-4c85-a902-2b8622573dd1" containerName="console" Apr 16 14:58:25.243007 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:58:25.242873 2568 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f91cc132-1c05-4c85-a902-2b8622573dd1" containerName="console" Apr 16 14:58:25.243007 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:58:25.242950 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="f91cc132-1c05-4c85-a902-2b8622573dd1" containerName="console" Apr 16 14:58:25.247105 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:58:25.247086 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-84769dbb7f-vqdpj" Apr 16 14:58:25.254145 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:58:25.253845 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84769dbb7f-vqdpj"] Apr 16 14:58:25.304147 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:58:25.304128 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/884e3635-5dae-45b5-bb62-e9806440468c-service-ca\") pod \"console-84769dbb7f-vqdpj\" (UID: \"884e3635-5dae-45b5-bb62-e9806440468c\") " pod="openshift-console/console-84769dbb7f-vqdpj" Apr 16 14:58:25.304244 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:58:25.304155 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/884e3635-5dae-45b5-bb62-e9806440468c-trusted-ca-bundle\") pod \"console-84769dbb7f-vqdpj\" (UID: \"884e3635-5dae-45b5-bb62-e9806440468c\") " pod="openshift-console/console-84769dbb7f-vqdpj" Apr 16 14:58:25.304244 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:58:25.304178 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/884e3635-5dae-45b5-bb62-e9806440468c-console-config\") pod \"console-84769dbb7f-vqdpj\" (UID: \"884e3635-5dae-45b5-bb62-e9806440468c\") " pod="openshift-console/console-84769dbb7f-vqdpj" Apr 16 14:58:25.304244 ip-10-0-140-83 
kubenswrapper[2568]: I0416 14:58:25.304225 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/884e3635-5dae-45b5-bb62-e9806440468c-console-oauth-config\") pod \"console-84769dbb7f-vqdpj\" (UID: \"884e3635-5dae-45b5-bb62-e9806440468c\") " pod="openshift-console/console-84769dbb7f-vqdpj" Apr 16 14:58:25.304343 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:58:25.304251 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vzlw\" (UniqueName: \"kubernetes.io/projected/884e3635-5dae-45b5-bb62-e9806440468c-kube-api-access-8vzlw\") pod \"console-84769dbb7f-vqdpj\" (UID: \"884e3635-5dae-45b5-bb62-e9806440468c\") " pod="openshift-console/console-84769dbb7f-vqdpj" Apr 16 14:58:25.304343 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:58:25.304326 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/884e3635-5dae-45b5-bb62-e9806440468c-oauth-serving-cert\") pod \"console-84769dbb7f-vqdpj\" (UID: \"884e3635-5dae-45b5-bb62-e9806440468c\") " pod="openshift-console/console-84769dbb7f-vqdpj" Apr 16 14:58:25.304445 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:58:25.304356 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/884e3635-5dae-45b5-bb62-e9806440468c-console-serving-cert\") pod \"console-84769dbb7f-vqdpj\" (UID: \"884e3635-5dae-45b5-bb62-e9806440468c\") " pod="openshift-console/console-84769dbb7f-vqdpj" Apr 16 14:58:25.405562 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:58:25.405534 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/884e3635-5dae-45b5-bb62-e9806440468c-console-config\") pod 
\"console-84769dbb7f-vqdpj\" (UID: \"884e3635-5dae-45b5-bb62-e9806440468c\") " pod="openshift-console/console-84769dbb7f-vqdpj" Apr 16 14:58:25.405683 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:58:25.405568 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/884e3635-5dae-45b5-bb62-e9806440468c-console-oauth-config\") pod \"console-84769dbb7f-vqdpj\" (UID: \"884e3635-5dae-45b5-bb62-e9806440468c\") " pod="openshift-console/console-84769dbb7f-vqdpj" Apr 16 14:58:25.405683 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:58:25.405592 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8vzlw\" (UniqueName: \"kubernetes.io/projected/884e3635-5dae-45b5-bb62-e9806440468c-kube-api-access-8vzlw\") pod \"console-84769dbb7f-vqdpj\" (UID: \"884e3635-5dae-45b5-bb62-e9806440468c\") " pod="openshift-console/console-84769dbb7f-vqdpj" Apr 16 14:58:25.405683 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:58:25.405627 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/884e3635-5dae-45b5-bb62-e9806440468c-oauth-serving-cert\") pod \"console-84769dbb7f-vqdpj\" (UID: \"884e3635-5dae-45b5-bb62-e9806440468c\") " pod="openshift-console/console-84769dbb7f-vqdpj" Apr 16 14:58:25.405683 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:58:25.405658 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/884e3635-5dae-45b5-bb62-e9806440468c-console-serving-cert\") pod \"console-84769dbb7f-vqdpj\" (UID: \"884e3635-5dae-45b5-bb62-e9806440468c\") " pod="openshift-console/console-84769dbb7f-vqdpj" Apr 16 14:58:25.405923 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:58:25.405687 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/884e3635-5dae-45b5-bb62-e9806440468c-service-ca\") pod \"console-84769dbb7f-vqdpj\" (UID: \"884e3635-5dae-45b5-bb62-e9806440468c\") " pod="openshift-console/console-84769dbb7f-vqdpj" Apr 16 14:58:25.405923 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:58:25.405707 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/884e3635-5dae-45b5-bb62-e9806440468c-trusted-ca-bundle\") pod \"console-84769dbb7f-vqdpj\" (UID: \"884e3635-5dae-45b5-bb62-e9806440468c\") " pod="openshift-console/console-84769dbb7f-vqdpj" Apr 16 14:58:25.406426 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:58:25.406395 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/884e3635-5dae-45b5-bb62-e9806440468c-console-config\") pod \"console-84769dbb7f-vqdpj\" (UID: \"884e3635-5dae-45b5-bb62-e9806440468c\") " pod="openshift-console/console-84769dbb7f-vqdpj" Apr 16 14:58:25.406548 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:58:25.406484 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/884e3635-5dae-45b5-bb62-e9806440468c-service-ca\") pod \"console-84769dbb7f-vqdpj\" (UID: \"884e3635-5dae-45b5-bb62-e9806440468c\") " pod="openshift-console/console-84769dbb7f-vqdpj" Apr 16 14:58:25.406548 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:58:25.406484 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/884e3635-5dae-45b5-bb62-e9806440468c-oauth-serving-cert\") pod \"console-84769dbb7f-vqdpj\" (UID: \"884e3635-5dae-45b5-bb62-e9806440468c\") " pod="openshift-console/console-84769dbb7f-vqdpj" Apr 16 14:58:25.406666 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:58:25.406631 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/884e3635-5dae-45b5-bb62-e9806440468c-trusted-ca-bundle\") pod \"console-84769dbb7f-vqdpj\" (UID: \"884e3635-5dae-45b5-bb62-e9806440468c\") " pod="openshift-console/console-84769dbb7f-vqdpj" Apr 16 14:58:25.408299 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:58:25.408267 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/884e3635-5dae-45b5-bb62-e9806440468c-console-oauth-config\") pod \"console-84769dbb7f-vqdpj\" (UID: \"884e3635-5dae-45b5-bb62-e9806440468c\") " pod="openshift-console/console-84769dbb7f-vqdpj" Apr 16 14:58:25.408394 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:58:25.408382 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/884e3635-5dae-45b5-bb62-e9806440468c-console-serving-cert\") pod \"console-84769dbb7f-vqdpj\" (UID: \"884e3635-5dae-45b5-bb62-e9806440468c\") " pod="openshift-console/console-84769dbb7f-vqdpj" Apr 16 14:58:25.412856 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:58:25.412835 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vzlw\" (UniqueName: \"kubernetes.io/projected/884e3635-5dae-45b5-bb62-e9806440468c-kube-api-access-8vzlw\") pod \"console-84769dbb7f-vqdpj\" (UID: \"884e3635-5dae-45b5-bb62-e9806440468c\") " pod="openshift-console/console-84769dbb7f-vqdpj" Apr 16 14:58:25.557123 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:58:25.557054 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-84769dbb7f-vqdpj" Apr 16 14:58:25.677272 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:58:25.677251 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84769dbb7f-vqdpj"] Apr 16 14:58:25.679124 ip-10-0-140-83 kubenswrapper[2568]: W0416 14:58:25.679091 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod884e3635_5dae_45b5_bb62_e9806440468c.slice/crio-063b6b48c55335e08f1a7d5cff6ab37b788daec05aff0935e7bb90a4b601ffae WatchSource:0}: Error finding container 063b6b48c55335e08f1a7d5cff6ab37b788daec05aff0935e7bb90a4b601ffae: Status 404 returned error can't find the container with id 063b6b48c55335e08f1a7d5cff6ab37b788daec05aff0935e7bb90a4b601ffae Apr 16 14:58:25.680945 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:58:25.680929 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:58:25.938140 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:58:25.938106 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84769dbb7f-vqdpj" event={"ID":"884e3635-5dae-45b5-bb62-e9806440468c","Type":"ContainerStarted","Data":"13907b7ba80c35d61663ad618ffad9fb2debac634ffa577a59408c6ca21847aa"} Apr 16 14:58:25.938140 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:58:25.938142 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84769dbb7f-vqdpj" event={"ID":"884e3635-5dae-45b5-bb62-e9806440468c","Type":"ContainerStarted","Data":"063b6b48c55335e08f1a7d5cff6ab37b788daec05aff0935e7bb90a4b601ffae"} Apr 16 14:58:25.954407 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:58:25.954368 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-84769dbb7f-vqdpj" podStartSLOduration=0.954353246 podStartE2EDuration="954.353246ms" podCreationTimestamp="2026-04-16 14:58:25 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:58:25.952791478 +0000 UTC m=+344.716743590" watchObservedRunningTime="2026-04-16 14:58:25.954353246 +0000 UTC m=+344.718305358" Apr 16 14:58:35.558172 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:58:35.558090 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-84769dbb7f-vqdpj" Apr 16 14:58:35.558172 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:58:35.558132 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-84769dbb7f-vqdpj" Apr 16 14:58:35.562762 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:58:35.562737 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-84769dbb7f-vqdpj" Apr 16 14:58:35.970869 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:58:35.970843 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-84769dbb7f-vqdpj" Apr 16 14:58:36.016325 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:58:36.016300 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76c9cc77c7-mghd6"] Apr 16 14:59:01.036275 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:59:01.036225 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-76c9cc77c7-mghd6" podUID="e529091f-ecab-4e5f-bc3f-7432720ba8e4" containerName="console" containerID="cri-o://42eefe754f330ea9d6d86e5c93d92a47f15ad9df8d1b1244399298f9b4e61c52" gracePeriod=15 Apr 16 14:59:01.273609 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:59:01.273590 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76c9cc77c7-mghd6_e529091f-ecab-4e5f-bc3f-7432720ba8e4/console/0.log" Apr 16 14:59:01.273712 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:59:01.273646 2568 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76c9cc77c7-mghd6" Apr 16 14:59:01.360261 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:59:01.360232 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e529091f-ecab-4e5f-bc3f-7432720ba8e4-trusted-ca-bundle\") pod \"e529091f-ecab-4e5f-bc3f-7432720ba8e4\" (UID: \"e529091f-ecab-4e5f-bc3f-7432720ba8e4\") " Apr 16 14:59:01.360385 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:59:01.360286 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e529091f-ecab-4e5f-bc3f-7432720ba8e4-console-config\") pod \"e529091f-ecab-4e5f-bc3f-7432720ba8e4\" (UID: \"e529091f-ecab-4e5f-bc3f-7432720ba8e4\") " Apr 16 14:59:01.360385 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:59:01.360320 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e529091f-ecab-4e5f-bc3f-7432720ba8e4-console-oauth-config\") pod \"e529091f-ecab-4e5f-bc3f-7432720ba8e4\" (UID: \"e529091f-ecab-4e5f-bc3f-7432720ba8e4\") " Apr 16 14:59:01.360385 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:59:01.360353 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e529091f-ecab-4e5f-bc3f-7432720ba8e4-service-ca\") pod \"e529091f-ecab-4e5f-bc3f-7432720ba8e4\" (UID: \"e529091f-ecab-4e5f-bc3f-7432720ba8e4\") " Apr 16 14:59:01.360538 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:59:01.360402 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55rlx\" (UniqueName: \"kubernetes.io/projected/e529091f-ecab-4e5f-bc3f-7432720ba8e4-kube-api-access-55rlx\") pod \"e529091f-ecab-4e5f-bc3f-7432720ba8e4\" (UID: \"e529091f-ecab-4e5f-bc3f-7432720ba8e4\") " Apr 16 
14:59:01.360538 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:59:01.360431 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e529091f-ecab-4e5f-bc3f-7432720ba8e4-oauth-serving-cert\") pod \"e529091f-ecab-4e5f-bc3f-7432720ba8e4\" (UID: \"e529091f-ecab-4e5f-bc3f-7432720ba8e4\") " Apr 16 14:59:01.360641 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:59:01.360564 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e529091f-ecab-4e5f-bc3f-7432720ba8e4-console-serving-cert\") pod \"e529091f-ecab-4e5f-bc3f-7432720ba8e4\" (UID: \"e529091f-ecab-4e5f-bc3f-7432720ba8e4\") " Apr 16 14:59:01.360742 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:59:01.360715 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e529091f-ecab-4e5f-bc3f-7432720ba8e4-console-config" (OuterVolumeSpecName: "console-config") pod "e529091f-ecab-4e5f-bc3f-7432720ba8e4" (UID: "e529091f-ecab-4e5f-bc3f-7432720ba8e4"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:59:01.360851 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:59:01.360655 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e529091f-ecab-4e5f-bc3f-7432720ba8e4-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e529091f-ecab-4e5f-bc3f-7432720ba8e4" (UID: "e529091f-ecab-4e5f-bc3f-7432720ba8e4"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:59:01.360851 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:59:01.360761 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e529091f-ecab-4e5f-bc3f-7432720ba8e4-service-ca" (OuterVolumeSpecName: "service-ca") pod "e529091f-ecab-4e5f-bc3f-7432720ba8e4" (UID: "e529091f-ecab-4e5f-bc3f-7432720ba8e4"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:59:01.360851 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:59:01.360843 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e529091f-ecab-4e5f-bc3f-7432720ba8e4-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e529091f-ecab-4e5f-bc3f-7432720ba8e4" (UID: "e529091f-ecab-4e5f-bc3f-7432720ba8e4"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:59:01.361056 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:59:01.360863 2568 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e529091f-ecab-4e5f-bc3f-7432720ba8e4-service-ca\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 14:59:01.361056 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:59:01.360877 2568 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e529091f-ecab-4e5f-bc3f-7432720ba8e4-trusted-ca-bundle\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 14:59:01.361056 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:59:01.360910 2568 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e529091f-ecab-4e5f-bc3f-7432720ba8e4-console-config\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 14:59:01.362489 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:59:01.362460 2568 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e529091f-ecab-4e5f-bc3f-7432720ba8e4-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e529091f-ecab-4e5f-bc3f-7432720ba8e4" (UID: "e529091f-ecab-4e5f-bc3f-7432720ba8e4"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:59:01.362574 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:59:01.362503 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e529091f-ecab-4e5f-bc3f-7432720ba8e4-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e529091f-ecab-4e5f-bc3f-7432720ba8e4" (UID: "e529091f-ecab-4e5f-bc3f-7432720ba8e4"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:59:01.362627 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:59:01.362603 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e529091f-ecab-4e5f-bc3f-7432720ba8e4-kube-api-access-55rlx" (OuterVolumeSpecName: "kube-api-access-55rlx") pod "e529091f-ecab-4e5f-bc3f-7432720ba8e4" (UID: "e529091f-ecab-4e5f-bc3f-7432720ba8e4"). InnerVolumeSpecName "kube-api-access-55rlx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:59:01.461318 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:59:01.461292 2568 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e529091f-ecab-4e5f-bc3f-7432720ba8e4-console-oauth-config\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 14:59:01.461318 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:59:01.461315 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-55rlx\" (UniqueName: \"kubernetes.io/projected/e529091f-ecab-4e5f-bc3f-7432720ba8e4-kube-api-access-55rlx\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 14:59:01.461440 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:59:01.461324 2568 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e529091f-ecab-4e5f-bc3f-7432720ba8e4-oauth-serving-cert\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 14:59:01.461440 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:59:01.461334 2568 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e529091f-ecab-4e5f-bc3f-7432720ba8e4-console-serving-cert\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 14:59:02.051094 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:59:02.051073 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76c9cc77c7-mghd6_e529091f-ecab-4e5f-bc3f-7432720ba8e4/console/0.log" Apr 16 14:59:02.051401 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:59:02.051109 2568 generic.go:358] "Generic (PLEG): container finished" podID="e529091f-ecab-4e5f-bc3f-7432720ba8e4" containerID="42eefe754f330ea9d6d86e5c93d92a47f15ad9df8d1b1244399298f9b4e61c52" exitCode=2 Apr 16 14:59:02.051401 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:59:02.051137 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-76c9cc77c7-mghd6" event={"ID":"e529091f-ecab-4e5f-bc3f-7432720ba8e4","Type":"ContainerDied","Data":"42eefe754f330ea9d6d86e5c93d92a47f15ad9df8d1b1244399298f9b4e61c52"} Apr 16 14:59:02.051401 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:59:02.051157 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76c9cc77c7-mghd6" event={"ID":"e529091f-ecab-4e5f-bc3f-7432720ba8e4","Type":"ContainerDied","Data":"6bee6b28b066c322ae3ab6b1340784325b9c333d2f4d174ff43c83bd91a4cfc6"} Apr 16 14:59:02.051401 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:59:02.051170 2568 scope.go:117] "RemoveContainer" containerID="42eefe754f330ea9d6d86e5c93d92a47f15ad9df8d1b1244399298f9b4e61c52" Apr 16 14:59:02.051401 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:59:02.051170 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76c9cc77c7-mghd6" Apr 16 14:59:02.061256 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:59:02.059764 2568 scope.go:117] "RemoveContainer" containerID="42eefe754f330ea9d6d86e5c93d92a47f15ad9df8d1b1244399298f9b4e61c52" Apr 16 14:59:02.061970 ip-10-0-140-83 kubenswrapper[2568]: E0416 14:59:02.061949 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42eefe754f330ea9d6d86e5c93d92a47f15ad9df8d1b1244399298f9b4e61c52\": container with ID starting with 42eefe754f330ea9d6d86e5c93d92a47f15ad9df8d1b1244399298f9b4e61c52 not found: ID does not exist" containerID="42eefe754f330ea9d6d86e5c93d92a47f15ad9df8d1b1244399298f9b4e61c52" Apr 16 14:59:02.062065 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:59:02.061979 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42eefe754f330ea9d6d86e5c93d92a47f15ad9df8d1b1244399298f9b4e61c52"} err="failed to get container status \"42eefe754f330ea9d6d86e5c93d92a47f15ad9df8d1b1244399298f9b4e61c52\": rpc error: code = 
NotFound desc = could not find container \"42eefe754f330ea9d6d86e5c93d92a47f15ad9df8d1b1244399298f9b4e61c52\": container with ID starting with 42eefe754f330ea9d6d86e5c93d92a47f15ad9df8d1b1244399298f9b4e61c52 not found: ID does not exist" Apr 16 14:59:02.067970 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:59:02.067948 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76c9cc77c7-mghd6"] Apr 16 14:59:02.071556 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:59:02.071532 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-76c9cc77c7-mghd6"] Apr 16 14:59:03.864213 ip-10-0-140-83 kubenswrapper[2568]: I0416 14:59:03.864179 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e529091f-ecab-4e5f-bc3f-7432720ba8e4" path="/var/lib/kubelet/pods/e529091f-ecab-4e5f-bc3f-7432720ba8e4/volumes" Apr 16 15:01:03.784312 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:03.784274 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckmsl7"] Apr 16 15:01:03.784809 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:03.784620 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e529091f-ecab-4e5f-bc3f-7432720ba8e4" containerName="console" Apr 16 15:01:03.784809 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:03.784632 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="e529091f-ecab-4e5f-bc3f-7432720ba8e4" containerName="console" Apr 16 15:01:03.784809 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:03.784685 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="e529091f-ecab-4e5f-bc3f-7432720ba8e4" containerName="console" Apr 16 15:01:03.787939 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:03.787920 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckmsl7" Apr 16 15:01:03.790408 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:03.790386 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 15:01:03.791235 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:03.791210 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 15:01:03.791235 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:03.791229 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-pzc87\"" Apr 16 15:01:03.802806 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:03.802783 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckmsl7"] Apr 16 15:01:03.928931 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:03.928905 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ef1e1210-837b-49ee-bc9a-0e855dec407e-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckmsl7\" (UID: \"ef1e1210-837b-49ee-bc9a-0e855dec407e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckmsl7" Apr 16 15:01:03.929083 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:03.929064 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ef1e1210-837b-49ee-bc9a-0e855dec407e-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckmsl7\" (UID: \"ef1e1210-837b-49ee-bc9a-0e855dec407e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckmsl7" Apr 16 15:01:03.929141 
ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:03.929117 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58bpg\" (UniqueName: \"kubernetes.io/projected/ef1e1210-837b-49ee-bc9a-0e855dec407e-kube-api-access-58bpg\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckmsl7\" (UID: \"ef1e1210-837b-49ee-bc9a-0e855dec407e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckmsl7" Apr 16 15:01:04.030188 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:04.030163 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ef1e1210-837b-49ee-bc9a-0e855dec407e-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckmsl7\" (UID: \"ef1e1210-837b-49ee-bc9a-0e855dec407e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckmsl7" Apr 16 15:01:04.030308 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:04.030198 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58bpg\" (UniqueName: \"kubernetes.io/projected/ef1e1210-837b-49ee-bc9a-0e855dec407e-kube-api-access-58bpg\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckmsl7\" (UID: \"ef1e1210-837b-49ee-bc9a-0e855dec407e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckmsl7" Apr 16 15:01:04.030308 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:04.030230 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ef1e1210-837b-49ee-bc9a-0e855dec407e-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckmsl7\" (UID: \"ef1e1210-837b-49ee-bc9a-0e855dec407e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckmsl7" Apr 16 15:01:04.030577 ip-10-0-140-83 
kubenswrapper[2568]: I0416 15:01:04.030553 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ef1e1210-837b-49ee-bc9a-0e855dec407e-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckmsl7\" (UID: \"ef1e1210-837b-49ee-bc9a-0e855dec407e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckmsl7" Apr 16 15:01:04.030640 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:04.030599 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ef1e1210-837b-49ee-bc9a-0e855dec407e-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckmsl7\" (UID: \"ef1e1210-837b-49ee-bc9a-0e855dec407e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckmsl7" Apr 16 15:01:04.038173 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:04.038114 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-58bpg\" (UniqueName: \"kubernetes.io/projected/ef1e1210-837b-49ee-bc9a-0e855dec407e-kube-api-access-58bpg\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckmsl7\" (UID: \"ef1e1210-837b-49ee-bc9a-0e855dec407e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckmsl7" Apr 16 15:01:04.097178 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:04.097157 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckmsl7" Apr 16 15:01:04.216439 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:04.216417 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckmsl7"] Apr 16 15:01:04.218298 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:01:04.218264 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef1e1210_837b_49ee_bc9a_0e855dec407e.slice/crio-41cdba6da4ab9441b4fefe6118ff49106357c331e395617b9c86ed6c3c1a8a97 WatchSource:0}: Error finding container 41cdba6da4ab9441b4fefe6118ff49106357c331e395617b9c86ed6c3c1a8a97: Status 404 returned error can't find the container with id 41cdba6da4ab9441b4fefe6118ff49106357c331e395617b9c86ed6c3c1a8a97 Apr 16 15:01:04.402404 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:04.402370 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckmsl7" event={"ID":"ef1e1210-837b-49ee-bc9a-0e855dec407e","Type":"ContainerStarted","Data":"41cdba6da4ab9441b4fefe6118ff49106357c331e395617b9c86ed6c3c1a8a97"} Apr 16 15:01:09.420266 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:09.420230 2568 generic.go:358] "Generic (PLEG): container finished" podID="ef1e1210-837b-49ee-bc9a-0e855dec407e" containerID="dee026f86c38399202c72699d0d1c289fe5e3e19b4dfa87584e63de84f539f2d" exitCode=0 Apr 16 15:01:09.420659 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:09.420323 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckmsl7" event={"ID":"ef1e1210-837b-49ee-bc9a-0e855dec407e","Type":"ContainerDied","Data":"dee026f86c38399202c72699d0d1c289fe5e3e19b4dfa87584e63de84f539f2d"} Apr 16 15:01:12.431568 ip-10-0-140-83 kubenswrapper[2568]: I0416 
15:01:12.431538 2568 generic.go:358] "Generic (PLEG): container finished" podID="ef1e1210-837b-49ee-bc9a-0e855dec407e" containerID="cad3917fb9becb21027bfe0d982a4d96048de7288ea85c978b7271b38327c6a2" exitCode=0 Apr 16 15:01:12.431959 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:12.431631 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckmsl7" event={"ID":"ef1e1210-837b-49ee-bc9a-0e855dec407e","Type":"ContainerDied","Data":"cad3917fb9becb21027bfe0d982a4d96048de7288ea85c978b7271b38327c6a2"} Apr 16 15:01:18.454819 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:18.454787 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckmsl7" event={"ID":"ef1e1210-837b-49ee-bc9a-0e855dec407e","Type":"ContainerStarted","Data":"0823d667f5f9fa2d23083ac71bb1de8a0b616f286ea4e344b7839daf81680cd7"} Apr 16 15:01:18.472126 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:18.472085 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckmsl7" podStartSLOduration=1.392151929 podStartE2EDuration="15.472072279s" podCreationTimestamp="2026-04-16 15:01:03 +0000 UTC" firstStartedPulling="2026-04-16 15:01:04.22002257 +0000 UTC m=+502.983974660" lastFinishedPulling="2026-04-16 15:01:18.299942918 +0000 UTC m=+517.063895010" observedRunningTime="2026-04-16 15:01:18.470127959 +0000 UTC m=+517.234080070" watchObservedRunningTime="2026-04-16 15:01:18.472072279 +0000 UTC m=+517.236024390" Apr 16 15:01:19.463693 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:19.463663 2568 generic.go:358] "Generic (PLEG): container finished" podID="ef1e1210-837b-49ee-bc9a-0e855dec407e" containerID="0823d667f5f9fa2d23083ac71bb1de8a0b616f286ea4e344b7839daf81680cd7" exitCode=0 Apr 16 15:01:19.464039 ip-10-0-140-83 kubenswrapper[2568]: I0416 
15:01:19.463740 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckmsl7" event={"ID":"ef1e1210-837b-49ee-bc9a-0e855dec407e","Type":"ContainerDied","Data":"0823d667f5f9fa2d23083ac71bb1de8a0b616f286ea4e344b7839daf81680cd7"} Apr 16 15:01:20.592595 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:20.592575 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckmsl7" Apr 16 15:01:20.756726 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:20.756669 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58bpg\" (UniqueName: \"kubernetes.io/projected/ef1e1210-837b-49ee-bc9a-0e855dec407e-kube-api-access-58bpg\") pod \"ef1e1210-837b-49ee-bc9a-0e855dec407e\" (UID: \"ef1e1210-837b-49ee-bc9a-0e855dec407e\") " Apr 16 15:01:20.756726 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:20.756705 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ef1e1210-837b-49ee-bc9a-0e855dec407e-util\") pod \"ef1e1210-837b-49ee-bc9a-0e855dec407e\" (UID: \"ef1e1210-837b-49ee-bc9a-0e855dec407e\") " Apr 16 15:01:20.756917 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:20.756771 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ef1e1210-837b-49ee-bc9a-0e855dec407e-bundle\") pod \"ef1e1210-837b-49ee-bc9a-0e855dec407e\" (UID: \"ef1e1210-837b-49ee-bc9a-0e855dec407e\") " Apr 16 15:01:20.757337 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:20.757315 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef1e1210-837b-49ee-bc9a-0e855dec407e-bundle" (OuterVolumeSpecName: "bundle") pod "ef1e1210-837b-49ee-bc9a-0e855dec407e" (UID: 
"ef1e1210-837b-49ee-bc9a-0e855dec407e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:01:20.758823 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:20.758799 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef1e1210-837b-49ee-bc9a-0e855dec407e-kube-api-access-58bpg" (OuterVolumeSpecName: "kube-api-access-58bpg") pod "ef1e1210-837b-49ee-bc9a-0e855dec407e" (UID: "ef1e1210-837b-49ee-bc9a-0e855dec407e"). InnerVolumeSpecName "kube-api-access-58bpg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:01:20.760752 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:20.760732 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef1e1210-837b-49ee-bc9a-0e855dec407e-util" (OuterVolumeSpecName: "util") pod "ef1e1210-837b-49ee-bc9a-0e855dec407e" (UID: "ef1e1210-837b-49ee-bc9a-0e855dec407e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:01:20.857701 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:20.857676 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-58bpg\" (UniqueName: \"kubernetes.io/projected/ef1e1210-837b-49ee-bc9a-0e855dec407e-kube-api-access-58bpg\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:01:20.857701 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:20.857698 2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ef1e1210-837b-49ee-bc9a-0e855dec407e-util\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:01:20.857829 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:20.857708 2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ef1e1210-837b-49ee-bc9a-0e855dec407e-bundle\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:01:21.470862 ip-10-0-140-83 
kubenswrapper[2568]: I0416 15:01:21.470831 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckmsl7" event={"ID":"ef1e1210-837b-49ee-bc9a-0e855dec407e","Type":"ContainerDied","Data":"41cdba6da4ab9441b4fefe6118ff49106357c331e395617b9c86ed6c3c1a8a97"} Apr 16 15:01:21.471012 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:21.470870 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41cdba6da4ab9441b4fefe6118ff49106357c331e395617b9c86ed6c3c1a8a97" Apr 16 15:01:21.471012 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:21.470843 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckmsl7" Apr 16 15:01:25.436142 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:25.436084 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-f4ks8"] Apr 16 15:01:25.436796 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:25.436445 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef1e1210-837b-49ee-bc9a-0e855dec407e" containerName="extract" Apr 16 15:01:25.436796 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:25.436458 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef1e1210-837b-49ee-bc9a-0e855dec407e" containerName="extract" Apr 16 15:01:25.436796 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:25.436477 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef1e1210-837b-49ee-bc9a-0e855dec407e" containerName="util" Apr 16 15:01:25.436796 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:25.436484 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef1e1210-837b-49ee-bc9a-0e855dec407e" containerName="util" Apr 16 15:01:25.436796 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:25.436498 2568 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="ef1e1210-837b-49ee-bc9a-0e855dec407e" containerName="pull" Apr 16 15:01:25.436796 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:25.436505 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef1e1210-837b-49ee-bc9a-0e855dec407e" containerName="pull" Apr 16 15:01:25.436796 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:25.436573 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="ef1e1210-837b-49ee-bc9a-0e855dec407e" containerName="extract" Apr 16 15:01:25.443107 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:25.443085 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-f4ks8" Apr 16 15:01:25.445476 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:25.445451 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-c9t5b\"" Apr 16 15:01:25.445587 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:25.445478 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 15:01:25.445587 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:25.445530 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 15:01:25.445770 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:25.445754 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 16 15:01:25.448536 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:25.448511 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-f4ks8"] Apr 16 15:01:25.587729 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:25.587704 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-lc4nd\" (UniqueName: \"kubernetes.io/projected/55b08a4a-2db6-43f3-ab86-0dc630d8a66a-kube-api-access-lc4nd\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-f4ks8\" (UID: \"55b08a4a-2db6-43f3-ab86-0dc630d8a66a\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-f4ks8" Apr 16 15:01:25.587828 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:25.587752 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/55b08a4a-2db6-43f3-ab86-0dc630d8a66a-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-f4ks8\" (UID: \"55b08a4a-2db6-43f3-ab86-0dc630d8a66a\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-f4ks8" Apr 16 15:01:25.688662 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:25.688605 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/55b08a4a-2db6-43f3-ab86-0dc630d8a66a-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-f4ks8\" (UID: \"55b08a4a-2db6-43f3-ab86-0dc630d8a66a\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-f4ks8" Apr 16 15:01:25.688750 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:25.688662 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lc4nd\" (UniqueName: \"kubernetes.io/projected/55b08a4a-2db6-43f3-ab86-0dc630d8a66a-kube-api-access-lc4nd\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-f4ks8\" (UID: \"55b08a4a-2db6-43f3-ab86-0dc630d8a66a\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-f4ks8" Apr 16 15:01:25.691043 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:25.691016 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/55b08a4a-2db6-43f3-ab86-0dc630d8a66a-certificates\") pod 
\"custom-metrics-autoscaler-operator-bbf89fd5d-f4ks8\" (UID: \"55b08a4a-2db6-43f3-ab86-0dc630d8a66a\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-f4ks8" Apr 16 15:01:25.699007 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:25.698987 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc4nd\" (UniqueName: \"kubernetes.io/projected/55b08a4a-2db6-43f3-ab86-0dc630d8a66a-kube-api-access-lc4nd\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-f4ks8\" (UID: \"55b08a4a-2db6-43f3-ab86-0dc630d8a66a\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-f4ks8" Apr 16 15:01:25.754058 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:25.754032 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-f4ks8" Apr 16 15:01:25.883981 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:25.883931 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-f4ks8"] Apr 16 15:01:25.886010 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:01:25.885985 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55b08a4a_2db6_43f3_ab86_0dc630d8a66a.slice/crio-259ad576dceaa20c39a26d2a103ca2397460ee97a1e1f3630b65de257ed6459b WatchSource:0}: Error finding container 259ad576dceaa20c39a26d2a103ca2397460ee97a1e1f3630b65de257ed6459b: Status 404 returned error can't find the container with id 259ad576dceaa20c39a26d2a103ca2397460ee97a1e1f3630b65de257ed6459b Apr 16 15:01:26.494995 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:26.494954 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-f4ks8" event={"ID":"55b08a4a-2db6-43f3-ab86-0dc630d8a66a","Type":"ContainerStarted","Data":"259ad576dceaa20c39a26d2a103ca2397460ee97a1e1f3630b65de257ed6459b"} Apr 
16 15:01:29.475177 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:29.475149 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-gbprp"] Apr 16 15:01:29.478355 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:29.478336 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-gbprp" Apr 16 15:01:29.480616 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:29.480580 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 16 15:01:29.480710 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:29.480613 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 16 15:01:29.480710 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:29.480627 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-nvnmh\"" Apr 16 15:01:29.486934 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:29.486908 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-gbprp"] Apr 16 15:01:29.508447 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:29.508415 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-f4ks8" event={"ID":"55b08a4a-2db6-43f3-ab86-0dc630d8a66a","Type":"ContainerStarted","Data":"5b8569034faa381436e37fa772697e590e86b447a58feccca1de19b401b8484b"} Apr 16 15:01:29.508546 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:29.508527 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-f4ks8" Apr 16 15:01:29.528048 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:29.527976 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-f4ks8" podStartSLOduration=1.432810539 podStartE2EDuration="4.527965773s" podCreationTimestamp="2026-04-16 15:01:25 +0000 UTC" firstStartedPulling="2026-04-16 15:01:25.887538106 +0000 UTC m=+524.651490198" lastFinishedPulling="2026-04-16 15:01:28.982693338 +0000 UTC m=+527.746645432" observedRunningTime="2026-04-16 15:01:29.526601419 +0000 UTC m=+528.290553533" watchObservedRunningTime="2026-04-16 15:01:29.527965773 +0000 UTC m=+528.291917884" Apr 16 15:01:29.622542 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:29.622520 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/dd5b89c4-09cc-4d16-88bd-d52cb4b883b6-certificates\") pod \"keda-operator-ffbb595cb-gbprp\" (UID: \"dd5b89c4-09cc-4d16-88bd-d52cb4b883b6\") " pod="openshift-keda/keda-operator-ffbb595cb-gbprp" Apr 16 15:01:29.622630 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:29.622564 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/dd5b89c4-09cc-4d16-88bd-d52cb4b883b6-cabundle0\") pod \"keda-operator-ffbb595cb-gbprp\" (UID: \"dd5b89c4-09cc-4d16-88bd-d52cb4b883b6\") " pod="openshift-keda/keda-operator-ffbb595cb-gbprp" Apr 16 15:01:29.622630 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:29.622595 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt5jq\" (UniqueName: \"kubernetes.io/projected/dd5b89c4-09cc-4d16-88bd-d52cb4b883b6-kube-api-access-kt5jq\") pod \"keda-operator-ffbb595cb-gbprp\" (UID: \"dd5b89c4-09cc-4d16-88bd-d52cb4b883b6\") " pod="openshift-keda/keda-operator-ffbb595cb-gbprp" Apr 16 15:01:29.689008 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:29.688984 2568 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-f9bm6"] Apr 16 15:01:29.692325 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:29.692310 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-f9bm6" Apr 16 15:01:29.694546 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:29.694529 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 16 15:01:29.702657 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:29.702632 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-f9bm6"] Apr 16 15:01:29.723705 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:29.723686 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/dd5b89c4-09cc-4d16-88bd-d52cb4b883b6-certificates\") pod \"keda-operator-ffbb595cb-gbprp\" (UID: \"dd5b89c4-09cc-4d16-88bd-d52cb4b883b6\") " pod="openshift-keda/keda-operator-ffbb595cb-gbprp" Apr 16 15:01:29.723791 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:29.723724 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/dd5b89c4-09cc-4d16-88bd-d52cb4b883b6-cabundle0\") pod \"keda-operator-ffbb595cb-gbprp\" (UID: \"dd5b89c4-09cc-4d16-88bd-d52cb4b883b6\") " pod="openshift-keda/keda-operator-ffbb595cb-gbprp" Apr 16 15:01:29.723791 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:29.723747 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kt5jq\" (UniqueName: \"kubernetes.io/projected/dd5b89c4-09cc-4d16-88bd-d52cb4b883b6-kube-api-access-kt5jq\") pod \"keda-operator-ffbb595cb-gbprp\" (UID: \"dd5b89c4-09cc-4d16-88bd-d52cb4b883b6\") " pod="openshift-keda/keda-operator-ffbb595cb-gbprp" Apr 16 15:01:29.723882 ip-10-0-140-83 kubenswrapper[2568]: E0416 
15:01:29.723839 2568 secret.go:281] references non-existent secret key: ca.crt
Apr 16 15:01:29.723882 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:01:29.723859 2568 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 15:01:29.723882 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:01:29.723869 2568 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-gbprp: references non-existent secret key: ca.crt
Apr 16 15:01:29.724008 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:01:29.723940 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dd5b89c4-09cc-4d16-88bd-d52cb4b883b6-certificates podName:dd5b89c4-09cc-4d16-88bd-d52cb4b883b6 nodeName:}" failed. No retries permitted until 2026-04-16 15:01:30.223921903 +0000 UTC m=+528.987874008 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/dd5b89c4-09cc-4d16-88bd-d52cb4b883b6-certificates") pod "keda-operator-ffbb595cb-gbprp" (UID: "dd5b89c4-09cc-4d16-88bd-d52cb4b883b6") : references non-existent secret key: ca.crt
Apr 16 15:01:29.724341 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:29.724326 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/dd5b89c4-09cc-4d16-88bd-d52cb4b883b6-cabundle0\") pod \"keda-operator-ffbb595cb-gbprp\" (UID: \"dd5b89c4-09cc-4d16-88bd-d52cb4b883b6\") " pod="openshift-keda/keda-operator-ffbb595cb-gbprp"
Apr 16 15:01:29.731702 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:29.731651 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt5jq\" (UniqueName: \"kubernetes.io/projected/dd5b89c4-09cc-4d16-88bd-d52cb4b883b6-kube-api-access-kt5jq\") pod \"keda-operator-ffbb595cb-gbprp\" (UID: \"dd5b89c4-09cc-4d16-88bd-d52cb4b883b6\") " pod="openshift-keda/keda-operator-ffbb595cb-gbprp"
Apr 16 15:01:29.825076 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:29.825053 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/4f3256a7-0ff1-4399-8456-5bc96edcb410-certificates\") pod \"keda-metrics-apiserver-7c9f485588-f9bm6\" (UID: \"4f3256a7-0ff1-4399-8456-5bc96edcb410\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-f9bm6"
Apr 16 15:01:29.825169 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:29.825086 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmvk5\" (UniqueName: \"kubernetes.io/projected/4f3256a7-0ff1-4399-8456-5bc96edcb410-kube-api-access-pmvk5\") pod \"keda-metrics-apiserver-7c9f485588-f9bm6\" (UID: \"4f3256a7-0ff1-4399-8456-5bc96edcb410\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-f9bm6"
Apr 16 15:01:29.825214 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:29.825195 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/4f3256a7-0ff1-4399-8456-5bc96edcb410-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-f9bm6\" (UID: \"4f3256a7-0ff1-4399-8456-5bc96edcb410\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-f9bm6"
Apr 16 15:01:29.926449 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:29.926428 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/4f3256a7-0ff1-4399-8456-5bc96edcb410-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-f9bm6\" (UID: \"4f3256a7-0ff1-4399-8456-5bc96edcb410\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-f9bm6"
Apr 16 15:01:29.926600 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:29.926518 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/4f3256a7-0ff1-4399-8456-5bc96edcb410-certificates\") pod \"keda-metrics-apiserver-7c9f485588-f9bm6\" (UID: \"4f3256a7-0ff1-4399-8456-5bc96edcb410\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-f9bm6"
Apr 16 15:01:29.926600 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:29.926553 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pmvk5\" (UniqueName: \"kubernetes.io/projected/4f3256a7-0ff1-4399-8456-5bc96edcb410-kube-api-access-pmvk5\") pod \"keda-metrics-apiserver-7c9f485588-f9bm6\" (UID: \"4f3256a7-0ff1-4399-8456-5bc96edcb410\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-f9bm6"
Apr 16 15:01:29.926711 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:01:29.926654 2568 secret.go:281] references non-existent secret key: tls.crt
Apr 16 15:01:29.926711 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:01:29.926674 2568 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 15:01:29.926711 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:01:29.926699 2568 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-f9bm6: references non-existent secret key: tls.crt
Apr 16 15:01:29.926847 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:01:29.926757 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4f3256a7-0ff1-4399-8456-5bc96edcb410-certificates podName:4f3256a7-0ff1-4399-8456-5bc96edcb410 nodeName:}" failed. No retries permitted until 2026-04-16 15:01:30.426739815 +0000 UTC m=+529.190691918 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/4f3256a7-0ff1-4399-8456-5bc96edcb410-certificates") pod "keda-metrics-apiserver-7c9f485588-f9bm6" (UID: "4f3256a7-0ff1-4399-8456-5bc96edcb410") : references non-existent secret key: tls.crt
Apr 16 15:01:29.926847 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:29.926820 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/4f3256a7-0ff1-4399-8456-5bc96edcb410-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-f9bm6\" (UID: \"4f3256a7-0ff1-4399-8456-5bc96edcb410\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-f9bm6"
Apr 16 15:01:29.935502 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:29.935483 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmvk5\" (UniqueName: \"kubernetes.io/projected/4f3256a7-0ff1-4399-8456-5bc96edcb410-kube-api-access-pmvk5\") pod \"keda-metrics-apiserver-7c9f485588-f9bm6\" (UID: \"4f3256a7-0ff1-4399-8456-5bc96edcb410\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-f9bm6"
Apr 16 15:01:29.949264 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:29.949244 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-xblsd"]
Apr 16 15:01:29.952636 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:29.952623 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-xblsd"
Apr 16 15:01:29.954868 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:29.954851 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\""
Apr 16 15:01:29.962388 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:29.962366 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-xblsd"]
Apr 16 15:01:30.128603 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:30.128583 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvq2x\" (UniqueName: \"kubernetes.io/projected/f55fa40e-fda1-499b-9b3a-f772ab20d0cf-kube-api-access-qvq2x\") pod \"keda-admission-cf49989db-xblsd\" (UID: \"f55fa40e-fda1-499b-9b3a-f772ab20d0cf\") " pod="openshift-keda/keda-admission-cf49989db-xblsd"
Apr 16 15:01:30.128719 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:30.128612 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f55fa40e-fda1-499b-9b3a-f772ab20d0cf-certificates\") pod \"keda-admission-cf49989db-xblsd\" (UID: \"f55fa40e-fda1-499b-9b3a-f772ab20d0cf\") " pod="openshift-keda/keda-admission-cf49989db-xblsd"
Apr 16 15:01:30.229623 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:30.229600 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/dd5b89c4-09cc-4d16-88bd-d52cb4b883b6-certificates\") pod \"keda-operator-ffbb595cb-gbprp\" (UID: \"dd5b89c4-09cc-4d16-88bd-d52cb4b883b6\") " pod="openshift-keda/keda-operator-ffbb595cb-gbprp"
Apr 16 15:01:30.229707 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:30.229657 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qvq2x\" (UniqueName: \"kubernetes.io/projected/f55fa40e-fda1-499b-9b3a-f772ab20d0cf-kube-api-access-qvq2x\") pod \"keda-admission-cf49989db-xblsd\" (UID: \"f55fa40e-fda1-499b-9b3a-f772ab20d0cf\") " pod="openshift-keda/keda-admission-cf49989db-xblsd"
Apr 16 15:01:30.229707 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:30.229674 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f55fa40e-fda1-499b-9b3a-f772ab20d0cf-certificates\") pod \"keda-admission-cf49989db-xblsd\" (UID: \"f55fa40e-fda1-499b-9b3a-f772ab20d0cf\") " pod="openshift-keda/keda-admission-cf49989db-xblsd"
Apr 16 15:01:30.229784 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:01:30.229734 2568 secret.go:281] references non-existent secret key: ca.crt
Apr 16 15:01:30.229784 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:01:30.229756 2568 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 15:01:30.229784 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:01:30.229768 2568 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-gbprp: references non-existent secret key: ca.crt
Apr 16 15:01:30.229910 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:01:30.229829 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dd5b89c4-09cc-4d16-88bd-d52cb4b883b6-certificates podName:dd5b89c4-09cc-4d16-88bd-d52cb4b883b6 nodeName:}" failed. No retries permitted until 2026-04-16 15:01:31.229812809 +0000 UTC m=+529.993764911 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/dd5b89c4-09cc-4d16-88bd-d52cb4b883b6-certificates") pod "keda-operator-ffbb595cb-gbprp" (UID: "dd5b89c4-09cc-4d16-88bd-d52cb4b883b6") : references non-existent secret key: ca.crt
Apr 16 15:01:30.229910 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:01:30.229767 2568 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found
Apr 16 15:01:30.229910 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:01:30.229856 2568 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-xblsd: secret "keda-admission-webhooks-certs" not found
Apr 16 15:01:30.229910 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:01:30.229909 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f55fa40e-fda1-499b-9b3a-f772ab20d0cf-certificates podName:f55fa40e-fda1-499b-9b3a-f772ab20d0cf nodeName:}" failed. No retries permitted until 2026-04-16 15:01:30.729878505 +0000 UTC m=+529.493830593 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/f55fa40e-fda1-499b-9b3a-f772ab20d0cf-certificates") pod "keda-admission-cf49989db-xblsd" (UID: "f55fa40e-fda1-499b-9b3a-f772ab20d0cf") : secret "keda-admission-webhooks-certs" not found
Apr 16 15:01:30.245763 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:30.245733 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvq2x\" (UniqueName: \"kubernetes.io/projected/f55fa40e-fda1-499b-9b3a-f772ab20d0cf-kube-api-access-qvq2x\") pod \"keda-admission-cf49989db-xblsd\" (UID: \"f55fa40e-fda1-499b-9b3a-f772ab20d0cf\") " pod="openshift-keda/keda-admission-cf49989db-xblsd"
Apr 16 15:01:30.431508 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:30.431427 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/4f3256a7-0ff1-4399-8456-5bc96edcb410-certificates\") pod \"keda-metrics-apiserver-7c9f485588-f9bm6\" (UID: \"4f3256a7-0ff1-4399-8456-5bc96edcb410\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-f9bm6"
Apr 16 15:01:30.431590 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:01:30.431560 2568 secret.go:281] references non-existent secret key: tls.crt
Apr 16 15:01:30.431590 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:01:30.431577 2568 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 15:01:30.431667 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:01:30.431595 2568 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-f9bm6: references non-existent secret key: tls.crt
Apr 16 15:01:30.431667 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:01:30.431638 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4f3256a7-0ff1-4399-8456-5bc96edcb410-certificates podName:4f3256a7-0ff1-4399-8456-5bc96edcb410 nodeName:}" failed. No retries permitted until 2026-04-16 15:01:31.43162613 +0000 UTC m=+530.195578224 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/4f3256a7-0ff1-4399-8456-5bc96edcb410-certificates") pod "keda-metrics-apiserver-7c9f485588-f9bm6" (UID: "4f3256a7-0ff1-4399-8456-5bc96edcb410") : references non-existent secret key: tls.crt
Apr 16 15:01:30.735850 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:30.735777 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f55fa40e-fda1-499b-9b3a-f772ab20d0cf-certificates\") pod \"keda-admission-cf49989db-xblsd\" (UID: \"f55fa40e-fda1-499b-9b3a-f772ab20d0cf\") " pod="openshift-keda/keda-admission-cf49989db-xblsd"
Apr 16 15:01:30.738063 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:30.738038 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f55fa40e-fda1-499b-9b3a-f772ab20d0cf-certificates\") pod \"keda-admission-cf49989db-xblsd\" (UID: \"f55fa40e-fda1-499b-9b3a-f772ab20d0cf\") " pod="openshift-keda/keda-admission-cf49989db-xblsd"
Apr 16 15:01:30.862783 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:30.862757 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-xblsd"
Apr 16 15:01:31.197814 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:31.197789 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-xblsd"]
Apr 16 15:01:31.200136 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:01:31.200109 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf55fa40e_fda1_499b_9b3a_f772ab20d0cf.slice/crio-d52971fcbc124b6d4753ad7d687309c10a24408eb39015c0dff07191a739309b WatchSource:0}: Error finding container d52971fcbc124b6d4753ad7d687309c10a24408eb39015c0dff07191a739309b: Status 404 returned error can't find the container with id d52971fcbc124b6d4753ad7d687309c10a24408eb39015c0dff07191a739309b
Apr 16 15:01:31.240265 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:31.240240 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/dd5b89c4-09cc-4d16-88bd-d52cb4b883b6-certificates\") pod \"keda-operator-ffbb595cb-gbprp\" (UID: \"dd5b89c4-09cc-4d16-88bd-d52cb4b883b6\") " pod="openshift-keda/keda-operator-ffbb595cb-gbprp"
Apr 16 15:01:31.240359 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:01:31.240350 2568 secret.go:281] references non-existent secret key: ca.crt
Apr 16 15:01:31.240407 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:01:31.240361 2568 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 15:01:31.240407 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:01:31.240369 2568 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-gbprp: references non-existent secret key: ca.crt
Apr 16 15:01:31.240482 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:01:31.240418 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dd5b89c4-09cc-4d16-88bd-d52cb4b883b6-certificates podName:dd5b89c4-09cc-4d16-88bd-d52cb4b883b6 nodeName:}" failed. No retries permitted until 2026-04-16 15:01:33.240405551 +0000 UTC m=+532.004357640 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/dd5b89c4-09cc-4d16-88bd-d52cb4b883b6-certificates") pod "keda-operator-ffbb595cb-gbprp" (UID: "dd5b89c4-09cc-4d16-88bd-d52cb4b883b6") : references non-existent secret key: ca.crt
Apr 16 15:01:31.442345 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:31.442321 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/4f3256a7-0ff1-4399-8456-5bc96edcb410-certificates\") pod \"keda-metrics-apiserver-7c9f485588-f9bm6\" (UID: \"4f3256a7-0ff1-4399-8456-5bc96edcb410\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-f9bm6"
Apr 16 15:01:31.442478 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:01:31.442462 2568 secret.go:281] references non-existent secret key: tls.crt
Apr 16 15:01:31.442516 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:01:31.442481 2568 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 15:01:31.442516 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:01:31.442500 2568 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-f9bm6: references non-existent secret key: tls.crt
Apr 16 15:01:31.442576 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:01:31.442553 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4f3256a7-0ff1-4399-8456-5bc96edcb410-certificates podName:4f3256a7-0ff1-4399-8456-5bc96edcb410 nodeName:}" failed. No retries permitted until 2026-04-16 15:01:33.442540321 +0000 UTC m=+532.206492409 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/4f3256a7-0ff1-4399-8456-5bc96edcb410-certificates") pod "keda-metrics-apiserver-7c9f485588-f9bm6" (UID: "4f3256a7-0ff1-4399-8456-5bc96edcb410") : references non-existent secret key: tls.crt
Apr 16 15:01:31.516174 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:31.516122 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-xblsd" event={"ID":"f55fa40e-fda1-499b-9b3a-f772ab20d0cf","Type":"ContainerStarted","Data":"d52971fcbc124b6d4753ad7d687309c10a24408eb39015c0dff07191a739309b"}
Apr 16 15:01:33.257693 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:33.257665 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/dd5b89c4-09cc-4d16-88bd-d52cb4b883b6-certificates\") pod \"keda-operator-ffbb595cb-gbprp\" (UID: \"dd5b89c4-09cc-4d16-88bd-d52cb4b883b6\") " pod="openshift-keda/keda-operator-ffbb595cb-gbprp"
Apr 16 15:01:33.259846 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:33.259828 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/dd5b89c4-09cc-4d16-88bd-d52cb4b883b6-certificates\") pod \"keda-operator-ffbb595cb-gbprp\" (UID: \"dd5b89c4-09cc-4d16-88bd-d52cb4b883b6\") " pod="openshift-keda/keda-operator-ffbb595cb-gbprp"
Apr 16 15:01:33.388220 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:33.388191 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-gbprp"
Apr 16 15:01:33.459293 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:33.459267 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/4f3256a7-0ff1-4399-8456-5bc96edcb410-certificates\") pod \"keda-metrics-apiserver-7c9f485588-f9bm6\" (UID: \"4f3256a7-0ff1-4399-8456-5bc96edcb410\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-f9bm6"
Apr 16 15:01:33.461513 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:33.461487 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/4f3256a7-0ff1-4399-8456-5bc96edcb410-certificates\") pod \"keda-metrics-apiserver-7c9f485588-f9bm6\" (UID: \"4f3256a7-0ff1-4399-8456-5bc96edcb410\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-f9bm6"
Apr 16 15:01:33.504704 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:33.504682 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-gbprp"]
Apr 16 15:01:33.507096 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:01:33.507073 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd5b89c4_09cc_4d16_88bd_d52cb4b883b6.slice/crio-ce2fb918ff7a73b4b7234b4b89fb5cd9edef9ab6bf42794c187ca34a5d70d05e WatchSource:0}: Error finding container ce2fb918ff7a73b4b7234b4b89fb5cd9edef9ab6bf42794c187ca34a5d70d05e: Status 404 returned error can't find the container with id ce2fb918ff7a73b4b7234b4b89fb5cd9edef9ab6bf42794c187ca34a5d70d05e
Apr 16 15:01:33.524209 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:33.524184 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-gbprp" event={"ID":"dd5b89c4-09cc-4d16-88bd-d52cb4b883b6","Type":"ContainerStarted","Data":"ce2fb918ff7a73b4b7234b4b89fb5cd9edef9ab6bf42794c187ca34a5d70d05e"}
Apr 16 15:01:33.525297 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:33.525277 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-xblsd" event={"ID":"f55fa40e-fda1-499b-9b3a-f772ab20d0cf","Type":"ContainerStarted","Data":"d1bdb1dd867fa3b51776d2e6d50216d8a451c2c59e202db2579ed85d56fc783a"}
Apr 16 15:01:33.525441 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:33.525426 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-xblsd"
Apr 16 15:01:33.545623 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:33.545582 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-xblsd" podStartSLOduration=3.105484742 podStartE2EDuration="4.545571024s" podCreationTimestamp="2026-04-16 15:01:29 +0000 UTC" firstStartedPulling="2026-04-16 15:01:31.201862806 +0000 UTC m=+529.965814901" lastFinishedPulling="2026-04-16 15:01:32.641949091 +0000 UTC m=+531.405901183" observedRunningTime="2026-04-16 15:01:33.545266319 +0000 UTC m=+532.309218429" watchObservedRunningTime="2026-04-16 15:01:33.545571024 +0000 UTC m=+532.309523164"
Apr 16 15:01:33.604665 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:33.604643 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-f9bm6"
Apr 16 15:01:33.719344 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:33.719320 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-f9bm6"]
Apr 16 15:01:33.721735 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:01:33.721710 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f3256a7_0ff1_4399_8456_5bc96edcb410.slice/crio-4b7cca1b149f5c6be6eda1eb86abced0e7f82cefe6e7ff9eb16638b02d32e181 WatchSource:0}: Error finding container 4b7cca1b149f5c6be6eda1eb86abced0e7f82cefe6e7ff9eb16638b02d32e181: Status 404 returned error can't find the container with id 4b7cca1b149f5c6be6eda1eb86abced0e7f82cefe6e7ff9eb16638b02d32e181
Apr 16 15:01:34.530273 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:34.530235 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-f9bm6" event={"ID":"4f3256a7-0ff1-4399-8456-5bc96edcb410","Type":"ContainerStarted","Data":"4b7cca1b149f5c6be6eda1eb86abced0e7f82cefe6e7ff9eb16638b02d32e181"}
Apr 16 15:01:37.542843 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:37.542751 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-f9bm6" event={"ID":"4f3256a7-0ff1-4399-8456-5bc96edcb410","Type":"ContainerStarted","Data":"044d71e8f503b3308d935e28cbdefaa9caa4f88013f08c6edd91c8f608fe1618"}
Apr 16 15:01:37.543268 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:37.542863 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-f9bm6"
Apr 16 15:01:37.544194 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:37.544169 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-gbprp" event={"ID":"dd5b89c4-09cc-4d16-88bd-d52cb4b883b6","Type":"ContainerStarted","Data":"61c4a5ad01b409e5bedefe15a1454d7d9dbd831d1922826509040a8182afe3ce"}
Apr 16 15:01:37.544338 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:37.544254 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-gbprp"
Apr 16 15:01:37.559613 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:37.559572 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-f9bm6" podStartSLOduration=5.007613852 podStartE2EDuration="8.559561158s" podCreationTimestamp="2026-04-16 15:01:29 +0000 UTC" firstStartedPulling="2026-04-16 15:01:33.72309159 +0000 UTC m=+532.487043682" lastFinishedPulling="2026-04-16 15:01:37.2750389 +0000 UTC m=+536.038990988" observedRunningTime="2026-04-16 15:01:37.55794734 +0000 UTC m=+536.321899451" watchObservedRunningTime="2026-04-16 15:01:37.559561158 +0000 UTC m=+536.323513269"
Apr 16 15:01:37.574491 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:37.574451 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-gbprp" podStartSLOduration=4.80261323 podStartE2EDuration="8.574438002s" podCreationTimestamp="2026-04-16 15:01:29 +0000 UTC" firstStartedPulling="2026-04-16 15:01:33.508496254 +0000 UTC m=+532.272448344" lastFinishedPulling="2026-04-16 15:01:37.280321028 +0000 UTC m=+536.044273116" observedRunningTime="2026-04-16 15:01:37.573593932 +0000 UTC m=+536.337546039" watchObservedRunningTime="2026-04-16 15:01:37.574438002 +0000 UTC m=+536.338390113"
Apr 16 15:01:48.551848 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:48.551816 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-f9bm6"
Apr 16 15:01:50.514375 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:50.514343 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-f4ks8"
Apr 16 15:01:54.532823 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:54.532788 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-xblsd"
Apr 16 15:01:58.549772 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:01:58.549738 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-gbprp"
Apr 16 15:02:23.791589 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:23.791553 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2zng"]
Apr 16 15:02:23.802817 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:23.802787 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2zng"]
Apr 16 15:02:23.802964 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:23.802947 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2zng"
Apr 16 15:02:23.805302 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:23.805274 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 15:02:23.805446 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:23.805282 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-pzc87\""
Apr 16 15:02:23.806236 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:23.806215 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 15:02:23.929179 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:23.929151 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2zng\" (UID: \"7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2zng"
Apr 16 15:02:23.929294 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:23.929184 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvq52\" (UniqueName: \"kubernetes.io/projected/7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f-kube-api-access-hvq52\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2zng\" (UID: \"7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2zng"
Apr 16 15:02:23.929294 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:23.929279 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2zng\" (UID: \"7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2zng"
Apr 16 15:02:24.030342 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:24.030315 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2zng\" (UID: \"7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2zng"
Apr 16 15:02:24.030473 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:24.030366 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2zng\" (UID: \"7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2zng"
Apr 16 15:02:24.030473 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:24.030382 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hvq52\" (UniqueName: \"kubernetes.io/projected/7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f-kube-api-access-hvq52\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2zng\" (UID: \"7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2zng"
Apr 16 15:02:24.030676 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:24.030653 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2zng\" (UID: \"7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2zng"
Apr 16 15:02:24.030740 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:24.030687 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2zng\" (UID: \"7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2zng"
Apr 16 15:02:24.038206 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:24.038186 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvq52\" (UniqueName: \"kubernetes.io/projected/7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f-kube-api-access-hvq52\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2zng\" (UID: \"7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2zng"
Apr 16 15:02:24.112876 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:24.112853 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2zng"
Apr 16 15:02:24.231757 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:24.231735 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2zng"]
Apr 16 15:02:24.233701 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:02:24.233677 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d3cd92c_9a47_4ecf_b5d5_eb4fbca1251f.slice/crio-b7331f2f8ca4b9a85096794e6dff2f1cc389fec301814ce1bb73bb518c5777b7 WatchSource:0}: Error finding container b7331f2f8ca4b9a85096794e6dff2f1cc389fec301814ce1bb73bb518c5777b7: Status 404 returned error can't find the container with id b7331f2f8ca4b9a85096794e6dff2f1cc389fec301814ce1bb73bb518c5777b7
Apr 16 15:02:24.713050 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:24.713018 2568 generic.go:358] "Generic (PLEG): container finished" podID="7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f" containerID="b9be50c3e37969b6616b77e0039baf1269c86bcef4735318c6914474ed94a8fd" exitCode=0
Apr 16 15:02:24.713184 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:24.713103 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2zng" event={"ID":"7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f","Type":"ContainerDied","Data":"b9be50c3e37969b6616b77e0039baf1269c86bcef4735318c6914474ed94a8fd"}
Apr 16 15:02:24.713184 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:24.713148 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2zng" event={"ID":"7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f","Type":"ContainerStarted","Data":"b7331f2f8ca4b9a85096794e6dff2f1cc389fec301814ce1bb73bb518c5777b7"}
Apr 16 15:02:25.718672 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:25.718604 2568 generic.go:358] "Generic (PLEG): container finished" podID="7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f" containerID="c312402bb123331e6d0943ddada4ea88daf5f4b4f7a064e2485073627e4cf3ae" exitCode=0
Apr 16 15:02:25.718672 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:25.718642 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2zng" event={"ID":"7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f","Type":"ContainerDied","Data":"c312402bb123331e6d0943ddada4ea88daf5f4b4f7a064e2485073627e4cf3ae"}
Apr 16 15:02:25.997631 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:25.997576 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5f5976dc55-sg285"]
Apr 16 15:02:26.000711 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:26.000697 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f5976dc55-sg285"
Apr 16 15:02:26.012100 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:26.012082 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f5976dc55-sg285"]
Apr 16 15:02:26.146269 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:26.146247 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a1bd51ae-9001-494f-85cb-1225af7bb1e9-console-config\") pod \"console-5f5976dc55-sg285\" (UID: \"a1bd51ae-9001-494f-85cb-1225af7bb1e9\") " pod="openshift-console/console-5f5976dc55-sg285"
Apr 16 15:02:26.146392 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:26.146284 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1bd51ae-9001-494f-85cb-1225af7bb1e9-console-oauth-config\") pod \"console-5f5976dc55-sg285\" (UID: \"a1bd51ae-9001-494f-85cb-1225af7bb1e9\") " pod="openshift-console/console-5f5976dc55-sg285"
Apr 16 15:02:26.146392 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:26.146307 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1bd51ae-9001-494f-85cb-1225af7bb1e9-oauth-serving-cert\") pod \"console-5f5976dc55-sg285\" (UID: \"a1bd51ae-9001-494f-85cb-1225af7bb1e9\") " pod="openshift-console/console-5f5976dc55-sg285"
Apr 16 15:02:26.146392 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:26.146325 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsg6l\" (UniqueName: \"kubernetes.io/projected/a1bd51ae-9001-494f-85cb-1225af7bb1e9-kube-api-access-zsg6l\") pod \"console-5f5976dc55-sg285\" (UID: \"a1bd51ae-9001-494f-85cb-1225af7bb1e9\") " pod="openshift-console/console-5f5976dc55-sg285"
Apr 16 15:02:26.146392 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:26.146345 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1bd51ae-9001-494f-85cb-1225af7bb1e9-service-ca\") pod \"console-5f5976dc55-sg285\" (UID: \"a1bd51ae-9001-494f-85cb-1225af7bb1e9\") " pod="openshift-console/console-5f5976dc55-sg285"
Apr 16 15:02:26.146525 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:26.146431 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1bd51ae-9001-494f-85cb-1225af7bb1e9-console-serving-cert\") pod \"console-5f5976dc55-sg285\" (UID: \"a1bd51ae-9001-494f-85cb-1225af7bb1e9\") " pod="openshift-console/console-5f5976dc55-sg285"
Apr 16 15:02:26.146525 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:26.146464 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName:
\"kubernetes.io/configmap/a1bd51ae-9001-494f-85cb-1225af7bb1e9-trusted-ca-bundle\") pod \"console-5f5976dc55-sg285\" (UID: \"a1bd51ae-9001-494f-85cb-1225af7bb1e9\") " pod="openshift-console/console-5f5976dc55-sg285" Apr 16 15:02:26.247382 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:26.247358 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1bd51ae-9001-494f-85cb-1225af7bb1e9-service-ca\") pod \"console-5f5976dc55-sg285\" (UID: \"a1bd51ae-9001-494f-85cb-1225af7bb1e9\") " pod="openshift-console/console-5f5976dc55-sg285" Apr 16 15:02:26.247484 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:26.247406 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1bd51ae-9001-494f-85cb-1225af7bb1e9-console-serving-cert\") pod \"console-5f5976dc55-sg285\" (UID: \"a1bd51ae-9001-494f-85cb-1225af7bb1e9\") " pod="openshift-console/console-5f5976dc55-sg285" Apr 16 15:02:26.247686 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:26.247644 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1bd51ae-9001-494f-85cb-1225af7bb1e9-trusted-ca-bundle\") pod \"console-5f5976dc55-sg285\" (UID: \"a1bd51ae-9001-494f-85cb-1225af7bb1e9\") " pod="openshift-console/console-5f5976dc55-sg285" Apr 16 15:02:26.247755 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:26.247701 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a1bd51ae-9001-494f-85cb-1225af7bb1e9-console-config\") pod \"console-5f5976dc55-sg285\" (UID: \"a1bd51ae-9001-494f-85cb-1225af7bb1e9\") " pod="openshift-console/console-5f5976dc55-sg285" Apr 16 15:02:26.247755 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:26.247739 2568 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1bd51ae-9001-494f-85cb-1225af7bb1e9-console-oauth-config\") pod \"console-5f5976dc55-sg285\" (UID: \"a1bd51ae-9001-494f-85cb-1225af7bb1e9\") " pod="openshift-console/console-5f5976dc55-sg285" Apr 16 15:02:26.247867 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:26.247756 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1bd51ae-9001-494f-85cb-1225af7bb1e9-oauth-serving-cert\") pod \"console-5f5976dc55-sg285\" (UID: \"a1bd51ae-9001-494f-85cb-1225af7bb1e9\") " pod="openshift-console/console-5f5976dc55-sg285" Apr 16 15:02:26.247867 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:26.247782 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zsg6l\" (UniqueName: \"kubernetes.io/projected/a1bd51ae-9001-494f-85cb-1225af7bb1e9-kube-api-access-zsg6l\") pod \"console-5f5976dc55-sg285\" (UID: \"a1bd51ae-9001-494f-85cb-1225af7bb1e9\") " pod="openshift-console/console-5f5976dc55-sg285" Apr 16 15:02:26.248092 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:26.248067 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1bd51ae-9001-494f-85cb-1225af7bb1e9-service-ca\") pod \"console-5f5976dc55-sg285\" (UID: \"a1bd51ae-9001-494f-85cb-1225af7bb1e9\") " pod="openshift-console/console-5f5976dc55-sg285" Apr 16 15:02:26.248448 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:26.248423 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a1bd51ae-9001-494f-85cb-1225af7bb1e9-console-config\") pod \"console-5f5976dc55-sg285\" (UID: \"a1bd51ae-9001-494f-85cb-1225af7bb1e9\") " pod="openshift-console/console-5f5976dc55-sg285" Apr 16 15:02:26.248536 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:26.248485 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1bd51ae-9001-494f-85cb-1225af7bb1e9-trusted-ca-bundle\") pod \"console-5f5976dc55-sg285\" (UID: \"a1bd51ae-9001-494f-85cb-1225af7bb1e9\") " pod="openshift-console/console-5f5976dc55-sg285" Apr 16 15:02:26.248690 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:26.248665 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1bd51ae-9001-494f-85cb-1225af7bb1e9-oauth-serving-cert\") pod \"console-5f5976dc55-sg285\" (UID: \"a1bd51ae-9001-494f-85cb-1225af7bb1e9\") " pod="openshift-console/console-5f5976dc55-sg285" Apr 16 15:02:26.250098 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:26.250081 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1bd51ae-9001-494f-85cb-1225af7bb1e9-console-serving-cert\") pod \"console-5f5976dc55-sg285\" (UID: \"a1bd51ae-9001-494f-85cb-1225af7bb1e9\") " pod="openshift-console/console-5f5976dc55-sg285" Apr 16 15:02:26.250217 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:26.250201 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1bd51ae-9001-494f-85cb-1225af7bb1e9-console-oauth-config\") pod \"console-5f5976dc55-sg285\" (UID: \"a1bd51ae-9001-494f-85cb-1225af7bb1e9\") " pod="openshift-console/console-5f5976dc55-sg285" Apr 16 15:02:26.257059 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:26.257035 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsg6l\" (UniqueName: \"kubernetes.io/projected/a1bd51ae-9001-494f-85cb-1225af7bb1e9-kube-api-access-zsg6l\") pod \"console-5f5976dc55-sg285\" (UID: \"a1bd51ae-9001-494f-85cb-1225af7bb1e9\") " pod="openshift-console/console-5f5976dc55-sg285" Apr 16 15:02:26.309971 ip-10-0-140-83 
kubenswrapper[2568]: I0416 15:02:26.309946 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f5976dc55-sg285" Apr 16 15:02:26.652188 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:26.652163 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f5976dc55-sg285"] Apr 16 15:02:26.654409 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:02:26.654376 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1bd51ae_9001_494f_85cb_1225af7bb1e9.slice/crio-46a551e18d9344a8c761204b74cae48b7f14b8689450ebbc61712685c658ebc8 WatchSource:0}: Error finding container 46a551e18d9344a8c761204b74cae48b7f14b8689450ebbc61712685c658ebc8: Status 404 returned error can't find the container with id 46a551e18d9344a8c761204b74cae48b7f14b8689450ebbc61712685c658ebc8 Apr 16 15:02:26.728210 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:26.728179 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f5976dc55-sg285" event={"ID":"a1bd51ae-9001-494f-85cb-1225af7bb1e9","Type":"ContainerStarted","Data":"46a551e18d9344a8c761204b74cae48b7f14b8689450ebbc61712685c658ebc8"} Apr 16 15:02:26.730065 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:26.730043 2568 generic.go:358] "Generic (PLEG): container finished" podID="7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f" containerID="0e042c55ee1a041ed53b184380da73dac147a7d872475265109013b6a01cc931" exitCode=0 Apr 16 15:02:26.730152 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:26.730122 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2zng" event={"ID":"7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f","Type":"ContainerDied","Data":"0e042c55ee1a041ed53b184380da73dac147a7d872475265109013b6a01cc931"} Apr 16 15:02:27.737910 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:27.737855 2568 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f5976dc55-sg285" event={"ID":"a1bd51ae-9001-494f-85cb-1225af7bb1e9","Type":"ContainerStarted","Data":"60f9f3eb8d8fb72f61f3dfac2fca98d34d9cc1f42d419cf73a1f3b69dc42456e"} Apr 16 15:02:27.758073 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:27.758016 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5f5976dc55-sg285" podStartSLOduration=2.757999698 podStartE2EDuration="2.757999698s" podCreationTimestamp="2026-04-16 15:02:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:02:27.75612046 +0000 UTC m=+586.520072571" watchObservedRunningTime="2026-04-16 15:02:27.757999698 +0000 UTC m=+586.521951810" Apr 16 15:02:27.865448 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:27.865429 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2zng" Apr 16 15:02:27.963108 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:27.963079 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f-bundle\") pod \"7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f\" (UID: \"7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f\") " Apr 16 15:02:27.963228 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:27.963148 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvq52\" (UniqueName: \"kubernetes.io/projected/7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f-kube-api-access-hvq52\") pod \"7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f\" (UID: \"7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f\") " Apr 16 15:02:27.963228 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:27.963185 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for 
volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f-util\") pod \"7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f\" (UID: \"7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f\") " Apr 16 15:02:27.963855 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:27.963820 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f-bundle" (OuterVolumeSpecName: "bundle") pod "7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f" (UID: "7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:02:27.965273 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:27.965254 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f-kube-api-access-hvq52" (OuterVolumeSpecName: "kube-api-access-hvq52") pod "7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f" (UID: "7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f"). InnerVolumeSpecName "kube-api-access-hvq52". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:02:27.971593 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:27.971568 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f-util" (OuterVolumeSpecName: "util") pod "7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f" (UID: "7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:02:28.064780 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:28.064718 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hvq52\" (UniqueName: \"kubernetes.io/projected/7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f-kube-api-access-hvq52\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:02:28.064780 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:28.064743 2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f-util\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:02:28.064780 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:28.064757 2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f-bundle\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:02:28.744113 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:28.744080 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2zng" event={"ID":"7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f","Type":"ContainerDied","Data":"b7331f2f8ca4b9a85096794e6dff2f1cc389fec301814ce1bb73bb518c5777b7"} Apr 16 15:02:28.744113 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:28.744117 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7331f2f8ca4b9a85096794e6dff2f1cc389fec301814ce1bb73bb518c5777b7" Apr 16 15:02:28.744496 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:28.744286 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2zng" Apr 16 15:02:36.311874 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:36.311801 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5f5976dc55-sg285" Apr 16 15:02:36.312474 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:36.311881 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5f5976dc55-sg285" Apr 16 15:02:36.316856 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:36.316830 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5f5976dc55-sg285" Apr 16 15:02:36.777903 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:36.777870 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5f5976dc55-sg285" Apr 16 15:02:36.828138 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:36.828107 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-84769dbb7f-vqdpj"] Apr 16 15:02:41.773337 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:41.773311 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bff4l_4e495dc8-7e20-4024-b33a-c9b6c6c1291f/ovn-acl-logging/0.log" Apr 16 15:02:41.777289 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:41.777269 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bff4l_4e495dc8-7e20-4024-b33a-c9b6c6c1291f/ovn-acl-logging/0.log" Apr 16 15:02:46.282777 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:46.282738 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4ghz5"] Apr 16 15:02:46.284969 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:46.283172 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f" containerName="pull" Apr 16 15:02:46.284969 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:46.283187 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f" containerName="pull" Apr 16 15:02:46.284969 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:46.283198 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f" containerName="extract" Apr 16 15:02:46.284969 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:46.283203 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f" containerName="extract" Apr 16 15:02:46.284969 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:46.283214 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f" containerName="util" Apr 16 15:02:46.284969 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:46.283220 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f" containerName="util" Apr 16 15:02:46.284969 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:46.283287 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="7d3cd92c-9a47-4ecf-b5d5-eb4fbca1251f" containerName="extract" Apr 16 15:02:46.286140 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:46.286121 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4ghz5" Apr 16 15:02:46.288400 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:46.288370 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 15:02:46.288554 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:46.288465 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 15:02:46.289243 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:46.289225 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-pzc87\"" Apr 16 15:02:46.293750 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:46.293713 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4ghz5"] Apr 16 15:02:46.399822 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:46.399794 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pdn4\" (UniqueName: \"kubernetes.io/projected/efd37792-1227-4c2f-bb2d-107ec2d1b408-kube-api-access-7pdn4\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4ghz5\" (UID: \"efd37792-1227-4c2f-bb2d-107ec2d1b408\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4ghz5" Apr 16 15:02:46.399951 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:46.399843 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/efd37792-1227-4c2f-bb2d-107ec2d1b408-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4ghz5\" (UID: \"efd37792-1227-4c2f-bb2d-107ec2d1b408\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4ghz5" 
Apr 16 15:02:46.399951 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:46.399875 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/efd37792-1227-4c2f-bb2d-107ec2d1b408-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4ghz5\" (UID: \"efd37792-1227-4c2f-bb2d-107ec2d1b408\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4ghz5" Apr 16 15:02:46.501136 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:46.501109 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/efd37792-1227-4c2f-bb2d-107ec2d1b408-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4ghz5\" (UID: \"efd37792-1227-4c2f-bb2d-107ec2d1b408\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4ghz5" Apr 16 15:02:46.501243 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:46.501161 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7pdn4\" (UniqueName: \"kubernetes.io/projected/efd37792-1227-4c2f-bb2d-107ec2d1b408-kube-api-access-7pdn4\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4ghz5\" (UID: \"efd37792-1227-4c2f-bb2d-107ec2d1b408\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4ghz5" Apr 16 15:02:46.501243 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:46.501214 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/efd37792-1227-4c2f-bb2d-107ec2d1b408-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4ghz5\" (UID: \"efd37792-1227-4c2f-bb2d-107ec2d1b408\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4ghz5" Apr 16 15:02:46.501509 ip-10-0-140-83 kubenswrapper[2568]: 
I0416 15:02:46.501487 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/efd37792-1227-4c2f-bb2d-107ec2d1b408-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4ghz5\" (UID: \"efd37792-1227-4c2f-bb2d-107ec2d1b408\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4ghz5" Apr 16 15:02:46.501557 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:46.501540 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/efd37792-1227-4c2f-bb2d-107ec2d1b408-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4ghz5\" (UID: \"efd37792-1227-4c2f-bb2d-107ec2d1b408\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4ghz5" Apr 16 15:02:46.509363 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:46.509337 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pdn4\" (UniqueName: \"kubernetes.io/projected/efd37792-1227-4c2f-bb2d-107ec2d1b408-kube-api-access-7pdn4\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4ghz5\" (UID: \"efd37792-1227-4c2f-bb2d-107ec2d1b408\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4ghz5" Apr 16 15:02:46.597391 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:46.597365 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4ghz5" Apr 16 15:02:46.716864 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:46.716717 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4ghz5"] Apr 16 15:02:46.719615 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:02:46.719591 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefd37792_1227_4c2f_bb2d_107ec2d1b408.slice/crio-aa685d777a5955963792702d24838beb5e04a3ad3ad065bbc45fe98cebb740a5 WatchSource:0}: Error finding container aa685d777a5955963792702d24838beb5e04a3ad3ad065bbc45fe98cebb740a5: Status 404 returned error can't find the container with id aa685d777a5955963792702d24838beb5e04a3ad3ad065bbc45fe98cebb740a5 Apr 16 15:02:46.812047 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:46.812013 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4ghz5" event={"ID":"efd37792-1227-4c2f-bb2d-107ec2d1b408","Type":"ContainerStarted","Data":"754a52d7ea17203ff23aa77d61e52800d5d9f9cd269a0be40f74f19f012faf97"} Apr 16 15:02:46.812193 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:46.812054 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4ghz5" event={"ID":"efd37792-1227-4c2f-bb2d-107ec2d1b408","Type":"ContainerStarted","Data":"aa685d777a5955963792702d24838beb5e04a3ad3ad065bbc45fe98cebb740a5"} Apr 16 15:02:47.816912 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:47.816822 2568 generic.go:358] "Generic (PLEG): container finished" podID="efd37792-1227-4c2f-bb2d-107ec2d1b408" containerID="754a52d7ea17203ff23aa77d61e52800d5d9f9cd269a0be40f74f19f012faf97" exitCode=0 Apr 16 15:02:47.817340 ip-10-0-140-83 kubenswrapper[2568]: 
I0416 15:02:47.816914 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4ghz5" event={"ID":"efd37792-1227-4c2f-bb2d-107ec2d1b408","Type":"ContainerDied","Data":"754a52d7ea17203ff23aa77d61e52800d5d9f9cd269a0be40f74f19f012faf97"} Apr 16 15:02:50.829425 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:50.829395 2568 generic.go:358] "Generic (PLEG): container finished" podID="efd37792-1227-4c2f-bb2d-107ec2d1b408" containerID="a61db4d189671c8dfc8c955531f443c79ad332af640c5e90ffde5b93546323a2" exitCode=0 Apr 16 15:02:50.829788 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:50.829482 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4ghz5" event={"ID":"efd37792-1227-4c2f-bb2d-107ec2d1b408","Type":"ContainerDied","Data":"a61db4d189671c8dfc8c955531f443c79ad332af640c5e90ffde5b93546323a2"} Apr 16 15:02:51.835345 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:51.835308 2568 generic.go:358] "Generic (PLEG): container finished" podID="efd37792-1227-4c2f-bb2d-107ec2d1b408" containerID="8aa1943ad9c460431ada2e4257a1f74945e5c35b8d7cb102b5b066a1439de69f" exitCode=0 Apr 16 15:02:51.835707 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:51.835380 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4ghz5" event={"ID":"efd37792-1227-4c2f-bb2d-107ec2d1b408","Type":"ContainerDied","Data":"8aa1943ad9c460431ada2e4257a1f74945e5c35b8d7cb102b5b066a1439de69f"} Apr 16 15:02:52.963953 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:52.963929 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4ghz5" Apr 16 15:02:53.155210 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:53.155152 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/efd37792-1227-4c2f-bb2d-107ec2d1b408-util\") pod \"efd37792-1227-4c2f-bb2d-107ec2d1b408\" (UID: \"efd37792-1227-4c2f-bb2d-107ec2d1b408\") " Apr 16 15:02:53.155304 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:53.155222 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pdn4\" (UniqueName: \"kubernetes.io/projected/efd37792-1227-4c2f-bb2d-107ec2d1b408-kube-api-access-7pdn4\") pod \"efd37792-1227-4c2f-bb2d-107ec2d1b408\" (UID: \"efd37792-1227-4c2f-bb2d-107ec2d1b408\") " Apr 16 15:02:53.155304 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:53.155250 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/efd37792-1227-4c2f-bb2d-107ec2d1b408-bundle\") pod \"efd37792-1227-4c2f-bb2d-107ec2d1b408\" (UID: \"efd37792-1227-4c2f-bb2d-107ec2d1b408\") " Apr 16 15:02:53.155720 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:53.155694 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efd37792-1227-4c2f-bb2d-107ec2d1b408-bundle" (OuterVolumeSpecName: "bundle") pod "efd37792-1227-4c2f-bb2d-107ec2d1b408" (UID: "efd37792-1227-4c2f-bb2d-107ec2d1b408"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:02:53.157409 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:53.157386 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efd37792-1227-4c2f-bb2d-107ec2d1b408-kube-api-access-7pdn4" (OuterVolumeSpecName: "kube-api-access-7pdn4") pod "efd37792-1227-4c2f-bb2d-107ec2d1b408" (UID: "efd37792-1227-4c2f-bb2d-107ec2d1b408"). InnerVolumeSpecName "kube-api-access-7pdn4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:02:53.160332 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:53.160307 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efd37792-1227-4c2f-bb2d-107ec2d1b408-util" (OuterVolumeSpecName: "util") pod "efd37792-1227-4c2f-bb2d-107ec2d1b408" (UID: "efd37792-1227-4c2f-bb2d-107ec2d1b408"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:02:53.256669 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:53.256643 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7pdn4\" (UniqueName: \"kubernetes.io/projected/efd37792-1227-4c2f-bb2d-107ec2d1b408-kube-api-access-7pdn4\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:02:53.256669 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:53.256666 2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/efd37792-1227-4c2f-bb2d-107ec2d1b408-bundle\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:02:53.256788 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:53.256675 2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/efd37792-1227-4c2f-bb2d-107ec2d1b408-util\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:02:53.844551 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:53.844520 2568 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4ghz5" event={"ID":"efd37792-1227-4c2f-bb2d-107ec2d1b408","Type":"ContainerDied","Data":"aa685d777a5955963792702d24838beb5e04a3ad3ad065bbc45fe98cebb740a5"} Apr 16 15:02:53.844680 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:53.844555 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa685d777a5955963792702d24838beb5e04a3ad3ad065bbc45fe98cebb740a5" Apr 16 15:02:53.844680 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:53.844564 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4ghz5" Apr 16 15:02:58.705406 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:58.705361 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-x2x46"] Apr 16 15:02:58.705959 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:58.705937 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="efd37792-1227-4c2f-bb2d-107ec2d1b408" containerName="util" Apr 16 15:02:58.706035 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:58.705961 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="efd37792-1227-4c2f-bb2d-107ec2d1b408" containerName="util" Apr 16 15:02:58.706035 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:58.705979 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="efd37792-1227-4c2f-bb2d-107ec2d1b408" containerName="extract" Apr 16 15:02:58.706035 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:58.705988 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="efd37792-1227-4c2f-bb2d-107ec2d1b408" containerName="extract" Apr 16 15:02:58.706035 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:58.706023 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="efd37792-1227-4c2f-bb2d-107ec2d1b408" containerName="pull" Apr 16 15:02:58.706035 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:58.706032 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="efd37792-1227-4c2f-bb2d-107ec2d1b408" containerName="pull" Apr 16 15:02:58.706277 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:58.706135 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="efd37792-1227-4c2f-bb2d-107ec2d1b408" containerName="extract" Apr 16 15:02:58.712307 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:58.712283 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-x2x46" Apr 16 15:02:58.714993 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:58.714970 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 16 15:02:58.715155 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:58.715109 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-85ftn\"" Apr 16 15:02:58.715328 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:58.715293 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 16 15:02:58.717171 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:58.717137 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-x2x46"] Apr 16 15:02:58.797613 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:58.797583 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f34243be-0754-4e19-9c0d-be3cddfac28c-tmp\") pod \"openshift-lws-operator-bfc7f696d-x2x46\" (UID: \"f34243be-0754-4e19-9c0d-be3cddfac28c\") " 
pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-x2x46" Apr 16 15:02:58.797721 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:58.797659 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4gwj\" (UniqueName: \"kubernetes.io/projected/f34243be-0754-4e19-9c0d-be3cddfac28c-kube-api-access-k4gwj\") pod \"openshift-lws-operator-bfc7f696d-x2x46\" (UID: \"f34243be-0754-4e19-9c0d-be3cddfac28c\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-x2x46" Apr 16 15:02:58.898308 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:58.898280 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f34243be-0754-4e19-9c0d-be3cddfac28c-tmp\") pod \"openshift-lws-operator-bfc7f696d-x2x46\" (UID: \"f34243be-0754-4e19-9c0d-be3cddfac28c\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-x2x46" Apr 16 15:02:58.898404 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:58.898333 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4gwj\" (UniqueName: \"kubernetes.io/projected/f34243be-0754-4e19-9c0d-be3cddfac28c-kube-api-access-k4gwj\") pod \"openshift-lws-operator-bfc7f696d-x2x46\" (UID: \"f34243be-0754-4e19-9c0d-be3cddfac28c\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-x2x46" Apr 16 15:02:58.898761 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:58.898742 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f34243be-0754-4e19-9c0d-be3cddfac28c-tmp\") pod \"openshift-lws-operator-bfc7f696d-x2x46\" (UID: \"f34243be-0754-4e19-9c0d-be3cddfac28c\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-x2x46" Apr 16 15:02:58.908536 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:58.908509 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-k4gwj\" (UniqueName: \"kubernetes.io/projected/f34243be-0754-4e19-9c0d-be3cddfac28c-kube-api-access-k4gwj\") pod \"openshift-lws-operator-bfc7f696d-x2x46\" (UID: \"f34243be-0754-4e19-9c0d-be3cddfac28c\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-x2x46" Apr 16 15:02:59.036673 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:59.036608 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-x2x46" Apr 16 15:02:59.158345 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:59.158322 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-x2x46"] Apr 16 15:02:59.160279 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:02:59.160256 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf34243be_0754_4e19_9c0d_be3cddfac28c.slice/crio-ec2f1b4862567693b52442cbc91a608a61f557cfbd46c5eda97d4871a7a42b6f WatchSource:0}: Error finding container ec2f1b4862567693b52442cbc91a608a61f557cfbd46c5eda97d4871a7a42b6f: Status 404 returned error can't find the container with id ec2f1b4862567693b52442cbc91a608a61f557cfbd46c5eda97d4871a7a42b6f Apr 16 15:02:59.867369 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:02:59.867334 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-x2x46" event={"ID":"f34243be-0754-4e19-9c0d-be3cddfac28c","Type":"ContainerStarted","Data":"ec2f1b4862567693b52442cbc91a608a61f557cfbd46c5eda97d4871a7a42b6f"} Apr 16 15:03:01.853566 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:01.853530 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-84769dbb7f-vqdpj" podUID="884e3635-5dae-45b5-bb62-e9806440468c" containerName="console" containerID="cri-o://13907b7ba80c35d61663ad618ffad9fb2debac634ffa577a59408c6ca21847aa" 
gracePeriod=15 Apr 16 15:03:01.876518 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:01.876496 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-x2x46" event={"ID":"f34243be-0754-4e19-9c0d-be3cddfac28c","Type":"ContainerStarted","Data":"64293020f05780cf1ab6e8cc1f9c5589a3b54a23e0125138ca804ee30df56895"} Apr 16 15:03:01.896262 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:01.896203 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-x2x46" podStartSLOduration=1.579250004 podStartE2EDuration="3.896187888s" podCreationTimestamp="2026-04-16 15:02:58 +0000 UTC" firstStartedPulling="2026-04-16 15:02:59.161685495 +0000 UTC m=+617.925637584" lastFinishedPulling="2026-04-16 15:03:01.478623365 +0000 UTC m=+620.242575468" observedRunningTime="2026-04-16 15:03:01.893824552 +0000 UTC m=+620.657776663" watchObservedRunningTime="2026-04-16 15:03:01.896187888 +0000 UTC m=+620.660140003" Apr 16 15:03:02.094128 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:02.094109 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-84769dbb7f-vqdpj_884e3635-5dae-45b5-bb62-e9806440468c/console/0.log" Apr 16 15:03:02.094225 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:02.094166 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-84769dbb7f-vqdpj" Apr 16 15:03:02.121337 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:02.121281 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/884e3635-5dae-45b5-bb62-e9806440468c-service-ca\") pod \"884e3635-5dae-45b5-bb62-e9806440468c\" (UID: \"884e3635-5dae-45b5-bb62-e9806440468c\") " Apr 16 15:03:02.121337 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:02.121323 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vzlw\" (UniqueName: \"kubernetes.io/projected/884e3635-5dae-45b5-bb62-e9806440468c-kube-api-access-8vzlw\") pod \"884e3635-5dae-45b5-bb62-e9806440468c\" (UID: \"884e3635-5dae-45b5-bb62-e9806440468c\") " Apr 16 15:03:02.121487 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:02.121363 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/884e3635-5dae-45b5-bb62-e9806440468c-trusted-ca-bundle\") pod \"884e3635-5dae-45b5-bb62-e9806440468c\" (UID: \"884e3635-5dae-45b5-bb62-e9806440468c\") " Apr 16 15:03:02.121487 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:02.121410 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/884e3635-5dae-45b5-bb62-e9806440468c-console-oauth-config\") pod \"884e3635-5dae-45b5-bb62-e9806440468c\" (UID: \"884e3635-5dae-45b5-bb62-e9806440468c\") " Apr 16 15:03:02.121487 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:02.121468 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/884e3635-5dae-45b5-bb62-e9806440468c-oauth-serving-cert\") pod \"884e3635-5dae-45b5-bb62-e9806440468c\" (UID: \"884e3635-5dae-45b5-bb62-e9806440468c\") " Apr 16 15:03:02.121633 
ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:02.121522 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/884e3635-5dae-45b5-bb62-e9806440468c-console-config\") pod \"884e3635-5dae-45b5-bb62-e9806440468c\" (UID: \"884e3635-5dae-45b5-bb62-e9806440468c\") " Apr 16 15:03:02.121688 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:02.121648 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/884e3635-5dae-45b5-bb62-e9806440468c-console-serving-cert\") pod \"884e3635-5dae-45b5-bb62-e9806440468c\" (UID: \"884e3635-5dae-45b5-bb62-e9806440468c\") " Apr 16 15:03:02.121845 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:02.121816 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/884e3635-5dae-45b5-bb62-e9806440468c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "884e3635-5dae-45b5-bb62-e9806440468c" (UID: "884e3635-5dae-45b5-bb62-e9806440468c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:03:02.122054 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:02.122036 2568 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/884e3635-5dae-45b5-bb62-e9806440468c-trusted-ca-bundle\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:03:02.122268 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:02.122233 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/884e3635-5dae-45b5-bb62-e9806440468c-console-config" (OuterVolumeSpecName: "console-config") pod "884e3635-5dae-45b5-bb62-e9806440468c" (UID: "884e3635-5dae-45b5-bb62-e9806440468c"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:03:02.122388 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:02.122347 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/884e3635-5dae-45b5-bb62-e9806440468c-service-ca" (OuterVolumeSpecName: "service-ca") pod "884e3635-5dae-45b5-bb62-e9806440468c" (UID: "884e3635-5dae-45b5-bb62-e9806440468c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:03:02.122513 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:02.122464 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/884e3635-5dae-45b5-bb62-e9806440468c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "884e3635-5dae-45b5-bb62-e9806440468c" (UID: "884e3635-5dae-45b5-bb62-e9806440468c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:03:02.123666 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:02.123638 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/884e3635-5dae-45b5-bb62-e9806440468c-kube-api-access-8vzlw" (OuterVolumeSpecName: "kube-api-access-8vzlw") pod "884e3635-5dae-45b5-bb62-e9806440468c" (UID: "884e3635-5dae-45b5-bb62-e9806440468c"). InnerVolumeSpecName "kube-api-access-8vzlw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:03:02.124149 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:02.124123 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/884e3635-5dae-45b5-bb62-e9806440468c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "884e3635-5dae-45b5-bb62-e9806440468c" (UID: "884e3635-5dae-45b5-bb62-e9806440468c"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:03:02.124486 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:02.124467 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/884e3635-5dae-45b5-bb62-e9806440468c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "884e3635-5dae-45b5-bb62-e9806440468c" (UID: "884e3635-5dae-45b5-bb62-e9806440468c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:03:02.223248 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:02.223225 2568 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/884e3635-5dae-45b5-bb62-e9806440468c-oauth-serving-cert\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:03:02.223329 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:02.223251 2568 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/884e3635-5dae-45b5-bb62-e9806440468c-console-config\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:03:02.223329 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:02.223266 2568 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/884e3635-5dae-45b5-bb62-e9806440468c-console-serving-cert\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:03:02.223329 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:02.223280 2568 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/884e3635-5dae-45b5-bb62-e9806440468c-service-ca\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:03:02.223329 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:02.223295 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8vzlw\" (UniqueName: 
\"kubernetes.io/projected/884e3635-5dae-45b5-bb62-e9806440468c-kube-api-access-8vzlw\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:03:02.223329 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:02.223309 2568 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/884e3635-5dae-45b5-bb62-e9806440468c-console-oauth-config\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:03:02.881346 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:02.881318 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-84769dbb7f-vqdpj_884e3635-5dae-45b5-bb62-e9806440468c/console/0.log" Apr 16 15:03:02.881797 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:02.881355 2568 generic.go:358] "Generic (PLEG): container finished" podID="884e3635-5dae-45b5-bb62-e9806440468c" containerID="13907b7ba80c35d61663ad618ffad9fb2debac634ffa577a59408c6ca21847aa" exitCode=2 Apr 16 15:03:02.881797 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:02.881423 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-84769dbb7f-vqdpj" Apr 16 15:03:02.881797 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:02.881443 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84769dbb7f-vqdpj" event={"ID":"884e3635-5dae-45b5-bb62-e9806440468c","Type":"ContainerDied","Data":"13907b7ba80c35d61663ad618ffad9fb2debac634ffa577a59408c6ca21847aa"} Apr 16 15:03:02.881797 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:02.881490 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84769dbb7f-vqdpj" event={"ID":"884e3635-5dae-45b5-bb62-e9806440468c","Type":"ContainerDied","Data":"063b6b48c55335e08f1a7d5cff6ab37b788daec05aff0935e7bb90a4b601ffae"} Apr 16 15:03:02.881797 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:02.881512 2568 scope.go:117] "RemoveContainer" containerID="13907b7ba80c35d61663ad618ffad9fb2debac634ffa577a59408c6ca21847aa" Apr 16 15:03:02.889870 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:02.889856 2568 scope.go:117] "RemoveContainer" containerID="13907b7ba80c35d61663ad618ffad9fb2debac634ffa577a59408c6ca21847aa" Apr 16 15:03:02.890113 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:03:02.890095 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13907b7ba80c35d61663ad618ffad9fb2debac634ffa577a59408c6ca21847aa\": container with ID starting with 13907b7ba80c35d61663ad618ffad9fb2debac634ffa577a59408c6ca21847aa not found: ID does not exist" containerID="13907b7ba80c35d61663ad618ffad9fb2debac634ffa577a59408c6ca21847aa" Apr 16 15:03:02.890162 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:02.890123 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13907b7ba80c35d61663ad618ffad9fb2debac634ffa577a59408c6ca21847aa"} err="failed to get container status \"13907b7ba80c35d61663ad618ffad9fb2debac634ffa577a59408c6ca21847aa\": rpc error: code = 
NotFound desc = could not find container \"13907b7ba80c35d61663ad618ffad9fb2debac634ffa577a59408c6ca21847aa\": container with ID starting with 13907b7ba80c35d61663ad618ffad9fb2debac634ffa577a59408c6ca21847aa not found: ID does not exist" Apr 16 15:03:02.901636 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:02.901612 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-84769dbb7f-vqdpj"] Apr 16 15:03:02.904463 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:02.904444 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-84769dbb7f-vqdpj"] Apr 16 15:03:03.863745 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:03.863711 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="884e3635-5dae-45b5-bb62-e9806440468c" path="/var/lib/kubelet/pods/884e3635-5dae-45b5-bb62-e9806440468c/volumes" Apr 16 15:03:13.496777 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:13.496744 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t4bdx"] Apr 16 15:03:13.497244 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:13.497088 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="884e3635-5dae-45b5-bb62-e9806440468c" containerName="console" Apr 16 15:03:13.497244 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:13.497099 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="884e3635-5dae-45b5-bb62-e9806440468c" containerName="console" Apr 16 15:03:13.497244 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:13.497158 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="884e3635-5dae-45b5-bb62-e9806440468c" containerName="console" Apr 16 15:03:13.501623 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:13.501605 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t4bdx" Apr 16 15:03:13.504087 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:13.504023 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 15:03:13.504234 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:13.504050 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 15:03:13.505387 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:13.505202 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-pzc87\"" Apr 16 15:03:13.507961 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:13.507941 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t4bdx"] Apr 16 15:03:13.599277 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:13.599256 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e477ef79-42eb-4650-9925-ecff5c1a9a67-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t4bdx\" (UID: \"e477ef79-42eb-4650-9925-ecff5c1a9a67\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t4bdx" Apr 16 15:03:13.599379 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:13.599289 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e477ef79-42eb-4650-9925-ecff5c1a9a67-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t4bdx\" (UID: \"e477ef79-42eb-4650-9925-ecff5c1a9a67\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t4bdx" Apr 16 15:03:13.599379 
ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:13.599312 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh2pr\" (UniqueName: \"kubernetes.io/projected/e477ef79-42eb-4650-9925-ecff5c1a9a67-kube-api-access-hh2pr\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t4bdx\" (UID: \"e477ef79-42eb-4650-9925-ecff5c1a9a67\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t4bdx" Apr 16 15:03:13.699685 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:13.699661 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e477ef79-42eb-4650-9925-ecff5c1a9a67-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t4bdx\" (UID: \"e477ef79-42eb-4650-9925-ecff5c1a9a67\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t4bdx" Apr 16 15:03:13.699776 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:13.699694 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e477ef79-42eb-4650-9925-ecff5c1a9a67-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t4bdx\" (UID: \"e477ef79-42eb-4650-9925-ecff5c1a9a67\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t4bdx" Apr 16 15:03:13.699776 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:13.699713 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hh2pr\" (UniqueName: \"kubernetes.io/projected/e477ef79-42eb-4650-9925-ecff5c1a9a67-kube-api-access-hh2pr\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t4bdx\" (UID: \"e477ef79-42eb-4650-9925-ecff5c1a9a67\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t4bdx" Apr 16 15:03:13.700038 ip-10-0-140-83 
kubenswrapper[2568]: I0416 15:03:13.700020 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e477ef79-42eb-4650-9925-ecff5c1a9a67-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t4bdx\" (UID: \"e477ef79-42eb-4650-9925-ecff5c1a9a67\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t4bdx" Apr 16 15:03:13.700117 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:13.700099 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e477ef79-42eb-4650-9925-ecff5c1a9a67-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t4bdx\" (UID: \"e477ef79-42eb-4650-9925-ecff5c1a9a67\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t4bdx" Apr 16 15:03:13.707500 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:13.707478 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh2pr\" (UniqueName: \"kubernetes.io/projected/e477ef79-42eb-4650-9925-ecff5c1a9a67-kube-api-access-hh2pr\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t4bdx\" (UID: \"e477ef79-42eb-4650-9925-ecff5c1a9a67\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t4bdx" Apr 16 15:03:13.812196 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:13.812144 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t4bdx" Apr 16 15:03:13.937613 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:13.937585 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t4bdx"] Apr 16 15:03:13.940224 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:03:13.940199 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode477ef79_42eb_4650_9925_ecff5c1a9a67.slice/crio-1f9b704835aaede93a6dfc05639aec742fdecb6a18303e93c361e19a73970c24 WatchSource:0}: Error finding container 1f9b704835aaede93a6dfc05639aec742fdecb6a18303e93c361e19a73970c24: Status 404 returned error can't find the container with id 1f9b704835aaede93a6dfc05639aec742fdecb6a18303e93c361e19a73970c24 Apr 16 15:03:14.925991 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:14.925950 2568 generic.go:358] "Generic (PLEG): container finished" podID="e477ef79-42eb-4650-9925-ecff5c1a9a67" containerID="53c36c52dc1f37df504466ffc8cce0ff5634fecd9f6c9c9bbe7f33cb0ab50923" exitCode=0 Apr 16 15:03:14.926346 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:14.926037 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t4bdx" event={"ID":"e477ef79-42eb-4650-9925-ecff5c1a9a67","Type":"ContainerDied","Data":"53c36c52dc1f37df504466ffc8cce0ff5634fecd9f6c9c9bbe7f33cb0ab50923"} Apr 16 15:03:14.926346 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:14.926076 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t4bdx" event={"ID":"e477ef79-42eb-4650-9925-ecff5c1a9a67","Type":"ContainerStarted","Data":"1f9b704835aaede93a6dfc05639aec742fdecb6a18303e93c361e19a73970c24"} Apr 16 15:03:15.931710 ip-10-0-140-83 kubenswrapper[2568]: I0416 
15:03:15.931682 2568 generic.go:358] "Generic (PLEG): container finished" podID="e477ef79-42eb-4650-9925-ecff5c1a9a67" containerID="e31f77a76c9a35ecd4bf8a463f4387780271e968c935653b8696222d401a20a0" exitCode=0 Apr 16 15:03:15.932057 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:15.931766 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t4bdx" event={"ID":"e477ef79-42eb-4650-9925-ecff5c1a9a67","Type":"ContainerDied","Data":"e31f77a76c9a35ecd4bf8a463f4387780271e968c935653b8696222d401a20a0"} Apr 16 15:03:16.937043 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:16.937008 2568 generic.go:358] "Generic (PLEG): container finished" podID="e477ef79-42eb-4650-9925-ecff5c1a9a67" containerID="2a098d30d7c33dd4d8ba5fd7d05f6858e441038217042da7b06b29e04b8f4677" exitCode=0 Apr 16 15:03:16.937400 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:16.937081 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t4bdx" event={"ID":"e477ef79-42eb-4650-9925-ecff5c1a9a67","Type":"ContainerDied","Data":"2a098d30d7c33dd4d8ba5fd7d05f6858e441038217042da7b06b29e04b8f4677"} Apr 16 15:03:18.064547 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:18.064529 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t4bdx" Apr 16 15:03:18.134531 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:18.134509 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hh2pr\" (UniqueName: \"kubernetes.io/projected/e477ef79-42eb-4650-9925-ecff5c1a9a67-kube-api-access-hh2pr\") pod \"e477ef79-42eb-4650-9925-ecff5c1a9a67\" (UID: \"e477ef79-42eb-4650-9925-ecff5c1a9a67\") " Apr 16 15:03:18.134627 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:18.134555 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e477ef79-42eb-4650-9925-ecff5c1a9a67-util\") pod \"e477ef79-42eb-4650-9925-ecff5c1a9a67\" (UID: \"e477ef79-42eb-4650-9925-ecff5c1a9a67\") " Apr 16 15:03:18.134627 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:18.134586 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e477ef79-42eb-4650-9925-ecff5c1a9a67-bundle\") pod \"e477ef79-42eb-4650-9925-ecff5c1a9a67\" (UID: \"e477ef79-42eb-4650-9925-ecff5c1a9a67\") " Apr 16 15:03:18.135381 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:18.135359 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e477ef79-42eb-4650-9925-ecff5c1a9a67-bundle" (OuterVolumeSpecName: "bundle") pod "e477ef79-42eb-4650-9925-ecff5c1a9a67" (UID: "e477ef79-42eb-4650-9925-ecff5c1a9a67"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:03:18.136432 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:18.136406 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e477ef79-42eb-4650-9925-ecff5c1a9a67-kube-api-access-hh2pr" (OuterVolumeSpecName: "kube-api-access-hh2pr") pod "e477ef79-42eb-4650-9925-ecff5c1a9a67" (UID: "e477ef79-42eb-4650-9925-ecff5c1a9a67"). InnerVolumeSpecName "kube-api-access-hh2pr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:03:18.140177 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:18.140156 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e477ef79-42eb-4650-9925-ecff5c1a9a67-util" (OuterVolumeSpecName: "util") pod "e477ef79-42eb-4650-9925-ecff5c1a9a67" (UID: "e477ef79-42eb-4650-9925-ecff5c1a9a67"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:03:18.236073 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:18.236020 2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e477ef79-42eb-4650-9925-ecff5c1a9a67-util\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:03:18.236073 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:18.236041 2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e477ef79-42eb-4650-9925-ecff5c1a9a67-bundle\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:03:18.236073 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:18.236052 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hh2pr\" (UniqueName: \"kubernetes.io/projected/e477ef79-42eb-4650-9925-ecff5c1a9a67-kube-api-access-hh2pr\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:03:18.947741 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:18.947705 2568 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t4bdx" event={"ID":"e477ef79-42eb-4650-9925-ecff5c1a9a67","Type":"ContainerDied","Data":"1f9b704835aaede93a6dfc05639aec742fdecb6a18303e93c361e19a73970c24"} Apr 16 15:03:18.947741 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:18.947746 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f9b704835aaede93a6dfc05639aec742fdecb6a18303e93c361e19a73970c24" Apr 16 15:03:18.947957 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:18.947723 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835t4bdx" Apr 16 15:03:28.466794 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:28.466759 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nwkv8"] Apr 16 15:03:28.467205 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:28.467113 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e477ef79-42eb-4650-9925-ecff5c1a9a67" containerName="pull" Apr 16 15:03:28.467205 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:28.467125 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="e477ef79-42eb-4650-9925-ecff5c1a9a67" containerName="pull" Apr 16 15:03:28.467205 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:28.467138 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e477ef79-42eb-4650-9925-ecff5c1a9a67" containerName="util" Apr 16 15:03:28.467205 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:28.467143 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="e477ef79-42eb-4650-9925-ecff5c1a9a67" containerName="util" Apr 16 15:03:28.467205 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:28.467154 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="e477ef79-42eb-4650-9925-ecff5c1a9a67" containerName="extract" Apr 16 15:03:28.467205 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:28.467160 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="e477ef79-42eb-4650-9925-ecff5c1a9a67" containerName="extract" Apr 16 15:03:28.467401 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:28.467222 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="e477ef79-42eb-4650-9925-ecff5c1a9a67" containerName="extract" Apr 16 15:03:28.471542 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:28.471525 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nwkv8" Apr 16 15:03:28.474118 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:28.474093 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 15:03:28.475073 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:28.475053 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 15:03:28.475169 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:28.475109 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-pzc87\"" Apr 16 15:03:28.480641 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:28.480611 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nwkv8"] Apr 16 15:03:28.512320 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:28.512300 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e93a9151-b6f3-4e5a-aa82-87de08912ce6-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nwkv8\" (UID: 
\"e93a9151-b6f3-4e5a-aa82-87de08912ce6\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nwkv8" Apr 16 15:03:28.512417 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:28.512330 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfvfj\" (UniqueName: \"kubernetes.io/projected/e93a9151-b6f3-4e5a-aa82-87de08912ce6-kube-api-access-mfvfj\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nwkv8\" (UID: \"e93a9151-b6f3-4e5a-aa82-87de08912ce6\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nwkv8" Apr 16 15:03:28.512417 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:28.512350 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e93a9151-b6f3-4e5a-aa82-87de08912ce6-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nwkv8\" (UID: \"e93a9151-b6f3-4e5a-aa82-87de08912ce6\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nwkv8" Apr 16 15:03:28.613078 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:28.613055 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e93a9151-b6f3-4e5a-aa82-87de08912ce6-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nwkv8\" (UID: \"e93a9151-b6f3-4e5a-aa82-87de08912ce6\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nwkv8" Apr 16 15:03:28.613169 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:28.613085 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mfvfj\" (UniqueName: \"kubernetes.io/projected/e93a9151-b6f3-4e5a-aa82-87de08912ce6-kube-api-access-mfvfj\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nwkv8\" (UID: 
\"e93a9151-b6f3-4e5a-aa82-87de08912ce6\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nwkv8" Apr 16 15:03:28.613169 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:28.613110 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e93a9151-b6f3-4e5a-aa82-87de08912ce6-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nwkv8\" (UID: \"e93a9151-b6f3-4e5a-aa82-87de08912ce6\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nwkv8" Apr 16 15:03:28.613474 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:28.613449 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e93a9151-b6f3-4e5a-aa82-87de08912ce6-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nwkv8\" (UID: \"e93a9151-b6f3-4e5a-aa82-87de08912ce6\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nwkv8" Apr 16 15:03:28.613515 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:28.613464 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e93a9151-b6f3-4e5a-aa82-87de08912ce6-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nwkv8\" (UID: \"e93a9151-b6f3-4e5a-aa82-87de08912ce6\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nwkv8" Apr 16 15:03:28.627547 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:28.627516 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfvfj\" (UniqueName: \"kubernetes.io/projected/e93a9151-b6f3-4e5a-aa82-87de08912ce6-kube-api-access-mfvfj\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nwkv8\" (UID: \"e93a9151-b6f3-4e5a-aa82-87de08912ce6\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nwkv8" Apr 16 15:03:28.782716 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:28.782653 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nwkv8" Apr 16 15:03:29.106494 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:29.106473 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nwkv8"] Apr 16 15:03:29.108613 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:03:29.108587 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode93a9151_b6f3_4e5a_aa82_87de08912ce6.slice/crio-803255fed99d6829eb24071e2ed3f8f40b431253c882d4855bc253853b3224a6 WatchSource:0}: Error finding container 803255fed99d6829eb24071e2ed3f8f40b431253c882d4855bc253853b3224a6: Status 404 returned error can't find the container with id 803255fed99d6829eb24071e2ed3f8f40b431253c882d4855bc253853b3224a6 Apr 16 15:03:29.110426 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:29.110405 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:03:29.987209 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:29.987129 2568 generic.go:358] "Generic (PLEG): container finished" podID="e93a9151-b6f3-4e5a-aa82-87de08912ce6" containerID="f38d52eb9479ad3b20c7fa7c9965845cb9fb32e1af5009b65e52611b0e694a59" exitCode=0 Apr 16 15:03:29.987545 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:29.987232 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nwkv8" event={"ID":"e93a9151-b6f3-4e5a-aa82-87de08912ce6","Type":"ContainerDied","Data":"f38d52eb9479ad3b20c7fa7c9965845cb9fb32e1af5009b65e52611b0e694a59"} Apr 16 15:03:29.987545 ip-10-0-140-83 
kubenswrapper[2568]: I0416 15:03:29.987273 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nwkv8" event={"ID":"e93a9151-b6f3-4e5a-aa82-87de08912ce6","Type":"ContainerStarted","Data":"803255fed99d6829eb24071e2ed3f8f40b431253c882d4855bc253853b3224a6"} Apr 16 15:03:30.711490 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:30.711465 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-ct5br"] Apr 16 15:03:30.714635 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:30.714620 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-ct5br" Apr 16 15:03:30.717063 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:30.717045 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-st5fv\"" Apr 16 15:03:30.717152 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:30.717076 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 16 15:03:30.717215 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:30.717153 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 16 15:03:30.730400 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:30.730376 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-ct5br"] Apr 16 15:03:30.831979 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:30.831950 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctzb8\" (UniqueName: \"kubernetes.io/projected/43a1a574-33fd-4361-ab57-67e61d536958-kube-api-access-ctzb8\") pod \"servicemesh-operator3-55f49c5f94-ct5br\" (UID: 
\"43a1a574-33fd-4361-ab57-67e61d536958\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-ct5br" Apr 16 15:03:30.832080 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:30.831988 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/43a1a574-33fd-4361-ab57-67e61d536958-operator-config\") pod \"servicemesh-operator3-55f49c5f94-ct5br\" (UID: \"43a1a574-33fd-4361-ab57-67e61d536958\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-ct5br" Apr 16 15:03:30.933021 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:30.932961 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctzb8\" (UniqueName: \"kubernetes.io/projected/43a1a574-33fd-4361-ab57-67e61d536958-kube-api-access-ctzb8\") pod \"servicemesh-operator3-55f49c5f94-ct5br\" (UID: \"43a1a574-33fd-4361-ab57-67e61d536958\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-ct5br" Apr 16 15:03:30.933021 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:30.932992 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/43a1a574-33fd-4361-ab57-67e61d536958-operator-config\") pod \"servicemesh-operator3-55f49c5f94-ct5br\" (UID: \"43a1a574-33fd-4361-ab57-67e61d536958\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-ct5br" Apr 16 15:03:30.935440 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:30.935416 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/43a1a574-33fd-4361-ab57-67e61d536958-operator-config\") pod \"servicemesh-operator3-55f49c5f94-ct5br\" (UID: \"43a1a574-33fd-4361-ab57-67e61d536958\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-ct5br" Apr 16 15:03:30.943714 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:30.943695 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctzb8\" (UniqueName: \"kubernetes.io/projected/43a1a574-33fd-4361-ab57-67e61d536958-kube-api-access-ctzb8\") pod \"servicemesh-operator3-55f49c5f94-ct5br\" (UID: \"43a1a574-33fd-4361-ab57-67e61d536958\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-ct5br" Apr 16 15:03:31.000033 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:31.000002 2568 generic.go:358] "Generic (PLEG): container finished" podID="e93a9151-b6f3-4e5a-aa82-87de08912ce6" containerID="d832ce0b629813a690a6293b5995a5d28c252f45271b1ef494c9f80d667c6659" exitCode=0 Apr 16 15:03:31.000384 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:31.000089 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nwkv8" event={"ID":"e93a9151-b6f3-4e5a-aa82-87de08912ce6","Type":"ContainerDied","Data":"d832ce0b629813a690a6293b5995a5d28c252f45271b1ef494c9f80d667c6659"} Apr 16 15:03:31.072781 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:31.072760 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-ct5br" Apr 16 15:03:31.414700 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:31.414674 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-ct5br"] Apr 16 15:03:31.416584 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:03:31.416561 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43a1a574_33fd_4361_ab57_67e61d536958.slice/crio-ea70cd05ee16a5552175147834199336afcca1ea6ae9710407ee05de40a8d3c9 WatchSource:0}: Error finding container ea70cd05ee16a5552175147834199336afcca1ea6ae9710407ee05de40a8d3c9: Status 404 returned error can't find the container with id ea70cd05ee16a5552175147834199336afcca1ea6ae9710407ee05de40a8d3c9 Apr 16 15:03:32.007958 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:32.007870 2568 generic.go:358] "Generic (PLEG): container finished" podID="e93a9151-b6f3-4e5a-aa82-87de08912ce6" containerID="6674f346136bd092e4876c7f5f5ccfd3a3035b0dd316ebe3a92254542a175926" exitCode=0 Apr 16 15:03:32.008299 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:32.007955 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nwkv8" event={"ID":"e93a9151-b6f3-4e5a-aa82-87de08912ce6","Type":"ContainerDied","Data":"6674f346136bd092e4876c7f5f5ccfd3a3035b0dd316ebe3a92254542a175926"} Apr 16 15:03:32.009334 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:32.009304 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-ct5br" event={"ID":"43a1a574-33fd-4361-ab57-67e61d536958","Type":"ContainerStarted","Data":"ea70cd05ee16a5552175147834199336afcca1ea6ae9710407ee05de40a8d3c9"} Apr 16 15:03:33.160152 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:33.160128 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nwkv8" Apr 16 15:03:33.252263 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:33.252240 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e93a9151-b6f3-4e5a-aa82-87de08912ce6-bundle\") pod \"e93a9151-b6f3-4e5a-aa82-87de08912ce6\" (UID: \"e93a9151-b6f3-4e5a-aa82-87de08912ce6\") " Apr 16 15:03:33.252407 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:33.252345 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e93a9151-b6f3-4e5a-aa82-87de08912ce6-util\") pod \"e93a9151-b6f3-4e5a-aa82-87de08912ce6\" (UID: \"e93a9151-b6f3-4e5a-aa82-87de08912ce6\") " Apr 16 15:03:33.252407 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:33.252377 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfvfj\" (UniqueName: \"kubernetes.io/projected/e93a9151-b6f3-4e5a-aa82-87de08912ce6-kube-api-access-mfvfj\") pod \"e93a9151-b6f3-4e5a-aa82-87de08912ce6\" (UID: \"e93a9151-b6f3-4e5a-aa82-87de08912ce6\") " Apr 16 15:03:33.253321 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:33.253291 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e93a9151-b6f3-4e5a-aa82-87de08912ce6-bundle" (OuterVolumeSpecName: "bundle") pod "e93a9151-b6f3-4e5a-aa82-87de08912ce6" (UID: "e93a9151-b6f3-4e5a-aa82-87de08912ce6"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:03:33.254706 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:33.254677 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e93a9151-b6f3-4e5a-aa82-87de08912ce6-kube-api-access-mfvfj" (OuterVolumeSpecName: "kube-api-access-mfvfj") pod "e93a9151-b6f3-4e5a-aa82-87de08912ce6" (UID: "e93a9151-b6f3-4e5a-aa82-87de08912ce6"). InnerVolumeSpecName "kube-api-access-mfvfj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:03:33.259625 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:33.259595 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e93a9151-b6f3-4e5a-aa82-87de08912ce6-util" (OuterVolumeSpecName: "util") pod "e93a9151-b6f3-4e5a-aa82-87de08912ce6" (UID: "e93a9151-b6f3-4e5a-aa82-87de08912ce6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:03:33.353574 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:33.353550 2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e93a9151-b6f3-4e5a-aa82-87de08912ce6-util\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:03:33.353682 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:33.353574 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mfvfj\" (UniqueName: \"kubernetes.io/projected/e93a9151-b6f3-4e5a-aa82-87de08912ce6-kube-api-access-mfvfj\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:03:33.353682 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:33.353591 2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e93a9151-b6f3-4e5a-aa82-87de08912ce6-bundle\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:03:34.020604 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:34.020566 2568 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nwkv8" event={"ID":"e93a9151-b6f3-4e5a-aa82-87de08912ce6","Type":"ContainerDied","Data":"803255fed99d6829eb24071e2ed3f8f40b431253c882d4855bc253853b3224a6"} Apr 16 15:03:34.020604 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:34.020597 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2nwkv8" Apr 16 15:03:34.020835 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:34.020605 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="803255fed99d6829eb24071e2ed3f8f40b431253c882d4855bc253853b3224a6" Apr 16 15:03:35.031570 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:35.031527 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-ct5br" event={"ID":"43a1a574-33fd-4361-ab57-67e61d536958","Type":"ContainerStarted","Data":"5809211e7f8c9a11b1ccb6e72c915cfefd03ed4d1800f6a88e8aaf320f57bbcd"} Apr 16 15:03:35.031948 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:35.031622 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-ct5br" Apr 16 15:03:35.072351 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:35.072304 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-ct5br" podStartSLOduration=2.137400693 podStartE2EDuration="5.072284757s" podCreationTimestamp="2026-04-16 15:03:30 +0000 UTC" firstStartedPulling="2026-04-16 15:03:31.419050829 +0000 UTC m=+650.183002917" lastFinishedPulling="2026-04-16 15:03:34.35393489 +0000 UTC m=+653.117886981" observedRunningTime="2026-04-16 15:03:35.069782459 +0000 UTC m=+653.833734569" watchObservedRunningTime="2026-04-16 15:03:35.072284757 +0000 UTC m=+653.836236868" Apr 16 
15:03:38.541648 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:38.541613 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lxfm"] Apr 16 15:03:38.542039 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:38.542023 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e93a9151-b6f3-4e5a-aa82-87de08912ce6" containerName="pull" Apr 16 15:03:38.542082 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:38.542040 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="e93a9151-b6f3-4e5a-aa82-87de08912ce6" containerName="pull" Apr 16 15:03:38.542082 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:38.542053 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e93a9151-b6f3-4e5a-aa82-87de08912ce6" containerName="util" Apr 16 15:03:38.542082 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:38.542059 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="e93a9151-b6f3-4e5a-aa82-87de08912ce6" containerName="util" Apr 16 15:03:38.542082 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:38.542065 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e93a9151-b6f3-4e5a-aa82-87de08912ce6" containerName="extract" Apr 16 15:03:38.542082 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:38.542070 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="e93a9151-b6f3-4e5a-aa82-87de08912ce6" containerName="extract" Apr 16 15:03:38.542220 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:38.542133 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="e93a9151-b6f3-4e5a-aa82-87de08912ce6" containerName="extract" Apr 16 15:03:38.544301 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:38.544286 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lxfm" Apr 16 15:03:38.546650 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:38.546629 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 16 15:03:38.546774 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:38.546707 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 16 15:03:38.546774 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:38.546723 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-kxgv9\"" Apr 16 15:03:38.546853 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:38.546816 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 16 15:03:38.546928 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:38.546877 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 16 15:03:38.560552 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:38.560532 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lxfm"] Apr 16 15:03:38.597188 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:38.597164 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/ec39bede-46a4-420f-88a8-861d4ce1d289-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-8lxfm\" (UID: \"ec39bede-46a4-420f-88a8-861d4ce1d289\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lxfm" Apr 16 15:03:38.597282 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:38.597196 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: 
\"kubernetes.io/configmap/ec39bede-46a4-420f-88a8-861d4ce1d289-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-8lxfm\" (UID: \"ec39bede-46a4-420f-88a8-861d4ce1d289\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lxfm" Apr 16 15:03:38.597282 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:38.597228 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtjp8\" (UniqueName: \"kubernetes.io/projected/ec39bede-46a4-420f-88a8-861d4ce1d289-kube-api-access-vtjp8\") pod \"istiod-openshift-gateway-7cd77c7ffd-8lxfm\" (UID: \"ec39bede-46a4-420f-88a8-861d4ce1d289\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lxfm" Apr 16 15:03:38.597364 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:38.597306 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/ec39bede-46a4-420f-88a8-861d4ce1d289-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-8lxfm\" (UID: \"ec39bede-46a4-420f-88a8-861d4ce1d289\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lxfm" Apr 16 15:03:38.597410 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:38.597363 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/ec39bede-46a4-420f-88a8-861d4ce1d289-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-8lxfm\" (UID: \"ec39bede-46a4-420f-88a8-861d4ce1d289\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lxfm" Apr 16 15:03:38.597448 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:38.597422 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ec39bede-46a4-420f-88a8-861d4ce1d289-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-8lxfm\" (UID: 
\"ec39bede-46a4-420f-88a8-861d4ce1d289\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lxfm" Apr 16 15:03:38.597486 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:38.597446 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/ec39bede-46a4-420f-88a8-861d4ce1d289-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-8lxfm\" (UID: \"ec39bede-46a4-420f-88a8-861d4ce1d289\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lxfm" Apr 16 15:03:38.697951 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:38.697925 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/ec39bede-46a4-420f-88a8-861d4ce1d289-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-8lxfm\" (UID: \"ec39bede-46a4-420f-88a8-861d4ce1d289\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lxfm" Apr 16 15:03:38.698042 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:38.697962 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/ec39bede-46a4-420f-88a8-861d4ce1d289-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-8lxfm\" (UID: \"ec39bede-46a4-420f-88a8-861d4ce1d289\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lxfm" Apr 16 15:03:38.698042 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:38.697999 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vtjp8\" (UniqueName: \"kubernetes.io/projected/ec39bede-46a4-420f-88a8-861d4ce1d289-kube-api-access-vtjp8\") pod \"istiod-openshift-gateway-7cd77c7ffd-8lxfm\" (UID: \"ec39bede-46a4-420f-88a8-861d4ce1d289\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lxfm" Apr 16 15:03:38.698042 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:38.698024 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/ec39bede-46a4-420f-88a8-861d4ce1d289-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-8lxfm\" (UID: \"ec39bede-46a4-420f-88a8-861d4ce1d289\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lxfm" Apr 16 15:03:38.698198 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:38.698068 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/ec39bede-46a4-420f-88a8-861d4ce1d289-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-8lxfm\" (UID: \"ec39bede-46a4-420f-88a8-861d4ce1d289\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lxfm" Apr 16 15:03:38.698198 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:38.698121 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ec39bede-46a4-420f-88a8-861d4ce1d289-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-8lxfm\" (UID: \"ec39bede-46a4-420f-88a8-861d4ce1d289\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lxfm" Apr 16 15:03:38.698198 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:38.698150 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/ec39bede-46a4-420f-88a8-861d4ce1d289-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-8lxfm\" (UID: \"ec39bede-46a4-420f-88a8-861d4ce1d289\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lxfm" Apr 16 15:03:38.698650 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:38.698623 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/ec39bede-46a4-420f-88a8-861d4ce1d289-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-8lxfm\" (UID: 
\"ec39bede-46a4-420f-88a8-861d4ce1d289\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lxfm" Apr 16 15:03:38.700399 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:38.700372 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/ec39bede-46a4-420f-88a8-861d4ce1d289-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-8lxfm\" (UID: \"ec39bede-46a4-420f-88a8-861d4ce1d289\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lxfm" Apr 16 15:03:38.700662 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:38.700638 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/ec39bede-46a4-420f-88a8-861d4ce1d289-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-8lxfm\" (UID: \"ec39bede-46a4-420f-88a8-861d4ce1d289\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lxfm" Apr 16 15:03:38.700838 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:38.700823 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/ec39bede-46a4-420f-88a8-861d4ce1d289-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-8lxfm\" (UID: \"ec39bede-46a4-420f-88a8-861d4ce1d289\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lxfm" Apr 16 15:03:38.700886 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:38.700870 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ec39bede-46a4-420f-88a8-861d4ce1d289-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-8lxfm\" (UID: \"ec39bede-46a4-420f-88a8-861d4ce1d289\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lxfm" Apr 16 15:03:38.705698 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:38.705669 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" 
(UniqueName: \"kubernetes.io/projected/ec39bede-46a4-420f-88a8-861d4ce1d289-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-8lxfm\" (UID: \"ec39bede-46a4-420f-88a8-861d4ce1d289\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lxfm" Apr 16 15:03:38.706472 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:38.706446 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtjp8\" (UniqueName: \"kubernetes.io/projected/ec39bede-46a4-420f-88a8-861d4ce1d289-kube-api-access-vtjp8\") pod \"istiod-openshift-gateway-7cd77c7ffd-8lxfm\" (UID: \"ec39bede-46a4-420f-88a8-861d4ce1d289\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lxfm" Apr 16 15:03:38.853523 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:38.853492 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lxfm" Apr 16 15:03:38.984922 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:38.984879 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lxfm"] Apr 16 15:03:38.986772 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:03:38.986738 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec39bede_46a4_420f_88a8_861d4ce1d289.slice/crio-bcc77456c687bd22ceab202f99b05db94a6f2babf71c4b5c2c4b4a2ba429f412 WatchSource:0}: Error finding container bcc77456c687bd22ceab202f99b05db94a6f2babf71c4b5c2c4b4a2ba429f412: Status 404 returned error can't find the container with id bcc77456c687bd22ceab202f99b05db94a6f2babf71c4b5c2c4b4a2ba429f412 Apr 16 15:03:39.046462 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:39.046435 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lxfm" 
event={"ID":"ec39bede-46a4-420f-88a8-861d4ce1d289","Type":"ContainerStarted","Data":"bcc77456c687bd22ceab202f99b05db94a6f2babf71c4b5c2c4b4a2ba429f412"} Apr 16 15:03:42.217463 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:42.217425 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 15:03:42.217759 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:42.217491 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 15:03:43.070424 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:43.070387 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lxfm" event={"ID":"ec39bede-46a4-420f-88a8-861d4ce1d289","Type":"ContainerStarted","Data":"3416e3e3d8b6f6d43459dadcb286b25d156b10147ea3ddb7d3fec6d0542950f0"} Apr 16 15:03:43.070611 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:43.070546 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lxfm" Apr 16 15:03:43.072414 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:43.072377 2568 patch_prober.go:28] interesting pod/istiod-openshift-gateway-7cd77c7ffd-8lxfm container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 16 15:03:43.072530 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:43.072490 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lxfm" podUID="ec39bede-46a4-420f-88a8-861d4ce1d289" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:03:43.098654 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:43.098601 2568 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lxfm" podStartSLOduration=1.870407269 podStartE2EDuration="5.098583718s" podCreationTimestamp="2026-04-16 15:03:38 +0000 UTC" firstStartedPulling="2026-04-16 15:03:38.989023831 +0000 UTC m=+657.752975924" lastFinishedPulling="2026-04-16 15:03:42.217200278 +0000 UTC m=+660.981152373" observedRunningTime="2026-04-16 15:03:43.097002435 +0000 UTC m=+661.860954560" watchObservedRunningTime="2026-04-16 15:03:43.098583718 +0000 UTC m=+661.862535830" Apr 16 15:03:44.075105 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:44.075078 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lxfm" Apr 16 15:03:45.067462 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:45.067430 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg"] Apr 16 15:03:45.070019 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:45.069997 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg" Apr 16 15:03:45.072327 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:45.072310 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"openshift-ai-inference-openshift-default-dockercfg-2rpmb\"" Apr 16 15:03:45.088222 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:45.088192 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg"] Apr 16 15:03:45.157501 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:45.157474 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/9a0632c9-d4c8-48da-aaf3-57cd0552c7f1-credential-socket\") pod \"openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg\" (UID: \"9a0632c9-d4c8-48da-aaf3-57cd0552c7f1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg" Apr 16 15:03:45.157605 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:45.157511 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/9a0632c9-d4c8-48da-aaf3-57cd0552c7f1-istio-token\") pod \"openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg\" (UID: \"9a0632c9-d4c8-48da-aaf3-57cd0552c7f1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg" Apr 16 15:03:45.157605 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:45.157528 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/9a0632c9-d4c8-48da-aaf3-57cd0552c7f1-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg\" (UID: \"9a0632c9-d4c8-48da-aaf3-57cd0552c7f1\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg" Apr 16 15:03:45.157781 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:45.157762 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/9a0632c9-d4c8-48da-aaf3-57cd0552c7f1-workload-socket\") pod \"openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg\" (UID: \"9a0632c9-d4c8-48da-aaf3-57cd0552c7f1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg" Apr 16 15:03:45.157823 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:45.157798 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/9a0632c9-d4c8-48da-aaf3-57cd0552c7f1-workload-certs\") pod \"openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg\" (UID: \"9a0632c9-d4c8-48da-aaf3-57cd0552c7f1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg" Apr 16 15:03:45.157862 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:45.157825 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp554\" (UniqueName: \"kubernetes.io/projected/9a0632c9-d4c8-48da-aaf3-57cd0552c7f1-kube-api-access-cp554\") pod \"openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg\" (UID: \"9a0632c9-d4c8-48da-aaf3-57cd0552c7f1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg" Apr 16 15:03:45.157989 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:45.157970 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/9a0632c9-d4c8-48da-aaf3-57cd0552c7f1-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg\" (UID: \"9a0632c9-d4c8-48da-aaf3-57cd0552c7f1\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg" Apr 16 15:03:45.158056 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:45.158031 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/9a0632c9-d4c8-48da-aaf3-57cd0552c7f1-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg\" (UID: \"9a0632c9-d4c8-48da-aaf3-57cd0552c7f1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg" Apr 16 15:03:45.158111 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:45.158063 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/9a0632c9-d4c8-48da-aaf3-57cd0552c7f1-istio-data\") pod \"openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg\" (UID: \"9a0632c9-d4c8-48da-aaf3-57cd0552c7f1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg" Apr 16 15:03:45.259206 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:45.259178 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/9a0632c9-d4c8-48da-aaf3-57cd0552c7f1-workload-socket\") pod \"openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg\" (UID: \"9a0632c9-d4c8-48da-aaf3-57cd0552c7f1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg" Apr 16 15:03:45.259339 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:45.259214 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/9a0632c9-d4c8-48da-aaf3-57cd0552c7f1-workload-certs\") pod \"openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg\" (UID: \"9a0632c9-d4c8-48da-aaf3-57cd0552c7f1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg" 
Apr 16 15:03:45.259339 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:45.259234 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cp554\" (UniqueName: \"kubernetes.io/projected/9a0632c9-d4c8-48da-aaf3-57cd0552c7f1-kube-api-access-cp554\") pod \"openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg\" (UID: \"9a0632c9-d4c8-48da-aaf3-57cd0552c7f1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg" Apr 16 15:03:45.259339 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:45.259294 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/9a0632c9-d4c8-48da-aaf3-57cd0552c7f1-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg\" (UID: \"9a0632c9-d4c8-48da-aaf3-57cd0552c7f1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg" Apr 16 15:03:45.259511 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:45.259338 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/9a0632c9-d4c8-48da-aaf3-57cd0552c7f1-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg\" (UID: \"9a0632c9-d4c8-48da-aaf3-57cd0552c7f1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg" Apr 16 15:03:45.259511 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:45.259361 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/9a0632c9-d4c8-48da-aaf3-57cd0552c7f1-istio-data\") pod \"openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg\" (UID: \"9a0632c9-d4c8-48da-aaf3-57cd0552c7f1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg" Apr 16 15:03:45.259511 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:45.259379 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/9a0632c9-d4c8-48da-aaf3-57cd0552c7f1-credential-socket\") pod \"openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg\" (UID: \"9a0632c9-d4c8-48da-aaf3-57cd0552c7f1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg" Apr 16 15:03:45.259511 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:45.259406 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/9a0632c9-d4c8-48da-aaf3-57cd0552c7f1-istio-token\") pod \"openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg\" (UID: \"9a0632c9-d4c8-48da-aaf3-57cd0552c7f1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg" Apr 16 15:03:45.259511 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:45.259430 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/9a0632c9-d4c8-48da-aaf3-57cd0552c7f1-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg\" (UID: \"9a0632c9-d4c8-48da-aaf3-57cd0552c7f1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg" Apr 16 15:03:45.259765 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:45.259596 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/9a0632c9-d4c8-48da-aaf3-57cd0552c7f1-workload-socket\") pod \"openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg\" (UID: \"9a0632c9-d4c8-48da-aaf3-57cd0552c7f1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg" Apr 16 15:03:45.259821 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:45.259762 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: 
\"kubernetes.io/empty-dir/9a0632c9-d4c8-48da-aaf3-57cd0552c7f1-workload-certs\") pod \"openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg\" (UID: \"9a0632c9-d4c8-48da-aaf3-57cd0552c7f1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg" Apr 16 15:03:45.259875 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:45.259833 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/9a0632c9-d4c8-48da-aaf3-57cd0552c7f1-credential-socket\") pod \"openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg\" (UID: \"9a0632c9-d4c8-48da-aaf3-57cd0552c7f1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg" Apr 16 15:03:45.260036 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:45.260008 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/9a0632c9-d4c8-48da-aaf3-57cd0552c7f1-istio-data\") pod \"openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg\" (UID: \"9a0632c9-d4c8-48da-aaf3-57cd0552c7f1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg" Apr 16 15:03:45.260303 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:45.260284 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/9a0632c9-d4c8-48da-aaf3-57cd0552c7f1-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg\" (UID: \"9a0632c9-d4c8-48da-aaf3-57cd0552c7f1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg" Apr 16 15:03:45.261619 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:45.261595 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/9a0632c9-d4c8-48da-aaf3-57cd0552c7f1-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg\" 
(UID: \"9a0632c9-d4c8-48da-aaf3-57cd0552c7f1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg" Apr 16 15:03:45.262518 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:45.262495 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/9a0632c9-d4c8-48da-aaf3-57cd0552c7f1-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg\" (UID: \"9a0632c9-d4c8-48da-aaf3-57cd0552c7f1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg" Apr 16 15:03:45.267190 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:45.267160 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/9a0632c9-d4c8-48da-aaf3-57cd0552c7f1-istio-token\") pod \"openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg\" (UID: \"9a0632c9-d4c8-48da-aaf3-57cd0552c7f1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg" Apr 16 15:03:45.267415 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:45.267400 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp554\" (UniqueName: \"kubernetes.io/projected/9a0632c9-d4c8-48da-aaf3-57cd0552c7f1-kube-api-access-cp554\") pod \"openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg\" (UID: \"9a0632c9-d4c8-48da-aaf3-57cd0552c7f1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg" Apr 16 15:03:45.382253 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:45.382227 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg" Apr 16 15:03:45.519317 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:45.519285 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg"] Apr 16 15:03:45.519675 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:03:45.519651 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a0632c9_d4c8_48da_aaf3_57cd0552c7f1.slice/crio-298f75e735b1edc0544212e2ba697bb4a2d09c78a7998066090a7a5c84252088 WatchSource:0}: Error finding container 298f75e735b1edc0544212e2ba697bb4a2d09c78a7998066090a7a5c84252088: Status 404 returned error can't find the container with id 298f75e735b1edc0544212e2ba697bb4a2d09c78a7998066090a7a5c84252088 Apr 16 15:03:46.037180 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:46.037157 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-ct5br" Apr 16 15:03:46.083658 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:46.083627 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg" event={"ID":"9a0632c9-d4c8-48da-aaf3-57cd0552c7f1","Type":"ContainerStarted","Data":"298f75e735b1edc0544212e2ba697bb4a2d09c78a7998066090a7a5c84252088"} Apr 16 15:03:47.990476 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:47.990434 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 15:03:47.990772 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:47.990522 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} 
Apr 16 15:03:47.990772 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:47.990558 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 15:03:48.092864 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:48.092829 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg" event={"ID":"9a0632c9-d4c8-48da-aaf3-57cd0552c7f1","Type":"ContainerStarted","Data":"534c3497553f49536e1f2397e61981f5cc42b9a98b00486a7aca68524af07503"} Apr 16 15:03:48.118606 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:48.118554 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg" podStartSLOduration=0.649963821 podStartE2EDuration="3.118534971s" podCreationTimestamp="2026-04-16 15:03:45 +0000 UTC" firstStartedPulling="2026-04-16 15:03:45.521602441 +0000 UTC m=+664.285554549" lastFinishedPulling="2026-04-16 15:03:47.990173598 +0000 UTC m=+666.754125699" observedRunningTime="2026-04-16 15:03:48.115941396 +0000 UTC m=+666.879893506" watchObservedRunningTime="2026-04-16 15:03:48.118534971 +0000 UTC m=+666.882487083" Apr 16 15:03:48.382680 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:48.382653 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg" Apr 16 15:03:49.387538 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:49.387514 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg" Apr 16 15:03:50.100937 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:50.100907 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-ingress/openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg" Apr 16 15:03:50.101962 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:50.101939 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg" Apr 16 15:03:58.463116 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.463081 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s67b"] Apr 16 15:03:58.466119 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.466098 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s67b" Apr 16 15:03:58.468309 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.468288 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 15:03:58.469176 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.469156 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-pzc87\"" Apr 16 15:03:58.469277 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.469164 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 15:03:58.474344 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.474323 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s67b"] Apr 16 15:03:58.554864 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.554841 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kzhns"] Apr 16 15:03:58.557648 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.557633 2568 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kzhns"
Apr 16 15:03:58.566404 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.566385    2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kzhns"]
Apr 16 15:03:58.568523 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.568502    2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt87q\" (UniqueName: \"kubernetes.io/projected/f71530f8-6a58-4e61-b2ce-126d4bd3a0c5-kube-api-access-zt87q\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s67b\" (UID: \"f71530f8-6a58-4e61-b2ce-126d4bd3a0c5\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s67b"
Apr 16 15:03:58.568620 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.568538    2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f71530f8-6a58-4e61-b2ce-126d4bd3a0c5-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s67b\" (UID: \"f71530f8-6a58-4e61-b2ce-126d4bd3a0c5\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s67b"
Apr 16 15:03:58.568620 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.568566    2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f71530f8-6a58-4e61-b2ce-126d4bd3a0c5-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s67b\" (UID: \"f71530f8-6a58-4e61-b2ce-126d4bd3a0c5\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s67b"
Apr 16 15:03:58.655503 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.655476    2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec886vd68"]
Apr 16 15:03:58.658168 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.658151    2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec886vd68"
Apr 16 15:03:58.666261 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.666244    2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec886vd68"]
Apr 16 15:03:58.669770 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.669746    2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/851393da-80fd-45bf-8392-86f54152e8c7-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kzhns\" (UID: \"851393da-80fd-45bf-8392-86f54152e8c7\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kzhns"
Apr 16 15:03:58.669876 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.669793    2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zt87q\" (UniqueName: \"kubernetes.io/projected/f71530f8-6a58-4e61-b2ce-126d4bd3a0c5-kube-api-access-zt87q\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s67b\" (UID: \"f71530f8-6a58-4e61-b2ce-126d4bd3a0c5\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s67b"
Apr 16 15:03:58.669876 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.669820    2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f71530f8-6a58-4e61-b2ce-126d4bd3a0c5-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s67b\" (UID: \"f71530f8-6a58-4e61-b2ce-126d4bd3a0c5\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s67b"
Apr 16 15:03:58.669876 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.669837    2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f71530f8-6a58-4e61-b2ce-126d4bd3a0c5-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s67b\" (UID: \"f71530f8-6a58-4e61-b2ce-126d4bd3a0c5\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s67b"
Apr 16 15:03:58.669876 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.669863    2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/851393da-80fd-45bf-8392-86f54152e8c7-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kzhns\" (UID: \"851393da-80fd-45bf-8392-86f54152e8c7\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kzhns"
Apr 16 15:03:58.670130 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.669944    2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-849t7\" (UniqueName: \"kubernetes.io/projected/851393da-80fd-45bf-8392-86f54152e8c7-kube-api-access-849t7\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kzhns\" (UID: \"851393da-80fd-45bf-8392-86f54152e8c7\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kzhns"
Apr 16 15:03:58.670194 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.670130    2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f71530f8-6a58-4e61-b2ce-126d4bd3a0c5-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s67b\" (UID: \"f71530f8-6a58-4e61-b2ce-126d4bd3a0c5\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s67b"
Apr 16 15:03:58.670194 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.670172    2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f71530f8-6a58-4e61-b2ce-126d4bd3a0c5-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s67b\" (UID: \"f71530f8-6a58-4e61-b2ce-126d4bd3a0c5\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s67b"
Apr 16 15:03:58.679447 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.679429    2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt87q\" (UniqueName: \"kubernetes.io/projected/f71530f8-6a58-4e61-b2ce-126d4bd3a0c5-kube-api-access-zt87q\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s67b\" (UID: \"f71530f8-6a58-4e61-b2ce-126d4bd3a0c5\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s67b"
Apr 16 15:03:58.760187 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.760138    2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2qsb8"]
Apr 16 15:03:58.762845 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.762831    2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2qsb8"
Apr 16 15:03:58.771143 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.771116    2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-849t7\" (UniqueName: \"kubernetes.io/projected/851393da-80fd-45bf-8392-86f54152e8c7-kube-api-access-849t7\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kzhns\" (UID: \"851393da-80fd-45bf-8392-86f54152e8c7\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kzhns"
Apr 16 15:03:58.771241 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.771163    2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d8407eca-1912-4a0e-baa8-d44dce585e02-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec886vd68\" (UID: \"d8407eca-1912-4a0e-baa8-d44dce585e02\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec886vd68"
Apr 16 15:03:58.771304 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.771286    2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58cz9\" (UniqueName: \"kubernetes.io/projected/d8407eca-1912-4a0e-baa8-d44dce585e02-kube-api-access-58cz9\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec886vd68\" (UID: \"d8407eca-1912-4a0e-baa8-d44dce585e02\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec886vd68"
Apr 16 15:03:58.771376 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.771357    2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/851393da-80fd-45bf-8392-86f54152e8c7-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kzhns\" (UID: \"851393da-80fd-45bf-8392-86f54152e8c7\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kzhns"
Apr 16 15:03:58.771437 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.771398    2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d8407eca-1912-4a0e-baa8-d44dce585e02-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec886vd68\" (UID: \"d8407eca-1912-4a0e-baa8-d44dce585e02\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec886vd68"
Apr 16 15:03:58.771497 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.771456    2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/851393da-80fd-45bf-8392-86f54152e8c7-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kzhns\" (UID: \"851393da-80fd-45bf-8392-86f54152e8c7\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kzhns"
Apr 16 15:03:58.771753 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.771733    2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/851393da-80fd-45bf-8392-86f54152e8c7-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kzhns\" (UID: \"851393da-80fd-45bf-8392-86f54152e8c7\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kzhns"
Apr 16 15:03:58.771844 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.771791    2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/851393da-80fd-45bf-8392-86f54152e8c7-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kzhns\" (UID: \"851393da-80fd-45bf-8392-86f54152e8c7\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kzhns"
Apr 16 15:03:58.771844 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.771825    2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2qsb8"]
Apr 16 15:03:58.776253 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.776234    2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s67b"
Apr 16 15:03:58.778407 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.778386    2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-849t7\" (UniqueName: \"kubernetes.io/projected/851393da-80fd-45bf-8392-86f54152e8c7-kube-api-access-849t7\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kzhns\" (UID: \"851393da-80fd-45bf-8392-86f54152e8c7\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kzhns"
Apr 16 15:03:58.867356 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.867331    2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kzhns"
Apr 16 15:03:58.872589 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.872539    2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d8407eca-1912-4a0e-baa8-d44dce585e02-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec886vd68\" (UID: \"d8407eca-1912-4a0e-baa8-d44dce585e02\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec886vd68"
Apr 16 15:03:58.872589 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.872582    2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58cz9\" (UniqueName: \"kubernetes.io/projected/d8407eca-1912-4a0e-baa8-d44dce585e02-kube-api-access-58cz9\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec886vd68\" (UID: \"d8407eca-1912-4a0e-baa8-d44dce585e02\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec886vd68"
Apr 16 15:03:58.872768 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.872650    2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/366dd92c-6958-4eb2-a717-1e5be3b13882-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2qsb8\" (UID: \"366dd92c-6958-4eb2-a717-1e5be3b13882\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2qsb8"
Apr 16 15:03:58.872768 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.872696    2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d8407eca-1912-4a0e-baa8-d44dce585e02-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec886vd68\" (UID: \"d8407eca-1912-4a0e-baa8-d44dce585e02\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec886vd68"
Apr 16 15:03:58.872768 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.872760    2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/366dd92c-6958-4eb2-a717-1e5be3b13882-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2qsb8\" (UID: \"366dd92c-6958-4eb2-a717-1e5be3b13882\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2qsb8"
Apr 16 15:03:58.872971 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.872800    2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gpkb\" (UniqueName: \"kubernetes.io/projected/366dd92c-6958-4eb2-a717-1e5be3b13882-kube-api-access-4gpkb\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2qsb8\" (UID: \"366dd92c-6958-4eb2-a717-1e5be3b13882\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2qsb8"
Apr 16 15:03:58.873398 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.873260    2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d8407eca-1912-4a0e-baa8-d44dce585e02-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec886vd68\" (UID: \"d8407eca-1912-4a0e-baa8-d44dce585e02\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec886vd68"
Apr 16 15:03:58.873398 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.873348    2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d8407eca-1912-4a0e-baa8-d44dce585e02-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec886vd68\" (UID: \"d8407eca-1912-4a0e-baa8-d44dce585e02\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec886vd68"
Apr 16 15:03:58.881676 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.881653    2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-58cz9\" (UniqueName: \"kubernetes.io/projected/d8407eca-1912-4a0e-baa8-d44dce585e02-kube-api-access-58cz9\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec886vd68\" (UID: \"d8407eca-1912-4a0e-baa8-d44dce585e02\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec886vd68"
Apr 16 15:03:58.896651 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.896628    2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s67b"]
Apr 16 15:03:58.898408 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:03:58.898381    2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf71530f8_6a58_4e61_b2ce_126d4bd3a0c5.slice/crio-1b7f2db414fea5b61329d7537c322e03750c11882f5a42f232aad0b890713797 WatchSource:0}: Error finding container 1b7f2db414fea5b61329d7537c322e03750c11882f5a42f232aad0b890713797: Status 404 returned error can't find the container with id 1b7f2db414fea5b61329d7537c322e03750c11882f5a42f232aad0b890713797
Apr 16 15:03:58.967206 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.967178    2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec886vd68"
Apr 16 15:03:58.974270 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.974246    2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/366dd92c-6958-4eb2-a717-1e5be3b13882-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2qsb8\" (UID: \"366dd92c-6958-4eb2-a717-1e5be3b13882\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2qsb8"
Apr 16 15:03:58.974380 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.974317    2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/366dd92c-6958-4eb2-a717-1e5be3b13882-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2qsb8\" (UID: \"366dd92c-6958-4eb2-a717-1e5be3b13882\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2qsb8"
Apr 16 15:03:58.974380 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.974343    2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gpkb\" (UniqueName: \"kubernetes.io/projected/366dd92c-6958-4eb2-a717-1e5be3b13882-kube-api-access-4gpkb\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2qsb8\" (UID: \"366dd92c-6958-4eb2-a717-1e5be3b13882\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2qsb8"
Apr 16 15:03:58.974909 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.974866    2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/366dd92c-6958-4eb2-a717-1e5be3b13882-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2qsb8\" (UID: \"366dd92c-6958-4eb2-a717-1e5be3b13882\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2qsb8"
Apr 16 15:03:58.975035 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.974873    2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/366dd92c-6958-4eb2-a717-1e5be3b13882-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2qsb8\" (UID: \"366dd92c-6958-4eb2-a717-1e5be3b13882\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2qsb8"
Apr 16 15:03:58.983708 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:58.983681    2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gpkb\" (UniqueName: \"kubernetes.io/projected/366dd92c-6958-4eb2-a717-1e5be3b13882-kube-api-access-4gpkb\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2qsb8\" (UID: \"366dd92c-6958-4eb2-a717-1e5be3b13882\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2qsb8"
Apr 16 15:03:59.011524 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:59.011498    2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kzhns"]
Apr 16 15:03:59.032850 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:03:59.032821    2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod851393da_80fd_45bf_8392_86f54152e8c7.slice/crio-09ef20bbd240043c3b8065adca6b5563f49c904f475b6482395d0109bb63d078 WatchSource:0}: Error finding container 09ef20bbd240043c3b8065adca6b5563f49c904f475b6482395d0109bb63d078: Status 404 returned error can't find the container with id 09ef20bbd240043c3b8065adca6b5563f49c904f475b6482395d0109bb63d078
Apr 16 15:03:59.072685 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:59.072664    2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2qsb8"
Apr 16 15:03:59.098392 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:59.098368    2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec886vd68"]
Apr 16 15:03:59.098975 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:03:59.098947    2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8407eca_1912_4a0e_baa8_d44dce585e02.slice/crio-0b12c0d1a4fef72250f980e036bb6e670b9c1c9391541fb60583cb32c64d7c0c WatchSource:0}: Error finding container 0b12c0d1a4fef72250f980e036bb6e670b9c1c9391541fb60583cb32c64d7c0c: Status 404 returned error can't find the container with id 0b12c0d1a4fef72250f980e036bb6e670b9c1c9391541fb60583cb32c64d7c0c
Apr 16 15:03:59.136019 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:59.135856    2568 generic.go:358] "Generic (PLEG): container finished" podID="f71530f8-6a58-4e61-b2ce-126d4bd3a0c5" containerID="071bc513aa34987aace53c13c73e88627757d5e1f11e5cf9c0e39999a0f7d99e" exitCode=0
Apr 16 15:03:59.136152 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:59.136108    2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s67b" event={"ID":"f71530f8-6a58-4e61-b2ce-126d4bd3a0c5","Type":"ContainerDied","Data":"071bc513aa34987aace53c13c73e88627757d5e1f11e5cf9c0e39999a0f7d99e"}
Apr 16 15:03:59.136152 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:59.136142    2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s67b" event={"ID":"f71530f8-6a58-4e61-b2ce-126d4bd3a0c5","Type":"ContainerStarted","Data":"1b7f2db414fea5b61329d7537c322e03750c11882f5a42f232aad0b890713797"}
Apr 16 15:03:59.138947 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:59.138833    2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kzhns" event={"ID":"851393da-80fd-45bf-8392-86f54152e8c7","Type":"ContainerStarted","Data":"273ad6a4d49bfe322f3e7c76916d6bb726171cbe8b4227af547808e9d4575aaa"}
Apr 16 15:03:59.139058 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:59.138964    2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kzhns" event={"ID":"851393da-80fd-45bf-8392-86f54152e8c7","Type":"ContainerStarted","Data":"09ef20bbd240043c3b8065adca6b5563f49c904f475b6482395d0109bb63d078"}
Apr 16 15:03:59.140167 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:59.140137    2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec886vd68" event={"ID":"d8407eca-1912-4a0e-baa8-d44dce585e02","Type":"ContainerStarted","Data":"0b12c0d1a4fef72250f980e036bb6e670b9c1c9391541fb60583cb32c64d7c0c"}
Apr 16 15:03:59.206734 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:03:59.206712    2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2qsb8"]
Apr 16 15:03:59.211883 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:03:59.211858    2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod366dd92c_6958_4eb2_a717_1e5be3b13882.slice/crio-6150f9e6c1a137fc9c0bdc620a731b9f06c7b6f1514dfdb0680c7f0602d06e11 WatchSource:0}: Error finding container 6150f9e6c1a137fc9c0bdc620a731b9f06c7b6f1514dfdb0680c7f0602d06e11: Status 404 returned error can't find the container with id 6150f9e6c1a137fc9c0bdc620a731b9f06c7b6f1514dfdb0680c7f0602d06e11
Apr 16 15:04:00.144830 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:00.144800    2568 generic.go:358] "Generic (PLEG): container finished" podID="d8407eca-1912-4a0e-baa8-d44dce585e02" containerID="35c0bd8c5ceb4635db3ad8b81941ec6b3abc0bde0cacba7f9589ad84455b28ff" exitCode=0
Apr 16 15:04:00.145192 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:00.144908    2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec886vd68" event={"ID":"d8407eca-1912-4a0e-baa8-d44dce585e02","Type":"ContainerDied","Data":"35c0bd8c5ceb4635db3ad8b81941ec6b3abc0bde0cacba7f9589ad84455b28ff"}
Apr 16 15:04:00.146670 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:00.146622    2568 generic.go:358] "Generic (PLEG): container finished" podID="f71530f8-6a58-4e61-b2ce-126d4bd3a0c5" containerID="d5ece412f85cbd88609278ec9ab7c4efe4ecfc72ea3cdc9004406df00cdf475f" exitCode=0
Apr 16 15:04:00.146746 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:00.146670    2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s67b" event={"ID":"f71530f8-6a58-4e61-b2ce-126d4bd3a0c5","Type":"ContainerDied","Data":"d5ece412f85cbd88609278ec9ab7c4efe4ecfc72ea3cdc9004406df00cdf475f"}
Apr 16 15:04:00.148526 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:00.148504    2568 generic.go:358] "Generic (PLEG): container finished" podID="851393da-80fd-45bf-8392-86f54152e8c7" containerID="273ad6a4d49bfe322f3e7c76916d6bb726171cbe8b4227af547808e9d4575aaa" exitCode=0
Apr 16 15:04:00.148624 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:00.148570    2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kzhns" event={"ID":"851393da-80fd-45bf-8392-86f54152e8c7","Type":"ContainerDied","Data":"273ad6a4d49bfe322f3e7c76916d6bb726171cbe8b4227af547808e9d4575aaa"}
Apr 16 15:04:00.150031 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:00.150013    2568 generic.go:358] "Generic (PLEG): container finished" podID="366dd92c-6958-4eb2-a717-1e5be3b13882" containerID="c0a5c58ef2e34ac7b2849b5f9703036f89bb014e2647827de5180cb4d0e99bf9" exitCode=0
Apr 16 15:04:00.150110 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:00.150050    2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2qsb8" event={"ID":"366dd92c-6958-4eb2-a717-1e5be3b13882","Type":"ContainerDied","Data":"c0a5c58ef2e34ac7b2849b5f9703036f89bb014e2647827de5180cb4d0e99bf9"}
Apr 16 15:04:00.150110 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:00.150073    2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2qsb8" event={"ID":"366dd92c-6958-4eb2-a717-1e5be3b13882","Type":"ContainerStarted","Data":"6150f9e6c1a137fc9c0bdc620a731b9f06c7b6f1514dfdb0680c7f0602d06e11"}
Apr 16 15:04:01.155684 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:01.155656    2568 generic.go:358] "Generic (PLEG): container finished" podID="851393da-80fd-45bf-8392-86f54152e8c7" containerID="0cdc53336e5c40c8991ad8d9166490f40431812c6da804e050253d177d65ba39" exitCode=0
Apr 16 15:04:01.156049 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:01.155723    2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kzhns" event={"ID":"851393da-80fd-45bf-8392-86f54152e8c7","Type":"ContainerDied","Data":"0cdc53336e5c40c8991ad8d9166490f40431812c6da804e050253d177d65ba39"}
Apr 16 15:04:01.157396 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:01.157373    2568 generic.go:358] "Generic (PLEG): container finished" podID="366dd92c-6958-4eb2-a717-1e5be3b13882" containerID="2c87ad8f5399b56b296eb4e8858d05fd9aef15bfd5bab1945c2e4c6d0ae0face" exitCode=0
Apr 16 15:04:01.157953 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:01.157516    2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2qsb8" event={"ID":"366dd92c-6958-4eb2-a717-1e5be3b13882","Type":"ContainerDied","Data":"2c87ad8f5399b56b296eb4e8858d05fd9aef15bfd5bab1945c2e4c6d0ae0face"}
Apr 16 15:04:01.162465 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:01.162442    2568 generic.go:358] "Generic (PLEG): container finished" podID="d8407eca-1912-4a0e-baa8-d44dce585e02" containerID="380aa8bf60fa7ef81690ba959940ef792bdb02a341f0e21a012d94d9b9b120cb" exitCode=0
Apr 16 15:04:01.162600 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:01.162529    2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec886vd68" event={"ID":"d8407eca-1912-4a0e-baa8-d44dce585e02","Type":"ContainerDied","Data":"380aa8bf60fa7ef81690ba959940ef792bdb02a341f0e21a012d94d9b9b120cb"}
Apr 16 15:04:01.164374 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:01.164350    2568 generic.go:358] "Generic (PLEG): container finished" podID="f71530f8-6a58-4e61-b2ce-126d4bd3a0c5" containerID="adf400f66d573c8e61902fec8933cc75078edbd94455d60a5b18c398b05b0e72" exitCode=0
Apr 16 15:04:01.164457 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:01.164411    2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s67b" event={"ID":"f71530f8-6a58-4e61-b2ce-126d4bd3a0c5","Type":"ContainerDied","Data":"adf400f66d573c8e61902fec8933cc75078edbd94455d60a5b18c398b05b0e72"}
Apr 16 15:04:02.169830 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:02.169804    2568 generic.go:358] "Generic (PLEG): container finished" podID="851393da-80fd-45bf-8392-86f54152e8c7" containerID="8a68b5fd914da2954ab6334e8be1a02fe59499ffa813288022a4d6c3e33e80e1" exitCode=0
Apr 16 15:04:02.170177 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:02.169871    2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kzhns" event={"ID":"851393da-80fd-45bf-8392-86f54152e8c7","Type":"ContainerDied","Data":"8a68b5fd914da2954ab6334e8be1a02fe59499ffa813288022a4d6c3e33e80e1"}
Apr 16 15:04:02.171684 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:02.171660    2568 generic.go:358] "Generic (PLEG): container finished" podID="366dd92c-6958-4eb2-a717-1e5be3b13882" containerID="843ad8457a4fd30d74993d2d90a57faeeb93e0f42649aeae7f8e8e17382da868" exitCode=0
Apr 16 15:04:02.171796 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:02.171739    2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2qsb8" event={"ID":"366dd92c-6958-4eb2-a717-1e5be3b13882","Type":"ContainerDied","Data":"843ad8457a4fd30d74993d2d90a57faeeb93e0f42649aeae7f8e8e17382da868"}
Apr 16 15:04:02.173749 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:02.173713    2568 generic.go:358] "Generic (PLEG): container finished" podID="d8407eca-1912-4a0e-baa8-d44dce585e02" containerID="d7acc4d91af5d15f8592318620e509e61fe8852c26e1f4230e9d996cce934c2c" exitCode=0
Apr 16 15:04:02.173834 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:02.173758    2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec886vd68" event={"ID":"d8407eca-1912-4a0e-baa8-d44dce585e02","Type":"ContainerDied","Data":"d7acc4d91af5d15f8592318620e509e61fe8852c26e1f4230e9d996cce934c2c"}
Apr 16 15:04:02.302957 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:02.302927    2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s67b"
Apr 16 15:04:02.406217 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:02.406198    2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f71530f8-6a58-4e61-b2ce-126d4bd3a0c5-bundle\") pod \"f71530f8-6a58-4e61-b2ce-126d4bd3a0c5\" (UID: \"f71530f8-6a58-4e61-b2ce-126d4bd3a0c5\") "
Apr 16 15:04:02.406358 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:02.406249    2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zt87q\" (UniqueName: \"kubernetes.io/projected/f71530f8-6a58-4e61-b2ce-126d4bd3a0c5-kube-api-access-zt87q\") pod \"f71530f8-6a58-4e61-b2ce-126d4bd3a0c5\" (UID: \"f71530f8-6a58-4e61-b2ce-126d4bd3a0c5\") "
Apr 16 15:04:02.406358 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:02.406266    2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f71530f8-6a58-4e61-b2ce-126d4bd3a0c5-util\") pod \"f71530f8-6a58-4e61-b2ce-126d4bd3a0c5\" (UID: \"f71530f8-6a58-4e61-b2ce-126d4bd3a0c5\") "
Apr 16 15:04:02.406812 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:02.406790    2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f71530f8-6a58-4e61-b2ce-126d4bd3a0c5-bundle" (OuterVolumeSpecName: "bundle") pod "f71530f8-6a58-4e61-b2ce-126d4bd3a0c5" (UID: "f71530f8-6a58-4e61-b2ce-126d4bd3a0c5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:04:02.408254 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:02.408234    2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f71530f8-6a58-4e61-b2ce-126d4bd3a0c5-kube-api-access-zt87q" (OuterVolumeSpecName: "kube-api-access-zt87q") pod "f71530f8-6a58-4e61-b2ce-126d4bd3a0c5" (UID: "f71530f8-6a58-4e61-b2ce-126d4bd3a0c5"). InnerVolumeSpecName "kube-api-access-zt87q". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 15:04:02.411604 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:02.411573    2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f71530f8-6a58-4e61-b2ce-126d4bd3a0c5-util" (OuterVolumeSpecName: "util") pod "f71530f8-6a58-4e61-b2ce-126d4bd3a0c5" (UID: "f71530f8-6a58-4e61-b2ce-126d4bd3a0c5"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:04:02.507256 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:02.507206    2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f71530f8-6a58-4e61-b2ce-126d4bd3a0c5-bundle\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\""
Apr 16 15:04:02.507256 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:02.507227    2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zt87q\" (UniqueName: \"kubernetes.io/projected/f71530f8-6a58-4e61-b2ce-126d4bd3a0c5-kube-api-access-zt87q\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\""
Apr 16 15:04:02.507256 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:02.507236    2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f71530f8-6a58-4e61-b2ce-126d4bd3a0c5-util\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\""
Apr 16 15:04:03.178796 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:03.178768    2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s67b"
Apr 16 15:04:03.179183 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:03.178792    2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s67b" event={"ID":"f71530f8-6a58-4e61-b2ce-126d4bd3a0c5","Type":"ContainerDied","Data":"1b7f2db414fea5b61329d7537c322e03750c11882f5a42f232aad0b890713797"}
Apr 16 15:04:03.179183 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:03.178819    2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b7f2db414fea5b61329d7537c322e03750c11882f5a42f232aad0b890713797"
Apr 16 15:04:03.316516 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:03.316497    2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec886vd68"
Apr 16 15:04:03.369862 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:03.369843    2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kzhns"
Apr 16 15:04:03.373111 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:03.373092    2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2qsb8"
Apr 16 15:04:03.414051 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:03.414030    2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d8407eca-1912-4a0e-baa8-d44dce585e02-util\") pod \"d8407eca-1912-4a0e-baa8-d44dce585e02\" (UID: \"d8407eca-1912-4a0e-baa8-d44dce585e02\") "
Apr 16 15:04:03.414144 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:03.414083    2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d8407eca-1912-4a0e-baa8-d44dce585e02-bundle\") pod \"d8407eca-1912-4a0e-baa8-d44dce585e02\" (UID: \"d8407eca-1912-4a0e-baa8-d44dce585e02\") "
Apr 16 15:04:03.414144 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:03.414137    2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58cz9\" (UniqueName: \"kubernetes.io/projected/d8407eca-1912-4a0e-baa8-d44dce585e02-kube-api-access-58cz9\") pod \"d8407eca-1912-4a0e-baa8-d44dce585e02\" (UID: \"d8407eca-1912-4a0e-baa8-d44dce585e02\") "
Apr 16 15:04:03.414671 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:03.414643    2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8407eca-1912-4a0e-baa8-d44dce585e02-bundle" (OuterVolumeSpecName: "bundle") pod "d8407eca-1912-4a0e-baa8-d44dce585e02" (UID: "d8407eca-1912-4a0e-baa8-d44dce585e02"). InnerVolumeSpecName "bundle".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:04:03.416206 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:03.416175 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8407eca-1912-4a0e-baa8-d44dce585e02-kube-api-access-58cz9" (OuterVolumeSpecName: "kube-api-access-58cz9") pod "d8407eca-1912-4a0e-baa8-d44dce585e02" (UID: "d8407eca-1912-4a0e-baa8-d44dce585e02"). InnerVolumeSpecName "kube-api-access-58cz9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:04:03.419528 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:03.419504 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8407eca-1912-4a0e-baa8-d44dce585e02-util" (OuterVolumeSpecName: "util") pod "d8407eca-1912-4a0e-baa8-d44dce585e02" (UID: "d8407eca-1912-4a0e-baa8-d44dce585e02"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:04:03.514648 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:03.514599 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/366dd92c-6958-4eb2-a717-1e5be3b13882-util\") pod \"366dd92c-6958-4eb2-a717-1e5be3b13882\" (UID: \"366dd92c-6958-4eb2-a717-1e5be3b13882\") " Apr 16 15:04:03.514719 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:03.514649 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/851393da-80fd-45bf-8392-86f54152e8c7-bundle\") pod \"851393da-80fd-45bf-8392-86f54152e8c7\" (UID: \"851393da-80fd-45bf-8392-86f54152e8c7\") " Apr 16 15:04:03.514719 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:03.514665 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/851393da-80fd-45bf-8392-86f54152e8c7-util\") pod \"851393da-80fd-45bf-8392-86f54152e8c7\" (UID: 
\"851393da-80fd-45bf-8392-86f54152e8c7\") " Apr 16 15:04:03.514719 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:03.514680 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gpkb\" (UniqueName: \"kubernetes.io/projected/366dd92c-6958-4eb2-a717-1e5be3b13882-kube-api-access-4gpkb\") pod \"366dd92c-6958-4eb2-a717-1e5be3b13882\" (UID: \"366dd92c-6958-4eb2-a717-1e5be3b13882\") " Apr 16 15:04:03.514719 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:03.514698 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-849t7\" (UniqueName: \"kubernetes.io/projected/851393da-80fd-45bf-8392-86f54152e8c7-kube-api-access-849t7\") pod \"851393da-80fd-45bf-8392-86f54152e8c7\" (UID: \"851393da-80fd-45bf-8392-86f54152e8c7\") " Apr 16 15:04:03.514939 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:03.514723 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/366dd92c-6958-4eb2-a717-1e5be3b13882-bundle\") pod \"366dd92c-6958-4eb2-a717-1e5be3b13882\" (UID: \"366dd92c-6958-4eb2-a717-1e5be3b13882\") " Apr 16 15:04:03.515097 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:03.515078 2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d8407eca-1912-4a0e-baa8-d44dce585e02-util\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:04:03.515179 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:03.515106 2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d8407eca-1912-4a0e-baa8-d44dce585e02-bundle\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:04:03.515179 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:03.515121 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-58cz9\" (UniqueName: 
\"kubernetes.io/projected/d8407eca-1912-4a0e-baa8-d44dce585e02-kube-api-access-58cz9\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:04:03.515681 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:03.515495 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/366dd92c-6958-4eb2-a717-1e5be3b13882-bundle" (OuterVolumeSpecName: "bundle") pod "366dd92c-6958-4eb2-a717-1e5be3b13882" (UID: "366dd92c-6958-4eb2-a717-1e5be3b13882"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:04:03.515681 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:03.515522 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/851393da-80fd-45bf-8392-86f54152e8c7-bundle" (OuterVolumeSpecName: "bundle") pod "851393da-80fd-45bf-8392-86f54152e8c7" (UID: "851393da-80fd-45bf-8392-86f54152e8c7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:04:03.516923 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:03.516875 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/851393da-80fd-45bf-8392-86f54152e8c7-kube-api-access-849t7" (OuterVolumeSpecName: "kube-api-access-849t7") pod "851393da-80fd-45bf-8392-86f54152e8c7" (UID: "851393da-80fd-45bf-8392-86f54152e8c7"). InnerVolumeSpecName "kube-api-access-849t7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:04:03.517002 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:03.516937 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/366dd92c-6958-4eb2-a717-1e5be3b13882-kube-api-access-4gpkb" (OuterVolumeSpecName: "kube-api-access-4gpkb") pod "366dd92c-6958-4eb2-a717-1e5be3b13882" (UID: "366dd92c-6958-4eb2-a717-1e5be3b13882"). InnerVolumeSpecName "kube-api-access-4gpkb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:04:03.520308 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:03.520288 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/851393da-80fd-45bf-8392-86f54152e8c7-util" (OuterVolumeSpecName: "util") pod "851393da-80fd-45bf-8392-86f54152e8c7" (UID: "851393da-80fd-45bf-8392-86f54152e8c7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:04:03.521091 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:03.521073 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/366dd92c-6958-4eb2-a717-1e5be3b13882-util" (OuterVolumeSpecName: "util") pod "366dd92c-6958-4eb2-a717-1e5be3b13882" (UID: "366dd92c-6958-4eb2-a717-1e5be3b13882"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:04:03.615973 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:03.615953 2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/366dd92c-6958-4eb2-a717-1e5be3b13882-util\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:04:03.615973 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:03.615970 2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/851393da-80fd-45bf-8392-86f54152e8c7-bundle\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:04:03.616168 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:03.615979 2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/851393da-80fd-45bf-8392-86f54152e8c7-util\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:04:03.616168 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:03.615987 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4gpkb\" (UniqueName: 
\"kubernetes.io/projected/366dd92c-6958-4eb2-a717-1e5be3b13882-kube-api-access-4gpkb\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:04:03.616168 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:03.615997 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-849t7\" (UniqueName: \"kubernetes.io/projected/851393da-80fd-45bf-8392-86f54152e8c7-kube-api-access-849t7\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:04:03.616168 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:03.616008 2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/366dd92c-6958-4eb2-a717-1e5be3b13882-bundle\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:04:04.184812 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:04.184752 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec886vd68" Apr 16 15:04:04.184812 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:04.184759 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec886vd68" event={"ID":"d8407eca-1912-4a0e-baa8-d44dce585e02","Type":"ContainerDied","Data":"0b12c0d1a4fef72250f980e036bb6e670b9c1c9391541fb60583cb32c64d7c0c"} Apr 16 15:04:04.184812 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:04.184793 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b12c0d1a4fef72250f980e036bb6e670b9c1c9391541fb60583cb32c64d7c0c" Apr 16 15:04:04.186992 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:04.186974 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kzhns" Apr 16 15:04:04.187093 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:04.187047 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30kzhns" event={"ID":"851393da-80fd-45bf-8392-86f54152e8c7","Type":"ContainerDied","Data":"09ef20bbd240043c3b8065adca6b5563f49c904f475b6482395d0109bb63d078"} Apr 16 15:04:04.187093 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:04.187086 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09ef20bbd240043c3b8065adca6b5563f49c904f475b6482395d0109bb63d078" Apr 16 15:04:04.189273 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:04.189252 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2qsb8" event={"ID":"366dd92c-6958-4eb2-a717-1e5be3b13882","Type":"ContainerDied","Data":"6150f9e6c1a137fc9c0bdc620a731b9f06c7b6f1514dfdb0680c7f0602d06e11"} Apr 16 15:04:04.189356 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:04.189279 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6150f9e6c1a137fc9c0bdc620a731b9f06c7b6f1514dfdb0680c7f0602d06e11" Apr 16 15:04:04.189356 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:04.189304 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2qsb8" Apr 16 15:04:09.247864 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:09.247825 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-sd5j6"] Apr 16 15:04:09.248292 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:09.248274 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="851393da-80fd-45bf-8392-86f54152e8c7" containerName="util" Apr 16 15:04:09.248351 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:09.248294 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="851393da-80fd-45bf-8392-86f54152e8c7" containerName="util" Apr 16 15:04:09.248351 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:09.248304 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d8407eca-1912-4a0e-baa8-d44dce585e02" containerName="util" Apr 16 15:04:09.248351 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:09.248309 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8407eca-1912-4a0e-baa8-d44dce585e02" containerName="util" Apr 16 15:04:09.248351 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:09.248319 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d8407eca-1912-4a0e-baa8-d44dce585e02" containerName="pull" Apr 16 15:04:09.248351 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:09.248325 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8407eca-1912-4a0e-baa8-d44dce585e02" containerName="pull" Apr 16 15:04:09.248351 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:09.248333 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="851393da-80fd-45bf-8392-86f54152e8c7" containerName="pull" Apr 16 15:04:09.248351 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:09.248338 2568 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="851393da-80fd-45bf-8392-86f54152e8c7" containerName="pull" Apr 16 15:04:09.248351 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:09.248348 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f71530f8-6a58-4e61-b2ce-126d4bd3a0c5" containerName="util" Apr 16 15:04:09.248351 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:09.248354 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f71530f8-6a58-4e61-b2ce-126d4bd3a0c5" containerName="util" Apr 16 15:04:09.248605 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:09.248360 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="366dd92c-6958-4eb2-a717-1e5be3b13882" containerName="pull" Apr 16 15:04:09.248605 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:09.248366 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="366dd92c-6958-4eb2-a717-1e5be3b13882" containerName="pull" Apr 16 15:04:09.248605 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:09.248373 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d8407eca-1912-4a0e-baa8-d44dce585e02" containerName="extract" Apr 16 15:04:09.248605 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:09.248378 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8407eca-1912-4a0e-baa8-d44dce585e02" containerName="extract" Apr 16 15:04:09.248605 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:09.248388 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f71530f8-6a58-4e61-b2ce-126d4bd3a0c5" containerName="pull" Apr 16 15:04:09.248605 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:09.248393 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f71530f8-6a58-4e61-b2ce-126d4bd3a0c5" containerName="pull" Apr 16 15:04:09.248605 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:09.248399 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f71530f8-6a58-4e61-b2ce-126d4bd3a0c5" 
containerName="extract" Apr 16 15:04:09.248605 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:09.248403 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f71530f8-6a58-4e61-b2ce-126d4bd3a0c5" containerName="extract" Apr 16 15:04:09.248605 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:09.248409 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="851393da-80fd-45bf-8392-86f54152e8c7" containerName="extract" Apr 16 15:04:09.248605 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:09.248414 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="851393da-80fd-45bf-8392-86f54152e8c7" containerName="extract" Apr 16 15:04:09.248605 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:09.248434 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="366dd92c-6958-4eb2-a717-1e5be3b13882" containerName="extract" Apr 16 15:04:09.248605 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:09.248440 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="366dd92c-6958-4eb2-a717-1e5be3b13882" containerName="extract" Apr 16 15:04:09.248605 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:09.248447 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="366dd92c-6958-4eb2-a717-1e5be3b13882" containerName="util" Apr 16 15:04:09.248605 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:09.248453 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="366dd92c-6958-4eb2-a717-1e5be3b13882" containerName="util" Apr 16 15:04:09.248605 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:09.248506 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="f71530f8-6a58-4e61-b2ce-126d4bd3a0c5" containerName="extract" Apr 16 15:04:09.248605 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:09.248515 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="366dd92c-6958-4eb2-a717-1e5be3b13882" containerName="extract" Apr 16 15:04:09.248605 ip-10-0-140-83 kubenswrapper[2568]: 
I0416 15:04:09.248525 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="851393da-80fd-45bf-8392-86f54152e8c7" containerName="extract" Apr 16 15:04:09.248605 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:09.248531 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="d8407eca-1912-4a0e-baa8-d44dce585e02" containerName="extract" Apr 16 15:04:09.257684 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:09.257661 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-sd5j6" Apr 16 15:04:09.260037 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:09.259983 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 16 15:04:09.260191 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:09.260036 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-ltv76\"" Apr 16 15:04:09.260191 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:09.260100 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 15:04:09.260191 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:09.260112 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 15:04:09.268348 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:09.268326 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-sd5j6"] Apr 16 15:04:09.358709 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:09.358677 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd9h7\" (UniqueName: \"kubernetes.io/projected/3c9b6be4-2165-40f4-9a40-28f06b47739f-kube-api-access-xd9h7\") pod 
\"dns-operator-controller-manager-844548ff4c-sd5j6\" (UID: \"3c9b6be4-2165-40f4-9a40-28f06b47739f\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-sd5j6" Apr 16 15:04:09.459746 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:09.459722 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xd9h7\" (UniqueName: \"kubernetes.io/projected/3c9b6be4-2165-40f4-9a40-28f06b47739f-kube-api-access-xd9h7\") pod \"dns-operator-controller-manager-844548ff4c-sd5j6\" (UID: \"3c9b6be4-2165-40f4-9a40-28f06b47739f\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-sd5j6" Apr 16 15:04:09.472082 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:09.472053 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd9h7\" (UniqueName: \"kubernetes.io/projected/3c9b6be4-2165-40f4-9a40-28f06b47739f-kube-api-access-xd9h7\") pod \"dns-operator-controller-manager-844548ff4c-sd5j6\" (UID: \"3c9b6be4-2165-40f4-9a40-28f06b47739f\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-sd5j6" Apr 16 15:04:09.569613 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:09.569583 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-sd5j6" Apr 16 15:04:09.702478 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:09.702456 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-sd5j6"] Apr 16 15:04:09.704594 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:04:09.704566 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c9b6be4_2165_40f4_9a40_28f06b47739f.slice/crio-de72a1f23e42a6f5ae7f6b7d145370804be4bd265374ea7aeb7b4ff63b161d32 WatchSource:0}: Error finding container de72a1f23e42a6f5ae7f6b7d145370804be4bd265374ea7aeb7b4ff63b161d32: Status 404 returned error can't find the container with id de72a1f23e42a6f5ae7f6b7d145370804be4bd265374ea7aeb7b4ff63b161d32 Apr 16 15:04:10.215758 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:10.215723 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-sd5j6" event={"ID":"3c9b6be4-2165-40f4-9a40-28f06b47739f","Type":"ContainerStarted","Data":"de72a1f23e42a6f5ae7f6b7d145370804be4bd265374ea7aeb7b4ff63b161d32"} Apr 16 15:04:12.225835 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:12.225804 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-sd5j6" event={"ID":"3c9b6be4-2165-40f4-9a40-28f06b47739f","Type":"ContainerStarted","Data":"d6d6767d108194ac2f8f335b6d0ed2455fd89f29972f74a777d7aa11dce9c3e8"} Apr 16 15:04:12.226185 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:12.225922 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-sd5j6" Apr 16 15:04:12.241805 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:12.241758 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-sd5j6" podStartSLOduration=0.807037887 podStartE2EDuration="3.241735497s" podCreationTimestamp="2026-04-16 15:04:09 +0000 UTC" firstStartedPulling="2026-04-16 15:04:09.706518169 +0000 UTC m=+688.470470262" lastFinishedPulling="2026-04-16 15:04:12.141215779 +0000 UTC m=+690.905167872" observedRunningTime="2026-04-16 15:04:12.241114575 +0000 UTC m=+691.005066686" watchObservedRunningTime="2026-04-16 15:04:12.241735497 +0000 UTC m=+691.005687608" Apr 16 15:04:12.943840 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:12.943809 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-j7ztw"] Apr 16 15:04:12.946472 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:12.946452 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-j7ztw" Apr 16 15:04:12.949642 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:12.949624 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-pqzvr\"" Apr 16 15:04:12.969915 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:12.969326 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-j7ztw"] Apr 16 15:04:12.990761 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:12.990735 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c25sp\" (UniqueName: \"kubernetes.io/projected/4279f8f7-70f0-46cf-9340-bfe3b343847f-kube-api-access-c25sp\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-j7ztw\" (UID: \"4279f8f7-70f0-46cf-9340-bfe3b343847f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-j7ztw" Apr 16 15:04:12.990839 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:12.990773 2568 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/4279f8f7-70f0-46cf-9340-bfe3b343847f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-j7ztw\" (UID: \"4279f8f7-70f0-46cf-9340-bfe3b343847f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-j7ztw" Apr 16 15:04:13.091158 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:13.091127 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c25sp\" (UniqueName: \"kubernetes.io/projected/4279f8f7-70f0-46cf-9340-bfe3b343847f-kube-api-access-c25sp\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-j7ztw\" (UID: \"4279f8f7-70f0-46cf-9340-bfe3b343847f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-j7ztw" Apr 16 15:04:13.091270 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:13.091171 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/4279f8f7-70f0-46cf-9340-bfe3b343847f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-j7ztw\" (UID: \"4279f8f7-70f0-46cf-9340-bfe3b343847f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-j7ztw" Apr 16 15:04:13.091516 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:13.091496 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/4279f8f7-70f0-46cf-9340-bfe3b343847f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-j7ztw\" (UID: \"4279f8f7-70f0-46cf-9340-bfe3b343847f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-j7ztw" Apr 16 15:04:13.100027 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:13.100006 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-c25sp\" (UniqueName: \"kubernetes.io/projected/4279f8f7-70f0-46cf-9340-bfe3b343847f-kube-api-access-c25sp\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-j7ztw\" (UID: \"4279f8f7-70f0-46cf-9340-bfe3b343847f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-j7ztw" Apr 16 15:04:13.256134 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:13.256064 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-j7ztw" Apr 16 15:04:13.380953 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:04:13.380921 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4279f8f7_70f0_46cf_9340_bfe3b343847f.slice/crio-58565b051aeedf8ad947919eb33260630eadf83c87f8945e6855b776cdc44129 WatchSource:0}: Error finding container 58565b051aeedf8ad947919eb33260630eadf83c87f8945e6855b776cdc44129: Status 404 returned error can't find the container with id 58565b051aeedf8ad947919eb33260630eadf83c87f8945e6855b776cdc44129 Apr 16 15:04:13.381076 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:13.380963 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-j7ztw"] Apr 16 15:04:14.239564 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:14.239530 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-j7ztw" event={"ID":"4279f8f7-70f0-46cf-9340-bfe3b343847f","Type":"ContainerStarted","Data":"58565b051aeedf8ad947919eb33260630eadf83c87f8945e6855b776cdc44129"} Apr 16 15:04:18.258966 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:18.258935 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-j7ztw" 
event={"ID":"4279f8f7-70f0-46cf-9340-bfe3b343847f","Type":"ContainerStarted","Data":"1b9f61c8a2397e8e18bdc9a7ce45be681756987735e1874487eaa2df3f4f6498"} Apr 16 15:04:18.259358 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:18.259071 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-j7ztw" Apr 16 15:04:18.282734 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:18.282684 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-j7ztw" podStartSLOduration=2.473807039 podStartE2EDuration="6.282670462s" podCreationTimestamp="2026-04-16 15:04:12 +0000 UTC" firstStartedPulling="2026-04-16 15:04:13.383516434 +0000 UTC m=+692.147468523" lastFinishedPulling="2026-04-16 15:04:17.19237985 +0000 UTC m=+695.956331946" observedRunningTime="2026-04-16 15:04:18.281342031 +0000 UTC m=+697.045294142" watchObservedRunningTime="2026-04-16 15:04:18.282670462 +0000 UTC m=+697.046622571" Apr 16 15:04:23.235009 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:23.234977 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-sd5j6" Apr 16 15:04:29.265829 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:04:29.265793 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-j7ztw" Apr 16 15:05:02.913417 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:02.913345 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-pvt7l"] Apr 16 15:05:02.922611 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:02.922585 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-pvt7l" Apr 16 15:05:02.924929 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:02.924909 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 16 15:05:02.925228 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:02.925213 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-2vbh9\"" Apr 16 15:05:02.938538 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:02.938518 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-pvt7l"] Apr 16 15:05:03.014047 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:03.014018 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-pvt7l"] Apr 16 15:05:03.090876 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:03.090848 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/d27bd8c4-f22c-40a1-a943-f5302429a818-config-file\") pod \"limitador-limitador-64c8f475fb-pvt7l\" (UID: \"d27bd8c4-f22c-40a1-a943-f5302429a818\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-pvt7l" Apr 16 15:05:03.091026 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:03.090997 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5txfp\" (UniqueName: \"kubernetes.io/projected/d27bd8c4-f22c-40a1-a943-f5302429a818-kube-api-access-5txfp\") pod \"limitador-limitador-64c8f475fb-pvt7l\" (UID: \"d27bd8c4-f22c-40a1-a943-f5302429a818\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-pvt7l" Apr 16 15:05:03.191803 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:03.191728 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5txfp\" 
(UniqueName: \"kubernetes.io/projected/d27bd8c4-f22c-40a1-a943-f5302429a818-kube-api-access-5txfp\") pod \"limitador-limitador-64c8f475fb-pvt7l\" (UID: \"d27bd8c4-f22c-40a1-a943-f5302429a818\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-pvt7l" Apr 16 15:05:03.191803 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:03.191775 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/d27bd8c4-f22c-40a1-a943-f5302429a818-config-file\") pod \"limitador-limitador-64c8f475fb-pvt7l\" (UID: \"d27bd8c4-f22c-40a1-a943-f5302429a818\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-pvt7l" Apr 16 15:05:03.192360 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:03.192342 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/d27bd8c4-f22c-40a1-a943-f5302429a818-config-file\") pod \"limitador-limitador-64c8f475fb-pvt7l\" (UID: \"d27bd8c4-f22c-40a1-a943-f5302429a818\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-pvt7l" Apr 16 15:05:03.199363 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:03.199344 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5txfp\" (UniqueName: \"kubernetes.io/projected/d27bd8c4-f22c-40a1-a943-f5302429a818-kube-api-access-5txfp\") pod \"limitador-limitador-64c8f475fb-pvt7l\" (UID: \"d27bd8c4-f22c-40a1-a943-f5302429a818\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-pvt7l" Apr 16 15:05:03.232633 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:03.232607 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-pvt7l" Apr 16 15:05:03.353688 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:03.353656 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-pvt7l"] Apr 16 15:05:03.356006 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:05:03.355977 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd27bd8c4_f22c_40a1_a943_f5302429a818.slice/crio-09a86eed046b006613c8bd6cb619f06bc2a7254faa1f24e654c363139231d981 WatchSource:0}: Error finding container 09a86eed046b006613c8bd6cb619f06bc2a7254faa1f24e654c363139231d981: Status 404 returned error can't find the container with id 09a86eed046b006613c8bd6cb619f06bc2a7254faa1f24e654c363139231d981 Apr 16 15:05:03.446522 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:03.446459 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-pvt7l" event={"ID":"d27bd8c4-f22c-40a1-a943-f5302429a818","Type":"ContainerStarted","Data":"09a86eed046b006613c8bd6cb619f06bc2a7254faa1f24e654c363139231d981"} Apr 16 15:05:08.469819 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:08.469784 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-pvt7l" event={"ID":"d27bd8c4-f22c-40a1-a943-f5302429a818","Type":"ContainerStarted","Data":"85ffba767a0d1aa582232e4b746840592fca4016d2b7112fe6ffd87995aa512e"} Apr 16 15:05:08.470250 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:08.469925 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-pvt7l" Apr 16 15:05:08.512738 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:08.512681 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-64c8f475fb-pvt7l" podStartSLOduration=2.241609884 
podStartE2EDuration="6.512666567s" podCreationTimestamp="2026-04-16 15:05:02 +0000 UTC" firstStartedPulling="2026-04-16 15:05:03.357937908 +0000 UTC m=+742.121889998" lastFinishedPulling="2026-04-16 15:05:07.628994584 +0000 UTC m=+746.392946681" observedRunningTime="2026-04-16 15:05:08.510507978 +0000 UTC m=+747.274460088" watchObservedRunningTime="2026-04-16 15:05:08.512666567 +0000 UTC m=+747.276618678" Apr 16 15:05:16.253601 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:16.253560 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-pvt7l"] Apr 16 15:05:16.254094 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:16.253853 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-64c8f475fb-pvt7l" podUID="d27bd8c4-f22c-40a1-a943-f5302429a818" containerName="limitador" containerID="cri-o://85ffba767a0d1aa582232e4b746840592fca4016d2b7112fe6ffd87995aa512e" gracePeriod=30 Apr 16 15:05:16.254581 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:16.254561 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-pvt7l" Apr 16 15:05:16.798387 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:16.798367 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-pvt7l" Apr 16 15:05:16.900222 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:16.900196 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/d27bd8c4-f22c-40a1-a943-f5302429a818-config-file\") pod \"d27bd8c4-f22c-40a1-a943-f5302429a818\" (UID: \"d27bd8c4-f22c-40a1-a943-f5302429a818\") " Apr 16 15:05:16.900338 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:16.900227 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5txfp\" (UniqueName: \"kubernetes.io/projected/d27bd8c4-f22c-40a1-a943-f5302429a818-kube-api-access-5txfp\") pod \"d27bd8c4-f22c-40a1-a943-f5302429a818\" (UID: \"d27bd8c4-f22c-40a1-a943-f5302429a818\") " Apr 16 15:05:16.900530 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:16.900508 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d27bd8c4-f22c-40a1-a943-f5302429a818-config-file" (OuterVolumeSpecName: "config-file") pod "d27bd8c4-f22c-40a1-a943-f5302429a818" (UID: "d27bd8c4-f22c-40a1-a943-f5302429a818"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:05:16.902364 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:16.902334 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d27bd8c4-f22c-40a1-a943-f5302429a818-kube-api-access-5txfp" (OuterVolumeSpecName: "kube-api-access-5txfp") pod "d27bd8c4-f22c-40a1-a943-f5302429a818" (UID: "d27bd8c4-f22c-40a1-a943-f5302429a818"). InnerVolumeSpecName "kube-api-access-5txfp". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:05:17.000744 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:17.000719 2568 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/d27bd8c4-f22c-40a1-a943-f5302429a818-config-file\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:05:17.000744 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:17.000741 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5txfp\" (UniqueName: \"kubernetes.io/projected/d27bd8c4-f22c-40a1-a943-f5302429a818-kube-api-access-5txfp\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:05:17.512064 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:17.512033 2568 generic.go:358] "Generic (PLEG): container finished" podID="d27bd8c4-f22c-40a1-a943-f5302429a818" containerID="85ffba767a0d1aa582232e4b746840592fca4016d2b7112fe6ffd87995aa512e" exitCode=0 Apr 16 15:05:17.512435 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:17.512091 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-pvt7l" event={"ID":"d27bd8c4-f22c-40a1-a943-f5302429a818","Type":"ContainerDied","Data":"85ffba767a0d1aa582232e4b746840592fca4016d2b7112fe6ffd87995aa512e"} Apr 16 15:05:17.512435 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:17.512117 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-pvt7l" event={"ID":"d27bd8c4-f22c-40a1-a943-f5302429a818","Type":"ContainerDied","Data":"09a86eed046b006613c8bd6cb619f06bc2a7254faa1f24e654c363139231d981"} Apr 16 15:05:17.512435 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:17.512121 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-pvt7l" Apr 16 15:05:17.512435 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:17.512132 2568 scope.go:117] "RemoveContainer" containerID="85ffba767a0d1aa582232e4b746840592fca4016d2b7112fe6ffd87995aa512e" Apr 16 15:05:17.521549 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:17.521531 2568 scope.go:117] "RemoveContainer" containerID="85ffba767a0d1aa582232e4b746840592fca4016d2b7112fe6ffd87995aa512e" Apr 16 15:05:17.521909 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:05:17.521863 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85ffba767a0d1aa582232e4b746840592fca4016d2b7112fe6ffd87995aa512e\": container with ID starting with 85ffba767a0d1aa582232e4b746840592fca4016d2b7112fe6ffd87995aa512e not found: ID does not exist" containerID="85ffba767a0d1aa582232e4b746840592fca4016d2b7112fe6ffd87995aa512e" Apr 16 15:05:17.522014 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:17.521908 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85ffba767a0d1aa582232e4b746840592fca4016d2b7112fe6ffd87995aa512e"} err="failed to get container status \"85ffba767a0d1aa582232e4b746840592fca4016d2b7112fe6ffd87995aa512e\": rpc error: code = NotFound desc = could not find container \"85ffba767a0d1aa582232e4b746840592fca4016d2b7112fe6ffd87995aa512e\": container with ID starting with 85ffba767a0d1aa582232e4b746840592fca4016d2b7112fe6ffd87995aa512e not found: ID does not exist" Apr 16 15:05:17.534733 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:17.534708 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-pvt7l"] Apr 16 15:05:17.539054 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:17.539033 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-pvt7l"] Apr 16 15:05:17.863872 
ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:17.863849 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d27bd8c4-f22c-40a1-a943-f5302429a818" path="/var/lib/kubelet/pods/d27bd8c4-f22c-40a1-a943-f5302429a818/volumes" Apr 16 15:05:24.760001 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:24.759970 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-68bd676465-c285d"] Apr 16 15:05:24.760366 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:24.760351 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d27bd8c4-f22c-40a1-a943-f5302429a818" containerName="limitador" Apr 16 15:05:24.760366 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:24.760367 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d27bd8c4-f22c-40a1-a943-f5302429a818" containerName="limitador" Apr 16 15:05:24.760461 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:24.760451 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="d27bd8c4-f22c-40a1-a943-f5302429a818" containerName="limitador" Apr 16 15:05:24.763719 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:24.763703 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-68bd676465-c285d" Apr 16 15:05:24.766939 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:24.766914 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-l9fgz\"" Apr 16 15:05:24.767096 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:24.766962 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 16 15:05:24.769996 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:24.769973 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-c285d"] Apr 16 15:05:24.860188 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:24.860168 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/52899b2c-7e83-4a5a-ac27-cda93f1e21e2-tls-cert\") pod \"authorino-68bd676465-c285d\" (UID: \"52899b2c-7e83-4a5a-ac27-cda93f1e21e2\") " pod="kuadrant-system/authorino-68bd676465-c285d" Apr 16 15:05:24.860280 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:24.860211 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67t7z\" (UniqueName: \"kubernetes.io/projected/52899b2c-7e83-4a5a-ac27-cda93f1e21e2-kube-api-access-67t7z\") pod \"authorino-68bd676465-c285d\" (UID: \"52899b2c-7e83-4a5a-ac27-cda93f1e21e2\") " pod="kuadrant-system/authorino-68bd676465-c285d" Apr 16 15:05:24.961292 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:24.961272 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/52899b2c-7e83-4a5a-ac27-cda93f1e21e2-tls-cert\") pod \"authorino-68bd676465-c285d\" (UID: \"52899b2c-7e83-4a5a-ac27-cda93f1e21e2\") " pod="kuadrant-system/authorino-68bd676465-c285d" Apr 16 15:05:24.961381 ip-10-0-140-83 kubenswrapper[2568]: I0416 
15:05:24.961341 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-67t7z\" (UniqueName: \"kubernetes.io/projected/52899b2c-7e83-4a5a-ac27-cda93f1e21e2-kube-api-access-67t7z\") pod \"authorino-68bd676465-c285d\" (UID: \"52899b2c-7e83-4a5a-ac27-cda93f1e21e2\") " pod="kuadrant-system/authorino-68bd676465-c285d" Apr 16 15:05:24.963700 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:24.963681 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/52899b2c-7e83-4a5a-ac27-cda93f1e21e2-tls-cert\") pod \"authorino-68bd676465-c285d\" (UID: \"52899b2c-7e83-4a5a-ac27-cda93f1e21e2\") " pod="kuadrant-system/authorino-68bd676465-c285d" Apr 16 15:05:24.968198 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:24.968179 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-67t7z\" (UniqueName: \"kubernetes.io/projected/52899b2c-7e83-4a5a-ac27-cda93f1e21e2-kube-api-access-67t7z\") pod \"authorino-68bd676465-c285d\" (UID: \"52899b2c-7e83-4a5a-ac27-cda93f1e21e2\") " pod="kuadrant-system/authorino-68bd676465-c285d" Apr 16 15:05:25.073730 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:25.073691 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-68bd676465-c285d" Apr 16 15:05:25.195245 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:25.195217 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-c285d"] Apr 16 15:05:25.197263 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:05:25.197237 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52899b2c_7e83_4a5a_ac27_cda93f1e21e2.slice/crio-19909841fa097616198237f00f2cad7baefcc05b228efda3fb24a7e3204123fc WatchSource:0}: Error finding container 19909841fa097616198237f00f2cad7baefcc05b228efda3fb24a7e3204123fc: Status 404 returned error can't find the container with id 19909841fa097616198237f00f2cad7baefcc05b228efda3fb24a7e3204123fc Apr 16 15:05:25.545832 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:25.545770 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-c285d" event={"ID":"52899b2c-7e83-4a5a-ac27-cda93f1e21e2","Type":"ContainerStarted","Data":"19909841fa097616198237f00f2cad7baefcc05b228efda3fb24a7e3204123fc"} Apr 16 15:05:27.554864 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:27.554826 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-c285d" event={"ID":"52899b2c-7e83-4a5a-ac27-cda93f1e21e2","Type":"ContainerStarted","Data":"68f60d7f7bddc06396576e0bc6dae1a3a7606ad2bb9d57e6470fb8e442113ece"} Apr 16 15:05:27.571461 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:27.571413 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-68bd676465-c285d" podStartSLOduration=2.003852635 podStartE2EDuration="3.571397822s" podCreationTimestamp="2026-04-16 15:05:24 +0000 UTC" firstStartedPulling="2026-04-16 15:05:25.198520945 +0000 UTC m=+763.962473036" lastFinishedPulling="2026-04-16 15:05:26.766066129 +0000 UTC m=+765.530018223" 
observedRunningTime="2026-04-16 15:05:27.56925941 +0000 UTC m=+766.333211521" watchObservedRunningTime="2026-04-16 15:05:27.571397822 +0000 UTC m=+766.335349932" Apr 16 15:05:35.034859 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:35.034829 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-42vf2"] Apr 16 15:05:35.042093 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:35.042071 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42vf2" Apr 16 15:05:35.048255 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:35.048229 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-42vf2"] Apr 16 15:05:35.145094 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:35.145065 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/c9a9c4f3-e124-47b0-8eb1-04932d882266-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-42vf2\" (UID: \"c9a9c4f3-e124-47b0-8eb1-04932d882266\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42vf2" Apr 16 15:05:35.145094 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:35.145097 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwk88\" (UniqueName: \"kubernetes.io/projected/c9a9c4f3-e124-47b0-8eb1-04932d882266-kube-api-access-lwk88\") pod \"istiod-openshift-gateway-55ff986f96-42vf2\" (UID: \"c9a9c4f3-e124-47b0-8eb1-04932d882266\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42vf2" Apr 16 15:05:35.145316 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:35.145129 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/c9a9c4f3-e124-47b0-8eb1-04932d882266-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-42vf2\" (UID: \"c9a9c4f3-e124-47b0-8eb1-04932d882266\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42vf2" Apr 16 15:05:35.145316 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:35.145164 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/c9a9c4f3-e124-47b0-8eb1-04932d882266-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-42vf2\" (UID: \"c9a9c4f3-e124-47b0-8eb1-04932d882266\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42vf2" Apr 16 15:05:35.145316 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:35.145295 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/c9a9c4f3-e124-47b0-8eb1-04932d882266-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-42vf2\" (UID: \"c9a9c4f3-e124-47b0-8eb1-04932d882266\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42vf2" Apr 16 15:05:35.145429 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:35.145362 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c9a9c4f3-e124-47b0-8eb1-04932d882266-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-42vf2\" (UID: \"c9a9c4f3-e124-47b0-8eb1-04932d882266\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42vf2" Apr 16 15:05:35.145429 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:35.145401 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/c9a9c4f3-e124-47b0-8eb1-04932d882266-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-42vf2\" (UID: 
\"c9a9c4f3-e124-47b0-8eb1-04932d882266\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42vf2" Apr 16 15:05:35.246003 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:35.245976 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/c9a9c4f3-e124-47b0-8eb1-04932d882266-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-42vf2\" (UID: \"c9a9c4f3-e124-47b0-8eb1-04932d882266\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42vf2" Apr 16 15:05:35.246003 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:35.246006 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lwk88\" (UniqueName: \"kubernetes.io/projected/c9a9c4f3-e124-47b0-8eb1-04932d882266-kube-api-access-lwk88\") pod \"istiod-openshift-gateway-55ff986f96-42vf2\" (UID: \"c9a9c4f3-e124-47b0-8eb1-04932d882266\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42vf2" Apr 16 15:05:35.246172 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:35.246030 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c9a9c4f3-e124-47b0-8eb1-04932d882266-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-42vf2\" (UID: \"c9a9c4f3-e124-47b0-8eb1-04932d882266\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42vf2" Apr 16 15:05:35.246172 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:35.246058 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/c9a9c4f3-e124-47b0-8eb1-04932d882266-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-42vf2\" (UID: \"c9a9c4f3-e124-47b0-8eb1-04932d882266\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42vf2" Apr 16 15:05:35.246172 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:35.246097 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/c9a9c4f3-e124-47b0-8eb1-04932d882266-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-42vf2\" (UID: \"c9a9c4f3-e124-47b0-8eb1-04932d882266\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42vf2" Apr 16 15:05:35.246172 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:35.246134 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c9a9c4f3-e124-47b0-8eb1-04932d882266-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-42vf2\" (UID: \"c9a9c4f3-e124-47b0-8eb1-04932d882266\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42vf2" Apr 16 15:05:35.246172 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:35.246169 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/c9a9c4f3-e124-47b0-8eb1-04932d882266-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-42vf2\" (UID: \"c9a9c4f3-e124-47b0-8eb1-04932d882266\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42vf2" Apr 16 15:05:35.247032 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:35.247008 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/c9a9c4f3-e124-47b0-8eb1-04932d882266-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-42vf2\" (UID: \"c9a9c4f3-e124-47b0-8eb1-04932d882266\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42vf2" Apr 16 15:05:35.248586 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:35.248551 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/c9a9c4f3-e124-47b0-8eb1-04932d882266-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-42vf2\" (UID: 
\"c9a9c4f3-e124-47b0-8eb1-04932d882266\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42vf2" Apr 16 15:05:35.248586 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:35.248564 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/c9a9c4f3-e124-47b0-8eb1-04932d882266-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-42vf2\" (UID: \"c9a9c4f3-e124-47b0-8eb1-04932d882266\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42vf2" Apr 16 15:05:35.248748 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:35.248686 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/c9a9c4f3-e124-47b0-8eb1-04932d882266-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-42vf2\" (UID: \"c9a9c4f3-e124-47b0-8eb1-04932d882266\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42vf2" Apr 16 15:05:35.248848 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:35.248828 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c9a9c4f3-e124-47b0-8eb1-04932d882266-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-42vf2\" (UID: \"c9a9c4f3-e124-47b0-8eb1-04932d882266\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42vf2" Apr 16 15:05:35.260335 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:35.260314 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwk88\" (UniqueName: \"kubernetes.io/projected/c9a9c4f3-e124-47b0-8eb1-04932d882266-kube-api-access-lwk88\") pod \"istiod-openshift-gateway-55ff986f96-42vf2\" (UID: \"c9a9c4f3-e124-47b0-8eb1-04932d882266\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42vf2" Apr 16 15:05:35.260642 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:35.260620 2568 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c9a9c4f3-e124-47b0-8eb1-04932d882266-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-42vf2\" (UID: \"c9a9c4f3-e124-47b0-8eb1-04932d882266\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42vf2" Apr 16 15:05:35.352340 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:35.352315 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42vf2" Apr 16 15:05:35.485086 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:35.485063 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-42vf2"] Apr 16 15:05:35.487043 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:05:35.487015 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9a9c4f3_e124_47b0_8eb1_04932d882266.slice/crio-ef3fe7c3c02120376e2ebed349b3d0bd9eb55d8521d2f21ee912ba95dce04fc7 WatchSource:0}: Error finding container ef3fe7c3c02120376e2ebed349b3d0bd9eb55d8521d2f21ee912ba95dce04fc7: Status 404 returned error can't find the container with id ef3fe7c3c02120376e2ebed349b3d0bd9eb55d8521d2f21ee912ba95dce04fc7 Apr 16 15:05:35.489073 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:35.489039 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 15:05:35.489145 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:35.489105 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 15:05:35.590634 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:35.590604 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42vf2" 
event={"ID":"c9a9c4f3-e124-47b0-8eb1-04932d882266","Type":"ContainerStarted","Data":"4aeff11b4a433240faac3716d5ef30b5288c6c946e53e4329cadff1ad1902114"} Apr 16 15:05:35.590792 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:35.590641 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42vf2" event={"ID":"c9a9c4f3-e124-47b0-8eb1-04932d882266","Type":"ContainerStarted","Data":"ef3fe7c3c02120376e2ebed349b3d0bd9eb55d8521d2f21ee912ba95dce04fc7"} Apr 16 15:05:35.590792 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:35.590657 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42vf2" Apr 16 15:05:35.614046 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:35.613943 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42vf2" podStartSLOduration=0.613923413 podStartE2EDuration="613.923413ms" podCreationTimestamp="2026-04-16 15:05:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:05:35.610361864 +0000 UTC m=+774.374313975" watchObservedRunningTime="2026-04-16 15:05:35.613923413 +0000 UTC m=+774.377875527" Apr 16 15:05:36.597252 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:36.597215 2568 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-42vf2 container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 16 15:05:36.597667 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:36.597277 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42vf2" podUID="c9a9c4f3-e124-47b0-8eb1-04932d882266" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 
15:05:39.596594 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:39.596571 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42vf2" Apr 16 15:05:39.654591 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:39.654564 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lxfm"] Apr 16 15:05:39.654828 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:39.654806 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lxfm" podUID="ec39bede-46a4-420f-88a8-861d4ce1d289" containerName="discovery" containerID="cri-o://3416e3e3d8b6f6d43459dadcb286b25d156b10147ea3ddb7d3fec6d0542950f0" gracePeriod=30 Apr 16 15:05:39.927026 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:39.927003 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lxfm" Apr 16 15:05:39.988667 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:39.988637 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/ec39bede-46a4-420f-88a8-861d4ce1d289-istio-csr-dns-cert\") pod \"ec39bede-46a4-420f-88a8-861d4ce1d289\" (UID: \"ec39bede-46a4-420f-88a8-861d4ce1d289\") " Apr 16 15:05:39.988822 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:39.988739 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/ec39bede-46a4-420f-88a8-861d4ce1d289-istio-token\") pod \"ec39bede-46a4-420f-88a8-861d4ce1d289\" (UID: \"ec39bede-46a4-420f-88a8-861d4ce1d289\") " Apr 16 15:05:39.988822 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:39.988779 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cacerts\" (UniqueName: 
\"kubernetes.io/secret/ec39bede-46a4-420f-88a8-861d4ce1d289-cacerts\") pod \"ec39bede-46a4-420f-88a8-861d4ce1d289\" (UID: \"ec39bede-46a4-420f-88a8-861d4ce1d289\") " Apr 16 15:05:39.988975 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:39.988822 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/ec39bede-46a4-420f-88a8-861d4ce1d289-istio-csr-ca-configmap\") pod \"ec39bede-46a4-420f-88a8-861d4ce1d289\" (UID: \"ec39bede-46a4-420f-88a8-861d4ce1d289\") " Apr 16 15:05:39.988975 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:39.988941 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/ec39bede-46a4-420f-88a8-861d4ce1d289-local-certs\") pod \"ec39bede-46a4-420f-88a8-861d4ce1d289\" (UID: \"ec39bede-46a4-420f-88a8-861d4ce1d289\") " Apr 16 15:05:39.988975 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:39.988972 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ec39bede-46a4-420f-88a8-861d4ce1d289-istio-kubeconfig\") pod \"ec39bede-46a4-420f-88a8-861d4ce1d289\" (UID: \"ec39bede-46a4-420f-88a8-861d4ce1d289\") " Apr 16 15:05:39.989133 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:39.989001 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtjp8\" (UniqueName: \"kubernetes.io/projected/ec39bede-46a4-420f-88a8-861d4ce1d289-kube-api-access-vtjp8\") pod \"ec39bede-46a4-420f-88a8-861d4ce1d289\" (UID: \"ec39bede-46a4-420f-88a8-861d4ce1d289\") " Apr 16 15:05:39.989640 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:39.989243 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec39bede-46a4-420f-88a8-861d4ce1d289-istio-csr-ca-configmap" (OuterVolumeSpecName: "istio-csr-ca-configmap") pod 
"ec39bede-46a4-420f-88a8-861d4ce1d289" (UID: "ec39bede-46a4-420f-88a8-861d4ce1d289"). InnerVolumeSpecName "istio-csr-ca-configmap". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:05:39.989640 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:39.989386 2568 reconciler_common.go:299] "Volume detached for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/ec39bede-46a4-420f-88a8-861d4ce1d289-istio-csr-ca-configmap\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:05:39.991468 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:39.991439 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec39bede-46a4-420f-88a8-861d4ce1d289-istio-token" (OuterVolumeSpecName: "istio-token") pod "ec39bede-46a4-420f-88a8-861d4ce1d289" (UID: "ec39bede-46a4-420f-88a8-861d4ce1d289"). InnerVolumeSpecName "istio-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:05:39.991729 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:39.991680 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec39bede-46a4-420f-88a8-861d4ce1d289-cacerts" (OuterVolumeSpecName: "cacerts") pod "ec39bede-46a4-420f-88a8-861d4ce1d289" (UID: "ec39bede-46a4-420f-88a8-861d4ce1d289"). InnerVolumeSpecName "cacerts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:05:39.991825 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:39.991761 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec39bede-46a4-420f-88a8-861d4ce1d289-istio-kubeconfig" (OuterVolumeSpecName: "istio-kubeconfig") pod "ec39bede-46a4-420f-88a8-861d4ce1d289" (UID: "ec39bede-46a4-420f-88a8-861d4ce1d289"). InnerVolumeSpecName "istio-kubeconfig". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:05:39.992069 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:39.992048 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec39bede-46a4-420f-88a8-861d4ce1d289-local-certs" (OuterVolumeSpecName: "local-certs") pod "ec39bede-46a4-420f-88a8-861d4ce1d289" (UID: "ec39bede-46a4-420f-88a8-861d4ce1d289"). InnerVolumeSpecName "local-certs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:05:39.992189 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:39.992153 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec39bede-46a4-420f-88a8-861d4ce1d289-kube-api-access-vtjp8" (OuterVolumeSpecName: "kube-api-access-vtjp8") pod "ec39bede-46a4-420f-88a8-861d4ce1d289" (UID: "ec39bede-46a4-420f-88a8-861d4ce1d289"). InnerVolumeSpecName "kube-api-access-vtjp8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:05:39.992795 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:39.992776 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec39bede-46a4-420f-88a8-861d4ce1d289-istio-csr-dns-cert" (OuterVolumeSpecName: "istio-csr-dns-cert") pod "ec39bede-46a4-420f-88a8-861d4ce1d289" (UID: "ec39bede-46a4-420f-88a8-861d4ce1d289"). InnerVolumeSpecName "istio-csr-dns-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:05:40.090126 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:40.090095 2568 reconciler_common.go:299] "Volume detached for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/ec39bede-46a4-420f-88a8-861d4ce1d289-local-certs\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:05:40.090126 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:40.090121 2568 reconciler_common.go:299] "Volume detached for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ec39bede-46a4-420f-88a8-861d4ce1d289-istio-kubeconfig\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:05:40.090284 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:40.090135 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vtjp8\" (UniqueName: \"kubernetes.io/projected/ec39bede-46a4-420f-88a8-861d4ce1d289-kube-api-access-vtjp8\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:05:40.090284 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:40.090148 2568 reconciler_common.go:299] "Volume detached for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/ec39bede-46a4-420f-88a8-861d4ce1d289-istio-csr-dns-cert\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:05:40.090284 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:40.090161 2568 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/ec39bede-46a4-420f-88a8-861d4ce1d289-istio-token\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:05:40.090284 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:40.090173 2568 reconciler_common.go:299] "Volume detached for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/ec39bede-46a4-420f-88a8-861d4ce1d289-cacerts\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:05:40.610633 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:40.610598 2568 
generic.go:358] "Generic (PLEG): container finished" podID="ec39bede-46a4-420f-88a8-861d4ce1d289" containerID="3416e3e3d8b6f6d43459dadcb286b25d156b10147ea3ddb7d3fec6d0542950f0" exitCode=0 Apr 16 15:05:40.611081 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:40.610641 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lxfm" event={"ID":"ec39bede-46a4-420f-88a8-861d4ce1d289","Type":"ContainerDied","Data":"3416e3e3d8b6f6d43459dadcb286b25d156b10147ea3ddb7d3fec6d0542950f0"} Apr 16 15:05:40.611081 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:40.610656 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lxfm" Apr 16 15:05:40.611081 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:40.610675 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lxfm" event={"ID":"ec39bede-46a4-420f-88a8-861d4ce1d289","Type":"ContainerDied","Data":"bcc77456c687bd22ceab202f99b05db94a6f2babf71c4b5c2c4b4a2ba429f412"} Apr 16 15:05:40.611081 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:40.610695 2568 scope.go:117] "RemoveContainer" containerID="3416e3e3d8b6f6d43459dadcb286b25d156b10147ea3ddb7d3fec6d0542950f0" Apr 16 15:05:40.620078 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:40.620060 2568 scope.go:117] "RemoveContainer" containerID="3416e3e3d8b6f6d43459dadcb286b25d156b10147ea3ddb7d3fec6d0542950f0" Apr 16 15:05:40.620334 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:05:40.620316 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3416e3e3d8b6f6d43459dadcb286b25d156b10147ea3ddb7d3fec6d0542950f0\": container with ID starting with 3416e3e3d8b6f6d43459dadcb286b25d156b10147ea3ddb7d3fec6d0542950f0 not found: ID does not exist" containerID="3416e3e3d8b6f6d43459dadcb286b25d156b10147ea3ddb7d3fec6d0542950f0" Apr 
16 15:05:40.620393 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:40.620347 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3416e3e3d8b6f6d43459dadcb286b25d156b10147ea3ddb7d3fec6d0542950f0"} err="failed to get container status \"3416e3e3d8b6f6d43459dadcb286b25d156b10147ea3ddb7d3fec6d0542950f0\": rpc error: code = NotFound desc = could not find container \"3416e3e3d8b6f6d43459dadcb286b25d156b10147ea3ddb7d3fec6d0542950f0\": container with ID starting with 3416e3e3d8b6f6d43459dadcb286b25d156b10147ea3ddb7d3fec6d0542950f0 not found: ID does not exist" Apr 16 15:05:40.633610 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:40.633586 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lxfm"] Apr 16 15:05:40.637029 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:40.637006 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lxfm"] Apr 16 15:05:41.864109 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:41.864080 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec39bede-46a4-420f-88a8-861d4ce1d289" path="/var/lib/kubelet/pods/ec39bede-46a4-420f-88a8-861d4ce1d289/volumes" Apr 16 15:05:43.197410 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.197377 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-7669bdc57-22gz8"] Apr 16 15:05:43.198046 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.198026 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec39bede-46a4-420f-88a8-861d4ce1d289" containerName="discovery" Apr 16 15:05:43.198046 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.198048 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec39bede-46a4-420f-88a8-861d4ce1d289" containerName="discovery" Apr 16 15:05:43.198223 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.198161 2568 
memory_manager.go:356] "RemoveStaleState removing state" podUID="ec39bede-46a4-420f-88a8-861d4ce1d289" containerName="discovery" Apr 16 15:05:43.207102 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.207080 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7669bdc57-22gz8" Apr 16 15:05:43.208712 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.208688 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7669bdc57-22gz8"] Apr 16 15:05:43.211374 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.211354 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 15:05:43.211484 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.211375 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-wsmw6\"" Apr 16 15:05:43.211651 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.211634 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 15:05:43.211735 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.211701 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 16 15:05:43.211931 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.211910 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-79498b55f8-rdp2j"] Apr 16 15:05:43.216451 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.216415 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-79498b55f8-rdp2j" Apr 16 15:05:43.219521 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.219504 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 16 15:05:43.221267 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.220826 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-9txrd\"" Apr 16 15:05:43.223327 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.223232 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-79498b55f8-rdp2j"] Apr 16 15:05:43.244492 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.244456 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-xprxk"] Apr 16 15:05:43.251256 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.251225 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-xprxk" Apr 16 15:05:43.254327 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.254293 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 15:05:43.254475 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.254445 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-c65rw\"" Apr 16 15:05:43.257379 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.257361 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-xprxk"] Apr 16 15:05:43.320524 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.320494 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vrxk\" (UniqueName: \"kubernetes.io/projected/8b950dd9-f975-496f-a8b7-00ad2c5c54b2-kube-api-access-6vrxk\") pod \"llmisvc-controller-manager-79498b55f8-rdp2j\" (UID: \"8b950dd9-f975-496f-a8b7-00ad2c5c54b2\") " pod="kserve/llmisvc-controller-manager-79498b55f8-rdp2j" Apr 16 15:05:43.320677 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.320574 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b950dd9-f975-496f-a8b7-00ad2c5c54b2-cert\") pod \"llmisvc-controller-manager-79498b55f8-rdp2j\" (UID: \"8b950dd9-f975-496f-a8b7-00ad2c5c54b2\") " pod="kserve/llmisvc-controller-manager-79498b55f8-rdp2j" Apr 16 15:05:43.320677 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.320606 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/74ac6b54-28a0-4446-a105-8ef0e061898c-data\") pod \"seaweedfs-86cc847c5c-xprxk\" (UID: \"74ac6b54-28a0-4446-a105-8ef0e061898c\") " pod="kserve/seaweedfs-86cc847c5c-xprxk" Apr 16 15:05:43.320677 ip-10-0-140-83 kubenswrapper[2568]: I0416 
15:05:43.320621 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5888e974-f659-4ff2-b28e-e0f86d3cbfff-cert\") pod \"kserve-controller-manager-7669bdc57-22gz8\" (UID: \"5888e974-f659-4ff2-b28e-e0f86d3cbfff\") " pod="kserve/kserve-controller-manager-7669bdc57-22gz8" Apr 16 15:05:43.320677 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.320663 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj9lj\" (UniqueName: \"kubernetes.io/projected/74ac6b54-28a0-4446-a105-8ef0e061898c-kube-api-access-lj9lj\") pod \"seaweedfs-86cc847c5c-xprxk\" (UID: \"74ac6b54-28a0-4446-a105-8ef0e061898c\") " pod="kserve/seaweedfs-86cc847c5c-xprxk" Apr 16 15:05:43.320873 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.320715 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrdkl\" (UniqueName: \"kubernetes.io/projected/5888e974-f659-4ff2-b28e-e0f86d3cbfff-kube-api-access-mrdkl\") pod \"kserve-controller-manager-7669bdc57-22gz8\" (UID: \"5888e974-f659-4ff2-b28e-e0f86d3cbfff\") " pod="kserve/kserve-controller-manager-7669bdc57-22gz8" Apr 16 15:05:43.421344 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.421316 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mrdkl\" (UniqueName: \"kubernetes.io/projected/5888e974-f659-4ff2-b28e-e0f86d3cbfff-kube-api-access-mrdkl\") pod \"kserve-controller-manager-7669bdc57-22gz8\" (UID: \"5888e974-f659-4ff2-b28e-e0f86d3cbfff\") " pod="kserve/kserve-controller-manager-7669bdc57-22gz8" Apr 16 15:05:43.421501 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.421363 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6vrxk\" (UniqueName: \"kubernetes.io/projected/8b950dd9-f975-496f-a8b7-00ad2c5c54b2-kube-api-access-6vrxk\") pod 
\"llmisvc-controller-manager-79498b55f8-rdp2j\" (UID: \"8b950dd9-f975-496f-a8b7-00ad2c5c54b2\") " pod="kserve/llmisvc-controller-manager-79498b55f8-rdp2j" Apr 16 15:05:43.421501 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.421419 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b950dd9-f975-496f-a8b7-00ad2c5c54b2-cert\") pod \"llmisvc-controller-manager-79498b55f8-rdp2j\" (UID: \"8b950dd9-f975-496f-a8b7-00ad2c5c54b2\") " pod="kserve/llmisvc-controller-manager-79498b55f8-rdp2j" Apr 16 15:05:43.421501 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.421445 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/74ac6b54-28a0-4446-a105-8ef0e061898c-data\") pod \"seaweedfs-86cc847c5c-xprxk\" (UID: \"74ac6b54-28a0-4446-a105-8ef0e061898c\") " pod="kserve/seaweedfs-86cc847c5c-xprxk" Apr 16 15:05:43.421501 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.421461 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5888e974-f659-4ff2-b28e-e0f86d3cbfff-cert\") pod \"kserve-controller-manager-7669bdc57-22gz8\" (UID: \"5888e974-f659-4ff2-b28e-e0f86d3cbfff\") " pod="kserve/kserve-controller-manager-7669bdc57-22gz8" Apr 16 15:05:43.421501 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.421491 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lj9lj\" (UniqueName: \"kubernetes.io/projected/74ac6b54-28a0-4446-a105-8ef0e061898c-kube-api-access-lj9lj\") pod \"seaweedfs-86cc847c5c-xprxk\" (UID: \"74ac6b54-28a0-4446-a105-8ef0e061898c\") " pod="kserve/seaweedfs-86cc847c5c-xprxk" Apr 16 15:05:43.421916 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.421868 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: 
\"kubernetes.io/empty-dir/74ac6b54-28a0-4446-a105-8ef0e061898c-data\") pod \"seaweedfs-86cc847c5c-xprxk\" (UID: \"74ac6b54-28a0-4446-a105-8ef0e061898c\") " pod="kserve/seaweedfs-86cc847c5c-xprxk" Apr 16 15:05:43.423823 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.423801 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5888e974-f659-4ff2-b28e-e0f86d3cbfff-cert\") pod \"kserve-controller-manager-7669bdc57-22gz8\" (UID: \"5888e974-f659-4ff2-b28e-e0f86d3cbfff\") " pod="kserve/kserve-controller-manager-7669bdc57-22gz8" Apr 16 15:05:43.424151 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.424132 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b950dd9-f975-496f-a8b7-00ad2c5c54b2-cert\") pod \"llmisvc-controller-manager-79498b55f8-rdp2j\" (UID: \"8b950dd9-f975-496f-a8b7-00ad2c5c54b2\") " pod="kserve/llmisvc-controller-manager-79498b55f8-rdp2j" Apr 16 15:05:43.429546 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.429522 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vrxk\" (UniqueName: \"kubernetes.io/projected/8b950dd9-f975-496f-a8b7-00ad2c5c54b2-kube-api-access-6vrxk\") pod \"llmisvc-controller-manager-79498b55f8-rdp2j\" (UID: \"8b950dd9-f975-496f-a8b7-00ad2c5c54b2\") " pod="kserve/llmisvc-controller-manager-79498b55f8-rdp2j" Apr 16 15:05:43.429930 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.429907 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrdkl\" (UniqueName: \"kubernetes.io/projected/5888e974-f659-4ff2-b28e-e0f86d3cbfff-kube-api-access-mrdkl\") pod \"kserve-controller-manager-7669bdc57-22gz8\" (UID: \"5888e974-f659-4ff2-b28e-e0f86d3cbfff\") " pod="kserve/kserve-controller-manager-7669bdc57-22gz8" Apr 16 15:05:43.430113 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.430094 2568 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-lj9lj\" (UniqueName: \"kubernetes.io/projected/74ac6b54-28a0-4446-a105-8ef0e061898c-kube-api-access-lj9lj\") pod \"seaweedfs-86cc847c5c-xprxk\" (UID: \"74ac6b54-28a0-4446-a105-8ef0e061898c\") " pod="kserve/seaweedfs-86cc847c5c-xprxk" Apr 16 15:05:43.520582 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.520509 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7669bdc57-22gz8" Apr 16 15:05:43.531369 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.531352 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-79498b55f8-rdp2j" Apr 16 15:05:43.562744 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.562711 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-xprxk" Apr 16 15:05:43.683596 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:05:43.683553 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5888e974_f659_4ff2_b28e_e0f86d3cbfff.slice/crio-101575df8dc53b130ee4326097b816c4f9fb987062c7136ca67bf27c02614488 WatchSource:0}: Error finding container 101575df8dc53b130ee4326097b816c4f9fb987062c7136ca67bf27c02614488: Status 404 returned error can't find the container with id 101575df8dc53b130ee4326097b816c4f9fb987062c7136ca67bf27c02614488 Apr 16 15:05:43.683799 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.683775 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7669bdc57-22gz8"] Apr 16 15:05:43.708009 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.707983 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-79498b55f8-rdp2j"] Apr 16 15:05:43.708511 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:05:43.708482 2568 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-pod8b950dd9_f975_496f_a8b7_00ad2c5c54b2.slice/crio-362553bd14b06d93b8195830690169dcfb3a4325e76276552d2470ca965fc188 WatchSource:0}: Error finding container 362553bd14b06d93b8195830690169dcfb3a4325e76276552d2470ca965fc188: Status 404 returned error can't find the container with id 362553bd14b06d93b8195830690169dcfb3a4325e76276552d2470ca965fc188 Apr 16 15:05:43.728796 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:43.728774 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-xprxk"] Apr 16 15:05:43.729605 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:05:43.729587 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74ac6b54_28a0_4446_a105_8ef0e061898c.slice/crio-869fea90a98dbbf38dbcd0b47dc014bfc50303321a308479d5ce18c51eaece3d WatchSource:0}: Error finding container 869fea90a98dbbf38dbcd0b47dc014bfc50303321a308479d5ce18c51eaece3d: Status 404 returned error can't find the container with id 869fea90a98dbbf38dbcd0b47dc014bfc50303321a308479d5ce18c51eaece3d Apr 16 15:05:44.635320 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:44.635276 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7669bdc57-22gz8" event={"ID":"5888e974-f659-4ff2-b28e-e0f86d3cbfff","Type":"ContainerStarted","Data":"101575df8dc53b130ee4326097b816c4f9fb987062c7136ca67bf27c02614488"} Apr 16 15:05:44.637649 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:44.637620 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-xprxk" event={"ID":"74ac6b54-28a0-4446-a105-8ef0e061898c","Type":"ContainerStarted","Data":"869fea90a98dbbf38dbcd0b47dc014bfc50303321a308479d5ce18c51eaece3d"} Apr 16 15:05:44.639344 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:44.639276 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-79498b55f8-rdp2j" 
event={"ID":"8b950dd9-f975-496f-a8b7-00ad2c5c54b2","Type":"ContainerStarted","Data":"362553bd14b06d93b8195830690169dcfb3a4325e76276552d2470ca965fc188"} Apr 16 15:05:48.664643 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:48.664605 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-xprxk" event={"ID":"74ac6b54-28a0-4446-a105-8ef0e061898c","Type":"ContainerStarted","Data":"7f89fac8322018be2b1b767757bf8678ce48d5abb1e299a607469b39ed163ab7"} Apr 16 15:05:48.665068 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:48.664702 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-xprxk" Apr 16 15:05:48.666078 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:48.666056 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-79498b55f8-rdp2j" event={"ID":"8b950dd9-f975-496f-a8b7-00ad2c5c54b2","Type":"ContainerStarted","Data":"e1e2de648d9c4918e56d85d9f06ded0ac88a47e09efc753e27c4943c6da8a8c9"} Apr 16 15:05:48.666192 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:48.666158 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-79498b55f8-rdp2j" Apr 16 15:05:48.667372 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:48.667353 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7669bdc57-22gz8" event={"ID":"5888e974-f659-4ff2-b28e-e0f86d3cbfff","Type":"ContainerStarted","Data":"4161ec662ab7c7893120888b21dde955b9a9b1554dbc2fee0eab99387830e8e9"} Apr 16 15:05:48.667468 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:48.667457 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-7669bdc57-22gz8" Apr 16 15:05:48.681666 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:48.681624 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-xprxk" 
podStartSLOduration=1.105207813 podStartE2EDuration="5.681613484s" podCreationTimestamp="2026-04-16 15:05:43 +0000 UTC" firstStartedPulling="2026-04-16 15:05:43.730850668 +0000 UTC m=+782.494802760" lastFinishedPulling="2026-04-16 15:05:48.307256338 +0000 UTC m=+787.071208431" observedRunningTime="2026-04-16 15:05:48.678496074 +0000 UTC m=+787.442448184" watchObservedRunningTime="2026-04-16 15:05:48.681613484 +0000 UTC m=+787.445565594"
Apr 16 15:05:48.695284 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:48.695236 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-7669bdc57-22gz8" podStartSLOduration=1.211918833 podStartE2EDuration="5.695226629s" podCreationTimestamp="2026-04-16 15:05:43 +0000 UTC" firstStartedPulling="2026-04-16 15:05:43.685313004 +0000 UTC m=+782.449265098" lastFinishedPulling="2026-04-16 15:05:48.168620789 +0000 UTC m=+786.932572894" observedRunningTime="2026-04-16 15:05:48.692756885 +0000 UTC m=+787.456708997" watchObservedRunningTime="2026-04-16 15:05:48.695226629 +0000 UTC m=+787.459178741"
Apr 16 15:05:48.708384 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:48.708341 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-79498b55f8-rdp2j" podStartSLOduration=1.170729836 podStartE2EDuration="5.708327397s" podCreationTimestamp="2026-04-16 15:05:43 +0000 UTC" firstStartedPulling="2026-04-16 15:05:43.710231252 +0000 UTC m=+782.474183340" lastFinishedPulling="2026-04-16 15:05:48.24782879 +0000 UTC m=+787.011780901" observedRunningTime="2026-04-16 15:05:48.707682506 +0000 UTC m=+787.471634618" watchObservedRunningTime="2026-04-16 15:05:48.708327397 +0000 UTC m=+787.472279513"
Apr 16 15:05:54.674058 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:05:54.674021 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-xprxk"
Apr 16 15:06:19.673577 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:19.673545 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-79498b55f8-rdp2j"
Apr 16 15:06:19.676741 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:19.676715 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-7669bdc57-22gz8"
Apr 16 15:06:20.904954 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:20.904926 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-7669bdc57-22gz8"]
Apr 16 15:06:20.905339 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:20.905123 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-7669bdc57-22gz8" podUID="5888e974-f659-4ff2-b28e-e0f86d3cbfff" containerName="manager" containerID="cri-o://4161ec662ab7c7893120888b21dde955b9a9b1554dbc2fee0eab99387830e8e9" gracePeriod=10
Apr 16 15:06:20.924423 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:20.924392 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-7669bdc57-q9lnq"]
Apr 16 15:06:20.928062 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:20.928048 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7669bdc57-q9lnq"
Apr 16 15:06:20.936388 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:20.936369 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7669bdc57-q9lnq"]
Apr 16 15:06:21.015446 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:21.015420 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjcwv\" (UniqueName: \"kubernetes.io/projected/ee07ca6c-8ecb-4821-ba05-70958800051a-kube-api-access-vjcwv\") pod \"kserve-controller-manager-7669bdc57-q9lnq\" (UID: \"ee07ca6c-8ecb-4821-ba05-70958800051a\") " pod="kserve/kserve-controller-manager-7669bdc57-q9lnq"
Apr 16 15:06:21.015586 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:21.015566 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ee07ca6c-8ecb-4821-ba05-70958800051a-cert\") pod \"kserve-controller-manager-7669bdc57-q9lnq\" (UID: \"ee07ca6c-8ecb-4821-ba05-70958800051a\") " pod="kserve/kserve-controller-manager-7669bdc57-q9lnq"
Apr 16 15:06:21.116258 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:21.116234 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ee07ca6c-8ecb-4821-ba05-70958800051a-cert\") pod \"kserve-controller-manager-7669bdc57-q9lnq\" (UID: \"ee07ca6c-8ecb-4821-ba05-70958800051a\") " pod="kserve/kserve-controller-manager-7669bdc57-q9lnq"
Apr 16 15:06:21.116377 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:21.116292 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vjcwv\" (UniqueName: \"kubernetes.io/projected/ee07ca6c-8ecb-4821-ba05-70958800051a-kube-api-access-vjcwv\") pod \"kserve-controller-manager-7669bdc57-q9lnq\" (UID: \"ee07ca6c-8ecb-4821-ba05-70958800051a\") " pod="kserve/kserve-controller-manager-7669bdc57-q9lnq"
Apr 16 15:06:21.118658 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:21.118635 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ee07ca6c-8ecb-4821-ba05-70958800051a-cert\") pod \"kserve-controller-manager-7669bdc57-q9lnq\" (UID: \"ee07ca6c-8ecb-4821-ba05-70958800051a\") " pod="kserve/kserve-controller-manager-7669bdc57-q9lnq"
Apr 16 15:06:21.123489 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:21.123464 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjcwv\" (UniqueName: \"kubernetes.io/projected/ee07ca6c-8ecb-4821-ba05-70958800051a-kube-api-access-vjcwv\") pod \"kserve-controller-manager-7669bdc57-q9lnq\" (UID: \"ee07ca6c-8ecb-4821-ba05-70958800051a\") " pod="kserve/kserve-controller-manager-7669bdc57-q9lnq"
Apr 16 15:06:21.149653 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:21.149636 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7669bdc57-22gz8"
Apr 16 15:06:21.216739 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:21.216667 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrdkl\" (UniqueName: \"kubernetes.io/projected/5888e974-f659-4ff2-b28e-e0f86d3cbfff-kube-api-access-mrdkl\") pod \"5888e974-f659-4ff2-b28e-e0f86d3cbfff\" (UID: \"5888e974-f659-4ff2-b28e-e0f86d3cbfff\") "
Apr 16 15:06:21.216866 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:21.216753 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5888e974-f659-4ff2-b28e-e0f86d3cbfff-cert\") pod \"5888e974-f659-4ff2-b28e-e0f86d3cbfff\" (UID: \"5888e974-f659-4ff2-b28e-e0f86d3cbfff\") "
Apr 16 15:06:21.218741 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:21.218716 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5888e974-f659-4ff2-b28e-e0f86d3cbfff-cert" (OuterVolumeSpecName: "cert") pod "5888e974-f659-4ff2-b28e-e0f86d3cbfff" (UID: "5888e974-f659-4ff2-b28e-e0f86d3cbfff"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 15:06:21.218819 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:21.218762 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5888e974-f659-4ff2-b28e-e0f86d3cbfff-kube-api-access-mrdkl" (OuterVolumeSpecName: "kube-api-access-mrdkl") pod "5888e974-f659-4ff2-b28e-e0f86d3cbfff" (UID: "5888e974-f659-4ff2-b28e-e0f86d3cbfff"). InnerVolumeSpecName "kube-api-access-mrdkl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 15:06:21.288209 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:21.288185 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7669bdc57-q9lnq"
Apr 16 15:06:21.317752 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:21.317731 2568 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5888e974-f659-4ff2-b28e-e0f86d3cbfff-cert\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\""
Apr 16 15:06:21.317850 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:21.317754 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mrdkl\" (UniqueName: \"kubernetes.io/projected/5888e974-f659-4ff2-b28e-e0f86d3cbfff-kube-api-access-mrdkl\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\""
Apr 16 15:06:21.630053 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:21.630026 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7669bdc57-q9lnq"]
Apr 16 15:06:21.631876 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:06:21.631855 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee07ca6c_8ecb_4821_ba05_70958800051a.slice/crio-fcf0ee691f4a909d50401caf3fab410212ae07ce6c7c2169bb2a5885f63cfa59 WatchSource:0}: Error finding container fcf0ee691f4a909d50401caf3fab410212ae07ce6c7c2169bb2a5885f63cfa59: Status 404 returned error can't find the container with id fcf0ee691f4a909d50401caf3fab410212ae07ce6c7c2169bb2a5885f63cfa59
Apr 16 15:06:21.809311 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:21.809284 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7669bdc57-q9lnq" event={"ID":"ee07ca6c-8ecb-4821-ba05-70958800051a","Type":"ContainerStarted","Data":"fcf0ee691f4a909d50401caf3fab410212ae07ce6c7c2169bb2a5885f63cfa59"}
Apr 16 15:06:21.810348 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:21.810326 2568 generic.go:358] "Generic (PLEG): container finished" podID="5888e974-f659-4ff2-b28e-e0f86d3cbfff" containerID="4161ec662ab7c7893120888b21dde955b9a9b1554dbc2fee0eab99387830e8e9" exitCode=0
Apr 16 15:06:21.810443 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:21.810371 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7669bdc57-22gz8" event={"ID":"5888e974-f659-4ff2-b28e-e0f86d3cbfff","Type":"ContainerDied","Data":"4161ec662ab7c7893120888b21dde955b9a9b1554dbc2fee0eab99387830e8e9"}
Apr 16 15:06:21.810443 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:21.810383 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7669bdc57-22gz8"
Apr 16 15:06:21.810443 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:21.810391 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7669bdc57-22gz8" event={"ID":"5888e974-f659-4ff2-b28e-e0f86d3cbfff","Type":"ContainerDied","Data":"101575df8dc53b130ee4326097b816c4f9fb987062c7136ca67bf27c02614488"}
Apr 16 15:06:21.810443 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:21.810406 2568 scope.go:117] "RemoveContainer" containerID="4161ec662ab7c7893120888b21dde955b9a9b1554dbc2fee0eab99387830e8e9"
Apr 16 15:06:21.819438 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:21.819418 2568 scope.go:117] "RemoveContainer" containerID="4161ec662ab7c7893120888b21dde955b9a9b1554dbc2fee0eab99387830e8e9"
Apr 16 15:06:21.819684 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:06:21.819666 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4161ec662ab7c7893120888b21dde955b9a9b1554dbc2fee0eab99387830e8e9\": container with ID starting with 4161ec662ab7c7893120888b21dde955b9a9b1554dbc2fee0eab99387830e8e9 not found: ID does not exist" containerID="4161ec662ab7c7893120888b21dde955b9a9b1554dbc2fee0eab99387830e8e9"
Apr 16 15:06:21.819739 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:21.819693 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4161ec662ab7c7893120888b21dde955b9a9b1554dbc2fee0eab99387830e8e9"} err="failed to get container status \"4161ec662ab7c7893120888b21dde955b9a9b1554dbc2fee0eab99387830e8e9\": rpc error: code = NotFound desc = could not find container \"4161ec662ab7c7893120888b21dde955b9a9b1554dbc2fee0eab99387830e8e9\": container with ID starting with 4161ec662ab7c7893120888b21dde955b9a9b1554dbc2fee0eab99387830e8e9 not found: ID does not exist"
Apr 16 15:06:21.830401 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:21.830379 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-7669bdc57-22gz8"]
Apr 16 15:06:21.834275 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:21.834248 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-7669bdc57-22gz8"]
Apr 16 15:06:21.863641 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:21.863618 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5888e974-f659-4ff2-b28e-e0f86d3cbfff" path="/var/lib/kubelet/pods/5888e974-f659-4ff2-b28e-e0f86d3cbfff/volumes"
Apr 16 15:06:22.818925 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:22.818865 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7669bdc57-q9lnq" event={"ID":"ee07ca6c-8ecb-4821-ba05-70958800051a","Type":"ContainerStarted","Data":"fa1838cc3e7946edd0cba4c89289e185bfc85b019292a90af2ac7cd01984f5f6"}
Apr 16 15:06:22.819402 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:22.819016 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-7669bdc57-q9lnq"
Apr 16 15:06:22.835562 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:22.835511 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-7669bdc57-q9lnq" podStartSLOduration=2.494009477 podStartE2EDuration="2.835499557s" podCreationTimestamp="2026-04-16 15:06:20 +0000 UTC" firstStartedPulling="2026-04-16 15:06:21.633194044 +0000 UTC m=+820.397146133" lastFinishedPulling="2026-04-16 15:06:21.974684108 +0000 UTC m=+820.738636213" observedRunningTime="2026-04-16 15:06:22.834339085 +0000 UTC m=+821.598291198" watchObservedRunningTime="2026-04-16 15:06:22.835499557 +0000 UTC m=+821.599451667"
Apr 16 15:06:53.829030 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:53.828998 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-7669bdc57-q9lnq"
Apr 16 15:06:54.707034 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:54.707001 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-vm4rk"]
Apr 16 15:06:54.707411 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:54.707398 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5888e974-f659-4ff2-b28e-e0f86d3cbfff" containerName="manager"
Apr 16 15:06:54.707471 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:54.707411 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="5888e974-f659-4ff2-b28e-e0f86d3cbfff" containerName="manager"
Apr 16 15:06:54.707510 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:54.707480 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="5888e974-f659-4ff2-b28e-e0f86d3cbfff" containerName="manager"
Apr 16 15:06:54.709968 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:54.709948 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-vm4rk"
Apr 16 15:06:54.715908 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:54.712780 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\""
Apr 16 15:06:54.715908 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:54.712931 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-7kdbv\""
Apr 16 15:06:54.723600 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:54.723578 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-vm4rk"]
Apr 16 15:06:54.779184 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:54.779162 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mf58\" (UniqueName: \"kubernetes.io/projected/0bc23e19-9195-4eb3-bed2-e4d34efec7ed-kube-api-access-5mf58\") pod \"model-serving-api-86f7b4b499-vm4rk\" (UID: \"0bc23e19-9195-4eb3-bed2-e4d34efec7ed\") " pod="kserve/model-serving-api-86f7b4b499-vm4rk"
Apr 16 15:06:54.779306 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:54.779223 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0bc23e19-9195-4eb3-bed2-e4d34efec7ed-tls-certs\") pod \"model-serving-api-86f7b4b499-vm4rk\" (UID: \"0bc23e19-9195-4eb3-bed2-e4d34efec7ed\") " pod="kserve/model-serving-api-86f7b4b499-vm4rk"
Apr 16 15:06:54.879616 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:54.879594 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5mf58\" (UniqueName: \"kubernetes.io/projected/0bc23e19-9195-4eb3-bed2-e4d34efec7ed-kube-api-access-5mf58\") pod \"model-serving-api-86f7b4b499-vm4rk\" (UID: \"0bc23e19-9195-4eb3-bed2-e4d34efec7ed\") " pod="kserve/model-serving-api-86f7b4b499-vm4rk"
Apr 16 15:06:54.879957 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:54.879640 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0bc23e19-9195-4eb3-bed2-e4d34efec7ed-tls-certs\") pod \"model-serving-api-86f7b4b499-vm4rk\" (UID: \"0bc23e19-9195-4eb3-bed2-e4d34efec7ed\") " pod="kserve/model-serving-api-86f7b4b499-vm4rk"
Apr 16 15:06:54.879957 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:06:54.879742 2568 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found
Apr 16 15:06:54.879957 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:06:54.879812 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0bc23e19-9195-4eb3-bed2-e4d34efec7ed-tls-certs podName:0bc23e19-9195-4eb3-bed2-e4d34efec7ed nodeName:}" failed. No retries permitted until 2026-04-16 15:06:55.379791972 +0000 UTC m=+854.143744060 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/0bc23e19-9195-4eb3-bed2-e4d34efec7ed-tls-certs") pod "model-serving-api-86f7b4b499-vm4rk" (UID: "0bc23e19-9195-4eb3-bed2-e4d34efec7ed") : secret "model-serving-api-tls" not found
Apr 16 15:06:54.887880 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:54.887852 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mf58\" (UniqueName: \"kubernetes.io/projected/0bc23e19-9195-4eb3-bed2-e4d34efec7ed-kube-api-access-5mf58\") pod \"model-serving-api-86f7b4b499-vm4rk\" (UID: \"0bc23e19-9195-4eb3-bed2-e4d34efec7ed\") " pod="kserve/model-serving-api-86f7b4b499-vm4rk"
Apr 16 15:06:55.383460 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:55.383421 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0bc23e19-9195-4eb3-bed2-e4d34efec7ed-tls-certs\") pod \"model-serving-api-86f7b4b499-vm4rk\" (UID: \"0bc23e19-9195-4eb3-bed2-e4d34efec7ed\") " pod="kserve/model-serving-api-86f7b4b499-vm4rk"
Apr 16 15:06:55.385999 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:55.385973 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0bc23e19-9195-4eb3-bed2-e4d34efec7ed-tls-certs\") pod \"model-serving-api-86f7b4b499-vm4rk\" (UID: \"0bc23e19-9195-4eb3-bed2-e4d34efec7ed\") " pod="kserve/model-serving-api-86f7b4b499-vm4rk"
Apr 16 15:06:55.629677 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:55.629646 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-vm4rk"
Apr 16 15:06:55.757807 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:55.757781 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-vm4rk"]
Apr 16 15:06:55.759705 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:06:55.759675 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bc23e19_9195_4eb3_bed2_e4d34efec7ed.slice/crio-bd2f4c963bb3cad2b1bf38cbe21be59461d713469436b8db22f29d182a309d33 WatchSource:0}: Error finding container bd2f4c963bb3cad2b1bf38cbe21be59461d713469436b8db22f29d182a309d33: Status 404 returned error can't find the container with id bd2f4c963bb3cad2b1bf38cbe21be59461d713469436b8db22f29d182a309d33
Apr 16 15:06:55.955477 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:55.955399 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-vm4rk" event={"ID":"0bc23e19-9195-4eb3-bed2-e4d34efec7ed","Type":"ContainerStarted","Data":"bd2f4c963bb3cad2b1bf38cbe21be59461d713469436b8db22f29d182a309d33"}
Apr 16 15:06:57.967368 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:57.967331 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-vm4rk" event={"ID":"0bc23e19-9195-4eb3-bed2-e4d34efec7ed","Type":"ContainerStarted","Data":"cf9261048213853d8c146c89f6d9291e6af2a5221f17a54a241d0cbe1cfa6106"}
Apr 16 15:06:57.967764 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:57.967490 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-vm4rk"
Apr 16 15:06:57.983135 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:06:57.983091 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-vm4rk" podStartSLOduration=2.562786659 podStartE2EDuration="3.983079989s" podCreationTimestamp="2026-04-16 15:06:54 +0000 UTC" firstStartedPulling="2026-04-16 15:06:55.761537806 +0000 UTC m=+854.525489899" lastFinishedPulling="2026-04-16 15:06:57.181831125 +0000 UTC m=+855.945783229" observedRunningTime="2026-04-16 15:06:57.981833467 +0000 UTC m=+856.745785582" watchObservedRunningTime="2026-04-16 15:06:57.983079989 +0000 UTC m=+856.747032114"
Apr 16 15:07:08.975318 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:07:08.975288 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-vm4rk"
Apr 16 15:07:41.811409 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:07:41.811380 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bff4l_4e495dc8-7e20-4024-b33a-c9b6c6c1291f/ovn-acl-logging/0.log"
Apr 16 15:07:41.816407 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:07:41.816388 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bff4l_4e495dc8-7e20-4024-b33a-c9b6c6c1291f/ovn-acl-logging/0.log"
Apr 16 15:07:52.149204 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:07:52.149171 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8"]
Apr 16 15:07:52.153147 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:07:52.153127 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8"
Apr 16 15:07:52.156696 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:07:52.156660 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-epp-sa-dockercfg-hzg9m\""
Apr 16 15:07:52.156846 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:07:52.156660 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\""
Apr 16 15:07:52.156846 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:07:52.156660 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 16 15:07:52.156846 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:07:52.156660 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 15:07:52.156846 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:07:52.156741 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-dmzth\""
Apr 16 15:07:52.161881 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:07:52.161859 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8"]
Apr 16 15:07:52.219260 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:07:52.219235 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ef354028-f175-4c1b-881d-5a5b9a5546ea-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8\" (UID: \"ef354028-f175-4c1b-881d-5a5b9a5546ea\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8"
Apr 16 15:07:52.219394 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:07:52.219279 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ef354028-f175-4c1b-881d-5a5b9a5546ea-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8\" (UID: \"ef354028-f175-4c1b-881d-5a5b9a5546ea\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8"
Apr 16 15:07:52.219394 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:07:52.219314 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ef354028-f175-4c1b-881d-5a5b9a5546ea-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8\" (UID: \"ef354028-f175-4c1b-881d-5a5b9a5546ea\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8"
Apr 16 15:07:52.219394 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:07:52.219364 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ef354028-f175-4c1b-881d-5a5b9a5546ea-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8\" (UID: \"ef354028-f175-4c1b-881d-5a5b9a5546ea\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8"
Apr 16 15:07:52.219525 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:07:52.219403 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw8xf\" (UniqueName: \"kubernetes.io/projected/ef354028-f175-4c1b-881d-5a5b9a5546ea-kube-api-access-nw8xf\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8\" (UID: \"ef354028-f175-4c1b-881d-5a5b9a5546ea\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8"
Apr 16 15:07:52.219525 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:07:52.219442 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ef354028-f175-4c1b-881d-5a5b9a5546ea-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8\" (UID: \"ef354028-f175-4c1b-881d-5a5b9a5546ea\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8"
Apr 16 15:07:52.320616 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:07:52.320585 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nw8xf\" (UniqueName: \"kubernetes.io/projected/ef354028-f175-4c1b-881d-5a5b9a5546ea-kube-api-access-nw8xf\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8\" (UID: \"ef354028-f175-4c1b-881d-5a5b9a5546ea\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8"
Apr 16 15:07:52.320757 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:07:52.320647 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ef354028-f175-4c1b-881d-5a5b9a5546ea-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8\" (UID: \"ef354028-f175-4c1b-881d-5a5b9a5546ea\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8"
Apr 16 15:07:52.320757 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:07:52.320716 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ef354028-f175-4c1b-881d-5a5b9a5546ea-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8\" (UID: \"ef354028-f175-4c1b-881d-5a5b9a5546ea\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8"
Apr 16 15:07:52.320757 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:07:52.320757 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ef354028-f175-4c1b-881d-5a5b9a5546ea-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8\" (UID: \"ef354028-f175-4c1b-881d-5a5b9a5546ea\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8"
Apr 16 15:07:52.320963 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:07:52.320782 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ef354028-f175-4c1b-881d-5a5b9a5546ea-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8\" (UID: \"ef354028-f175-4c1b-881d-5a5b9a5546ea\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8"
Apr 16 15:07:52.320963 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:07:52.320821 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ef354028-f175-4c1b-881d-5a5b9a5546ea-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8\" (UID: \"ef354028-f175-4c1b-881d-5a5b9a5546ea\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8"
Apr 16 15:07:52.321343 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:07:52.321271 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ef354028-f175-4c1b-881d-5a5b9a5546ea-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8\" (UID: \"ef354028-f175-4c1b-881d-5a5b9a5546ea\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8"
Apr 16 15:07:52.321501 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:07:52.321305 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ef354028-f175-4c1b-881d-5a5b9a5546ea-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8\" (UID: \"ef354028-f175-4c1b-881d-5a5b9a5546ea\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8"
Apr 16 15:07:52.321618 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:07:52.321443 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ef354028-f175-4c1b-881d-5a5b9a5546ea-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8\" (UID: \"ef354028-f175-4c1b-881d-5a5b9a5546ea\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8"
Apr 16 15:07:52.321618 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:07:52.321522 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ef354028-f175-4c1b-881d-5a5b9a5546ea-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8\" (UID: \"ef354028-f175-4c1b-881d-5a5b9a5546ea\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8"
Apr 16 15:07:52.323708 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:07:52.323688 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ef354028-f175-4c1b-881d-5a5b9a5546ea-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8\" (UID: \"ef354028-f175-4c1b-881d-5a5b9a5546ea\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8"
Apr 16 15:07:52.328247 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:07:52.328228 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw8xf\" (UniqueName: \"kubernetes.io/projected/ef354028-f175-4c1b-881d-5a5b9a5546ea-kube-api-access-nw8xf\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8\" (UID: \"ef354028-f175-4c1b-881d-5a5b9a5546ea\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8"
Apr 16 15:07:52.464086 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:07:52.464024 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8"
Apr 16 15:07:52.799232 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:07:52.799201 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8"]
Apr 16 15:07:52.801394 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:07:52.801370 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef354028_f175_4c1b_881d_5a5b9a5546ea.slice/crio-b417983f0afc9356f75b8f22899a5fa14edbe36083cf384ad8fd2381dd96da36 WatchSource:0}: Error finding container b417983f0afc9356f75b8f22899a5fa14edbe36083cf384ad8fd2381dd96da36: Status 404 returned error can't find the container with id b417983f0afc9356f75b8f22899a5fa14edbe36083cf384ad8fd2381dd96da36
Apr 16 15:07:53.201300 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:07:53.201264 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8" event={"ID":"ef354028-f175-4c1b-881d-5a5b9a5546ea","Type":"ContainerStarted","Data":"b417983f0afc9356f75b8f22899a5fa14edbe36083cf384ad8fd2381dd96da36"}
Apr 16 15:07:56.222911 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:07:56.222859 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8" event={"ID":"ef354028-f175-4c1b-881d-5a5b9a5546ea","Type":"ContainerStarted","Data":"abf588c0f77ecc7a3c221fafa3c2c80d32d3b53fdf1647e60c3e47b0bad493a4"}
Apr 16 15:07:57.230864 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:07:57.230780 2568 generic.go:358] "Generic (PLEG): container finished" podID="ef354028-f175-4c1b-881d-5a5b9a5546ea" containerID="abf588c0f77ecc7a3c221fafa3c2c80d32d3b53fdf1647e60c3e47b0bad493a4" exitCode=0
Apr 16 15:07:57.231201 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:07:57.230868 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8" event={"ID":"ef354028-f175-4c1b-881d-5a5b9a5546ea","Type":"ContainerDied","Data":"abf588c0f77ecc7a3c221fafa3c2c80d32d3b53fdf1647e60c3e47b0bad493a4"}
Apr 16 15:07:59.242486 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:07:59.242446 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8" event={"ID":"ef354028-f175-4c1b-881d-5a5b9a5546ea","Type":"ContainerStarted","Data":"73ea9438a004abee6b0d8d11b14a98252cb95db5e99ca4b1d4d3b3ebd82bee97"}
Apr 16 15:08:28.377816 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:08:28.377779 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8" event={"ID":"ef354028-f175-4c1b-881d-5a5b9a5546ea","Type":"ContainerStarted","Data":"4c26b2f9f827b19afc27f6ed04cf3019a1a86441019315fc3e10203ac1e36792"}
Apr 16 15:08:28.378295 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:08:28.377933 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8"
Apr 16 15:08:28.380029 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:08:28.380010 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8"
Apr 16 15:08:28.400361 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:08:28.400102 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8" podStartSLOduration=1.058432201 podStartE2EDuration="36.400086406s" podCreationTimestamp="2026-04-16 15:07:52 +0000 UTC" firstStartedPulling="2026-04-16 15:07:52.803631904 +0000 UTC m=+911.567583998" lastFinishedPulling="2026-04-16 15:08:28.145286111 +0000 UTC m=+946.909238203" observedRunningTime="2026-04-16 15:08:28.396703501 +0000 UTC m=+947.160655612" watchObservedRunningTime="2026-04-16 15:08:28.400086406 +0000 UTC m=+947.164038518"
Apr 16 15:08:32.464623 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:08:32.464592 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8"
Apr 16 15:08:32.464623 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:08:32.464622 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8"
Apr 16 15:08:42.465865 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:08:42.465836 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8"
Apr 16 15:08:42.467004 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:08:42.466978 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8"
Apr 16 15:08:54.883074 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:08:54.883045 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z"] Apr 16 15:08:55.424032 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:08:55.424001 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z"] Apr 16 15:08:55.424208 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:08:55.424134 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z" Apr 16 15:08:55.427592 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:08:55.427566 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-55f7ae4a-epp-sa-dockercfg-j5pc5\"" Apr 16 15:08:55.427775 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:08:55.427632 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\"" Apr 16 15:08:55.472265 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:08:55.472234 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0b22b375-07c4-47fb-b722-68301b084f58-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z\" (UID: \"0b22b375-07c4-47fb-b722-68301b084f58\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z" Apr 16 15:08:55.472389 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:08:55.472285 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0b22b375-07c4-47fb-b722-68301b084f58-tls-certs\") pod 
\"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z\" (UID: \"0b22b375-07c4-47fb-b722-68301b084f58\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z" Apr 16 15:08:55.472389 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:08:55.472359 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0b22b375-07c4-47fb-b722-68301b084f58-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z\" (UID: \"0b22b375-07c4-47fb-b722-68301b084f58\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z" Apr 16 15:08:55.472477 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:08:55.472419 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b22b375-07c4-47fb-b722-68301b084f58-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z\" (UID: \"0b22b375-07c4-47fb-b722-68301b084f58\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z" Apr 16 15:08:55.472477 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:08:55.472449 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0b22b375-07c4-47fb-b722-68301b084f58-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z\" (UID: \"0b22b375-07c4-47fb-b722-68301b084f58\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z" Apr 16 15:08:55.472477 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:08:55.472470 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8jj2\" (UniqueName: 
\"kubernetes.io/projected/0b22b375-07c4-47fb-b722-68301b084f58-kube-api-access-f8jj2\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z\" (UID: \"0b22b375-07c4-47fb-b722-68301b084f58\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z" Apr 16 15:08:55.573220 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:08:55.573192 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b22b375-07c4-47fb-b722-68301b084f58-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z\" (UID: \"0b22b375-07c4-47fb-b722-68301b084f58\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z" Apr 16 15:08:55.573342 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:08:55.573231 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0b22b375-07c4-47fb-b722-68301b084f58-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z\" (UID: \"0b22b375-07c4-47fb-b722-68301b084f58\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z" Apr 16 15:08:55.573429 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:08:55.573414 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f8jj2\" (UniqueName: \"kubernetes.io/projected/0b22b375-07c4-47fb-b722-68301b084f58-kube-api-access-f8jj2\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z\" (UID: \"0b22b375-07c4-47fb-b722-68301b084f58\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z" Apr 16 15:08:55.573500 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:08:55.573484 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/0b22b375-07c4-47fb-b722-68301b084f58-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z\" (UID: \"0b22b375-07c4-47fb-b722-68301b084f58\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z" Apr 16 15:08:55.573574 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:08:55.573557 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0b22b375-07c4-47fb-b722-68301b084f58-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z\" (UID: \"0b22b375-07c4-47fb-b722-68301b084f58\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z" Apr 16 15:08:55.573630 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:08:55.573580 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0b22b375-07c4-47fb-b722-68301b084f58-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z\" (UID: \"0b22b375-07c4-47fb-b722-68301b084f58\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z" Apr 16 15:08:55.573630 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:08:55.573614 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0b22b375-07c4-47fb-b722-68301b084f58-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z\" (UID: \"0b22b375-07c4-47fb-b722-68301b084f58\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z" Apr 16 15:08:55.573731 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:08:55.573556 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/0b22b375-07c4-47fb-b722-68301b084f58-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z\" (UID: \"0b22b375-07c4-47fb-b722-68301b084f58\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z" Apr 16 15:08:55.573905 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:08:55.573873 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0b22b375-07c4-47fb-b722-68301b084f58-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z\" (UID: \"0b22b375-07c4-47fb-b722-68301b084f58\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z" Apr 16 15:08:55.574001 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:08:55.573981 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0b22b375-07c4-47fb-b722-68301b084f58-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z\" (UID: \"0b22b375-07c4-47fb-b722-68301b084f58\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z" Apr 16 15:08:55.575822 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:08:55.575797 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0b22b375-07c4-47fb-b722-68301b084f58-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z\" (UID: \"0b22b375-07c4-47fb-b722-68301b084f58\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z" Apr 16 15:08:55.582229 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:08:55.582212 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8jj2\" (UniqueName: 
\"kubernetes.io/projected/0b22b375-07c4-47fb-b722-68301b084f58-kube-api-access-f8jj2\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z\" (UID: \"0b22b375-07c4-47fb-b722-68301b084f58\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z" Apr 16 15:08:55.742937 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:08:55.742858 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z" Apr 16 15:08:55.876729 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:08:55.876700 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z"] Apr 16 15:08:55.879357 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:08:55.879332 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b22b375_07c4_47fb_b722_68301b084f58.slice/crio-322b9904a1ca9da73f8df3b2a8b7ed6ef71af2e6568a14551710a0ee60000328 WatchSource:0}: Error finding container 322b9904a1ca9da73f8df3b2a8b7ed6ef71af2e6568a14551710a0ee60000328: Status 404 returned error can't find the container with id 322b9904a1ca9da73f8df3b2a8b7ed6ef71af2e6568a14551710a0ee60000328 Apr 16 15:08:55.881243 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:08:55.881226 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:08:56.488293 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:08:56.488249 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z" event={"ID":"0b22b375-07c4-47fb-b722-68301b084f58","Type":"ContainerStarted","Data":"89aae50615f09e8a10a0c692c61103be139374d13a53d2fdd225f01663b564f8"} Apr 16 15:08:56.488293 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:08:56.488296 2568 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z" event={"ID":"0b22b375-07c4-47fb-b722-68301b084f58","Type":"ContainerStarted","Data":"322b9904a1ca9da73f8df3b2a8b7ed6ef71af2e6568a14551710a0ee60000328"} Apr 16 15:08:57.493573 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:08:57.493536 2568 generic.go:358] "Generic (PLEG): container finished" podID="0b22b375-07c4-47fb-b722-68301b084f58" containerID="89aae50615f09e8a10a0c692c61103be139374d13a53d2fdd225f01663b564f8" exitCode=0 Apr 16 15:08:57.493976 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:08:57.493623 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z" event={"ID":"0b22b375-07c4-47fb-b722-68301b084f58","Type":"ContainerDied","Data":"89aae50615f09e8a10a0c692c61103be139374d13a53d2fdd225f01663b564f8"} Apr 16 15:08:58.500503 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:08:58.500471 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z" event={"ID":"0b22b375-07c4-47fb-b722-68301b084f58","Type":"ContainerStarted","Data":"697a3595a7f92f28c1b985a29636e8fad49ab15f6ec59aa755f17e332b6496b8"} Apr 16 15:08:58.500503 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:08:58.500505 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z" event={"ID":"0b22b375-07c4-47fb-b722-68301b084f58","Type":"ContainerStarted","Data":"35ed36f5be51ec7caa5dfd8e8da1a295b729f8057428b767361c097bd2d73c10"} Apr 16 15:08:58.500909 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:08:58.500635 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z" Apr 16 15:08:58.522786 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:08:58.522731 2568 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z" podStartSLOduration=4.522714139 podStartE2EDuration="4.522714139s" podCreationTimestamp="2026-04-16 15:08:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:08:58.520813272 +0000 UTC m=+977.284765386" watchObservedRunningTime="2026-04-16 15:08:58.522714139 +0000 UTC m=+977.286666258" Apr 16 15:09:05.744143 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:05.744063 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z" Apr 16 15:09:05.744143 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:05.744099 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z" Apr 16 15:09:05.746738 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:05.746718 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z" Apr 16 15:09:06.539061 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:06.539035 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z" Apr 16 15:09:07.241187 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:07.241154 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8"] Apr 16 15:09:07.241610 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:07.241553 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8" 
podUID="ef354028-f175-4c1b-881d-5a5b9a5546ea" containerName="main" containerID="cri-o://73ea9438a004abee6b0d8d11b14a98252cb95db5e99ca4b1d4d3b3ebd82bee97" gracePeriod=30 Apr 16 15:09:07.241796 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:07.241722 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8" podUID="ef354028-f175-4c1b-881d-5a5b9a5546ea" containerName="tokenizer" containerID="cri-o://4c26b2f9f827b19afc27f6ed04cf3019a1a86441019315fc3e10203ac1e36792" gracePeriod=30 Apr 16 15:09:07.544792 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:07.544707 2568 generic.go:358] "Generic (PLEG): container finished" podID="ef354028-f175-4c1b-881d-5a5b9a5546ea" containerID="73ea9438a004abee6b0d8d11b14a98252cb95db5e99ca4b1d4d3b3ebd82bee97" exitCode=0 Apr 16 15:09:07.544792 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:07.544772 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8" event={"ID":"ef354028-f175-4c1b-881d-5a5b9a5546ea","Type":"ContainerDied","Data":"73ea9438a004abee6b0d8d11b14a98252cb95db5e99ca4b1d4d3b3ebd82bee97"} Apr 16 15:09:08.401979 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:08.401955 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8" Apr 16 15:09:08.485806 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:08.485724 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ef354028-f175-4c1b-881d-5a5b9a5546ea-tokenizer-tmp\") pod \"ef354028-f175-4c1b-881d-5a5b9a5546ea\" (UID: \"ef354028-f175-4c1b-881d-5a5b9a5546ea\") " Apr 16 15:09:08.485806 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:08.485775 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw8xf\" (UniqueName: \"kubernetes.io/projected/ef354028-f175-4c1b-881d-5a5b9a5546ea-kube-api-access-nw8xf\") pod \"ef354028-f175-4c1b-881d-5a5b9a5546ea\" (UID: \"ef354028-f175-4c1b-881d-5a5b9a5546ea\") " Apr 16 15:09:08.486001 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:08.485814 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ef354028-f175-4c1b-881d-5a5b9a5546ea-tls-certs\") pod \"ef354028-f175-4c1b-881d-5a5b9a5546ea\" (UID: \"ef354028-f175-4c1b-881d-5a5b9a5546ea\") " Apr 16 15:09:08.486001 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:08.485931 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ef354028-f175-4c1b-881d-5a5b9a5546ea-kserve-provision-location\") pod \"ef354028-f175-4c1b-881d-5a5b9a5546ea\" (UID: \"ef354028-f175-4c1b-881d-5a5b9a5546ea\") " Apr 16 15:09:08.486001 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:08.485962 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ef354028-f175-4c1b-881d-5a5b9a5546ea-tokenizer-uds\") pod \"ef354028-f175-4c1b-881d-5a5b9a5546ea\" (UID: \"ef354028-f175-4c1b-881d-5a5b9a5546ea\") " Apr 16 
15:09:08.486141 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:08.486055 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ef354028-f175-4c1b-881d-5a5b9a5546ea-tokenizer-cache\") pod \"ef354028-f175-4c1b-881d-5a5b9a5546ea\" (UID: \"ef354028-f175-4c1b-881d-5a5b9a5546ea\") " Apr 16 15:09:08.486141 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:08.486074 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef354028-f175-4c1b-881d-5a5b9a5546ea-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "ef354028-f175-4c1b-881d-5a5b9a5546ea" (UID: "ef354028-f175-4c1b-881d-5a5b9a5546ea"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:09:08.486296 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:08.486272 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef354028-f175-4c1b-881d-5a5b9a5546ea-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "ef354028-f175-4c1b-881d-5a5b9a5546ea" (UID: "ef354028-f175-4c1b-881d-5a5b9a5546ea"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:09:08.486374 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:08.486353 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef354028-f175-4c1b-881d-5a5b9a5546ea-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "ef354028-f175-4c1b-881d-5a5b9a5546ea" (UID: "ef354028-f175-4c1b-881d-5a5b9a5546ea"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:09:08.486491 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:08.486469 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ef354028-f175-4c1b-881d-5a5b9a5546ea-tokenizer-cache\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:09:08.486555 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:08.486496 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ef354028-f175-4c1b-881d-5a5b9a5546ea-tokenizer-tmp\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:09:08.486555 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:08.486510 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ef354028-f175-4c1b-881d-5a5b9a5546ea-tokenizer-uds\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:09:08.486702 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:08.486683 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef354028-f175-4c1b-881d-5a5b9a5546ea-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ef354028-f175-4c1b-881d-5a5b9a5546ea" (UID: "ef354028-f175-4c1b-881d-5a5b9a5546ea"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:09:08.488001 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:08.487982 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef354028-f175-4c1b-881d-5a5b9a5546ea-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "ef354028-f175-4c1b-881d-5a5b9a5546ea" (UID: "ef354028-f175-4c1b-881d-5a5b9a5546ea"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:09:08.488234 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:08.488216 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef354028-f175-4c1b-881d-5a5b9a5546ea-kube-api-access-nw8xf" (OuterVolumeSpecName: "kube-api-access-nw8xf") pod "ef354028-f175-4c1b-881d-5a5b9a5546ea" (UID: "ef354028-f175-4c1b-881d-5a5b9a5546ea"). InnerVolumeSpecName "kube-api-access-nw8xf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:09:08.550227 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:08.550200 2568 generic.go:358] "Generic (PLEG): container finished" podID="ef354028-f175-4c1b-881d-5a5b9a5546ea" containerID="4c26b2f9f827b19afc27f6ed04cf3019a1a86441019315fc3e10203ac1e36792" exitCode=0 Apr 16 15:09:08.550345 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:08.550272 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8" Apr 16 15:09:08.550406 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:08.550270 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8" event={"ID":"ef354028-f175-4c1b-881d-5a5b9a5546ea","Type":"ContainerDied","Data":"4c26b2f9f827b19afc27f6ed04cf3019a1a86441019315fc3e10203ac1e36792"} Apr 16 15:09:08.550406 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:08.550376 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8" event={"ID":"ef354028-f175-4c1b-881d-5a5b9a5546ea","Type":"ContainerDied","Data":"b417983f0afc9356f75b8f22899a5fa14edbe36083cf384ad8fd2381dd96da36"} Apr 16 15:09:08.550406 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:08.550394 2568 scope.go:117] "RemoveContainer" 
containerID="4c26b2f9f827b19afc27f6ed04cf3019a1a86441019315fc3e10203ac1e36792" Apr 16 15:09:08.559071 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:08.559033 2568 scope.go:117] "RemoveContainer" containerID="73ea9438a004abee6b0d8d11b14a98252cb95db5e99ca4b1d4d3b3ebd82bee97" Apr 16 15:09:08.566748 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:08.566730 2568 scope.go:117] "RemoveContainer" containerID="abf588c0f77ecc7a3c221fafa3c2c80d32d3b53fdf1647e60c3e47b0bad493a4" Apr 16 15:09:08.572275 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:08.572252 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8"] Apr 16 15:09:08.575382 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:08.575368 2568 scope.go:117] "RemoveContainer" containerID="4c26b2f9f827b19afc27f6ed04cf3019a1a86441019315fc3e10203ac1e36792" Apr 16 15:09:08.575615 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:09:08.575597 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c26b2f9f827b19afc27f6ed04cf3019a1a86441019315fc3e10203ac1e36792\": container with ID starting with 4c26b2f9f827b19afc27f6ed04cf3019a1a86441019315fc3e10203ac1e36792 not found: ID does not exist" containerID="4c26b2f9f827b19afc27f6ed04cf3019a1a86441019315fc3e10203ac1e36792" Apr 16 15:09:08.575688 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:08.575626 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c26b2f9f827b19afc27f6ed04cf3019a1a86441019315fc3e10203ac1e36792"} err="failed to get container status \"4c26b2f9f827b19afc27f6ed04cf3019a1a86441019315fc3e10203ac1e36792\": rpc error: code = NotFound desc = could not find container \"4c26b2f9f827b19afc27f6ed04cf3019a1a86441019315fc3e10203ac1e36792\": container with ID starting with 4c26b2f9f827b19afc27f6ed04cf3019a1a86441019315fc3e10203ac1e36792 not found: ID does not exist" Apr 16 
15:09:08.575688 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:08.575651 2568 scope.go:117] "RemoveContainer" containerID="73ea9438a004abee6b0d8d11b14a98252cb95db5e99ca4b1d4d3b3ebd82bee97" Apr 16 15:09:08.575932 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:09:08.575904 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73ea9438a004abee6b0d8d11b14a98252cb95db5e99ca4b1d4d3b3ebd82bee97\": container with ID starting with 73ea9438a004abee6b0d8d11b14a98252cb95db5e99ca4b1d4d3b3ebd82bee97 not found: ID does not exist" containerID="73ea9438a004abee6b0d8d11b14a98252cb95db5e99ca4b1d4d3b3ebd82bee97" Apr 16 15:09:08.576006 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:08.575935 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73ea9438a004abee6b0d8d11b14a98252cb95db5e99ca4b1d4d3b3ebd82bee97"} err="failed to get container status \"73ea9438a004abee6b0d8d11b14a98252cb95db5e99ca4b1d4d3b3ebd82bee97\": rpc error: code = NotFound desc = could not find container \"73ea9438a004abee6b0d8d11b14a98252cb95db5e99ca4b1d4d3b3ebd82bee97\": container with ID starting with 73ea9438a004abee6b0d8d11b14a98252cb95db5e99ca4b1d4d3b3ebd82bee97 not found: ID does not exist" Apr 16 15:09:08.576006 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:08.575954 2568 scope.go:117] "RemoveContainer" containerID="abf588c0f77ecc7a3c221fafa3c2c80d32d3b53fdf1647e60c3e47b0bad493a4" Apr 16 15:09:08.576081 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:08.576000 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8"] Apr 16 15:09:08.576222 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:09:08.576206 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abf588c0f77ecc7a3c221fafa3c2c80d32d3b53fdf1647e60c3e47b0bad493a4\": container with 
ID starting with abf588c0f77ecc7a3c221fafa3c2c80d32d3b53fdf1647e60c3e47b0bad493a4 not found: ID does not exist" containerID="abf588c0f77ecc7a3c221fafa3c2c80d32d3b53fdf1647e60c3e47b0bad493a4" Apr 16 15:09:08.576265 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:08.576228 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abf588c0f77ecc7a3c221fafa3c2c80d32d3b53fdf1647e60c3e47b0bad493a4"} err="failed to get container status \"abf588c0f77ecc7a3c221fafa3c2c80d32d3b53fdf1647e60c3e47b0bad493a4\": rpc error: code = NotFound desc = could not find container \"abf588c0f77ecc7a3c221fafa3c2c80d32d3b53fdf1647e60c3e47b0bad493a4\": container with ID starting with abf588c0f77ecc7a3c221fafa3c2c80d32d3b53fdf1647e60c3e47b0bad493a4 not found: ID does not exist" Apr 16 15:09:08.587681 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:08.587662 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nw8xf\" (UniqueName: \"kubernetes.io/projected/ef354028-f175-4c1b-881d-5a5b9a5546ea-kube-api-access-nw8xf\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:09:08.587744 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:08.587685 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ef354028-f175-4c1b-881d-5a5b9a5546ea-tls-certs\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:09:08.587744 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:08.587697 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ef354028-f175-4c1b-881d-5a5b9a5546ea-kserve-provision-location\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:09:09.379587 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:09.379550 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-645d5v9pn8" 
podUID="ef354028-f175-4c1b-881d-5a5b9a5546ea" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.53:9003\" within 1s: context deadline exceeded" Apr 16 15:09:09.379753 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:09:09.379606 2568 logging.go:55] [core] [Channel #44 SubChannel #45]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.53:9003", ServerName: "10.133.0.53:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.53:9003: operation was canceled" Apr 16 15:09:09.863754 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:09.863726 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef354028-f175-4c1b-881d-5a5b9a5546ea" path="/var/lib/kubelet/pods/ef354028-f175-4c1b-881d-5a5b9a5546ea/volumes" Apr 16 15:09:25.483711 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:25.483674 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr"] Apr 16 15:09:25.484329 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:25.484310 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef354028-f175-4c1b-881d-5a5b9a5546ea" containerName="tokenizer" Apr 16 15:09:25.484431 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:25.484331 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef354028-f175-4c1b-881d-5a5b9a5546ea" containerName="tokenizer" Apr 16 15:09:25.484431 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:25.484346 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef354028-f175-4c1b-881d-5a5b9a5546ea" containerName="main" Apr 16 15:09:25.484431 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:25.484354 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef354028-f175-4c1b-881d-5a5b9a5546ea" containerName="main" Apr 16 15:09:25.484431 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:25.484367 2568 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef354028-f175-4c1b-881d-5a5b9a5546ea" containerName="storage-initializer" Apr 16 15:09:25.484431 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:25.484377 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef354028-f175-4c1b-881d-5a5b9a5546ea" containerName="storage-initializer" Apr 16 15:09:25.484698 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:25.484517 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="ef354028-f175-4c1b-881d-5a5b9a5546ea" containerName="main" Apr 16 15:09:25.484698 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:25.484533 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="ef354028-f175-4c1b-881d-5a5b9a5546ea" containerName="tokenizer" Apr 16 15:09:25.490109 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:25.490090 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr" Apr 16 15:09:25.492636 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:25.492614 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-9h4wn\"" Apr 16 15:09:25.493093 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:25.493073 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 16 15:09:25.499997 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:25.499975 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr"] Apr 16 15:09:25.615938 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:25.615886 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7b3f0a7d-0e13-458c-89fe-bf27327688b2-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr\" (UID: \"7b3f0a7d-0e13-458c-89fe-bf27327688b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr" Apr 16 15:09:25.616095 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:25.615965 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/7b3f0a7d-0e13-458c-89fe-bf27327688b2-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr\" (UID: \"7b3f0a7d-0e13-458c-89fe-bf27327688b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr" Apr 16 15:09:25.616095 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:25.616085 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/7b3f0a7d-0e13-458c-89fe-bf27327688b2-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr\" (UID: \"7b3f0a7d-0e13-458c-89fe-bf27327688b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr" Apr 16 15:09:25.616198 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:25.616139 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/7b3f0a7d-0e13-458c-89fe-bf27327688b2-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr\" (UID: \"7b3f0a7d-0e13-458c-89fe-bf27327688b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr" Apr 16 15:09:25.616198 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:25.616172 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7b3f0a7d-0e13-458c-89fe-bf27327688b2-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr\" (UID: \"7b3f0a7d-0e13-458c-89fe-bf27327688b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr" Apr 16 15:09:25.616198 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:25.616191 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8hxf\" (UniqueName: \"kubernetes.io/projected/7b3f0a7d-0e13-458c-89fe-bf27327688b2-kube-api-access-x8hxf\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr\" (UID: \"7b3f0a7d-0e13-458c-89fe-bf27327688b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr" Apr 16 15:09:25.717425 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:25.717401 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/7b3f0a7d-0e13-458c-89fe-bf27327688b2-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr\" (UID: \"7b3f0a7d-0e13-458c-89fe-bf27327688b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr" Apr 16 15:09:25.717555 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:25.717432 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/7b3f0a7d-0e13-458c-89fe-bf27327688b2-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr\" (UID: \"7b3f0a7d-0e13-458c-89fe-bf27327688b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr" Apr 16 15:09:25.717555 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:25.717460 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7b3f0a7d-0e13-458c-89fe-bf27327688b2-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr\" (UID: \"7b3f0a7d-0e13-458c-89fe-bf27327688b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr" Apr 16 15:09:25.717555 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:25.717495 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x8hxf\" (UniqueName: \"kubernetes.io/projected/7b3f0a7d-0e13-458c-89fe-bf27327688b2-kube-api-access-x8hxf\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr\" (UID: \"7b3f0a7d-0e13-458c-89fe-bf27327688b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr" Apr 16 15:09:25.717555 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:25.717523 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7b3f0a7d-0e13-458c-89fe-bf27327688b2-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr\" (UID: \"7b3f0a7d-0e13-458c-89fe-bf27327688b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr" Apr 16 15:09:25.717736 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:25.717557 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/7b3f0a7d-0e13-458c-89fe-bf27327688b2-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr\" (UID: \"7b3f0a7d-0e13-458c-89fe-bf27327688b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr" Apr 16 15:09:25.717880 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:25.717859 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/7b3f0a7d-0e13-458c-89fe-bf27327688b2-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr\" (UID: \"7b3f0a7d-0e13-458c-89fe-bf27327688b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr" Apr 16 15:09:25.717956 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:25.717922 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/7b3f0a7d-0e13-458c-89fe-bf27327688b2-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr\" (UID: \"7b3f0a7d-0e13-458c-89fe-bf27327688b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr" Apr 16 15:09:25.717956 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:25.717937 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/7b3f0a7d-0e13-458c-89fe-bf27327688b2-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr\" (UID: \"7b3f0a7d-0e13-458c-89fe-bf27327688b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr" Apr 16 15:09:25.718026 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:25.717977 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/7b3f0a7d-0e13-458c-89fe-bf27327688b2-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr\" (UID: \"7b3f0a7d-0e13-458c-89fe-bf27327688b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr" Apr 16 15:09:25.719963 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:25.719946 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7b3f0a7d-0e13-458c-89fe-bf27327688b2-tls-certs\") pod 
\"scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr\" (UID: \"7b3f0a7d-0e13-458c-89fe-bf27327688b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr" Apr 16 15:09:25.724845 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:25.724820 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8hxf\" (UniqueName: \"kubernetes.io/projected/7b3f0a7d-0e13-458c-89fe-bf27327688b2-kube-api-access-x8hxf\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr\" (UID: \"7b3f0a7d-0e13-458c-89fe-bf27327688b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr" Apr 16 15:09:25.801576 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:25.801512 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr" Apr 16 15:09:25.948111 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:25.948078 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr"] Apr 16 15:09:25.950017 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:09:25.949985 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b3f0a7d_0e13_458c_89fe_bf27327688b2.slice/crio-2c51cde4bd530d2fe624bd5e5d3878cb6abd5e7069f57dd932c23fa6813b5e4a WatchSource:0}: Error finding container 2c51cde4bd530d2fe624bd5e5d3878cb6abd5e7069f57dd932c23fa6813b5e4a: Status 404 returned error can't find the container with id 2c51cde4bd530d2fe624bd5e5d3878cb6abd5e7069f57dd932c23fa6813b5e4a Apr 16 15:09:26.623774 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:26.623740 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr" 
event={"ID":"7b3f0a7d-0e13-458c-89fe-bf27327688b2","Type":"ContainerStarted","Data":"4a2cfbf433a9918762e43033b64984ef71fef2541c6bf13e8f0c8ce9ea7c6d4c"} Apr 16 15:09:26.623774 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:26.623778 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr" event={"ID":"7b3f0a7d-0e13-458c-89fe-bf27327688b2","Type":"ContainerStarted","Data":"2c51cde4bd530d2fe624bd5e5d3878cb6abd5e7069f57dd932c23fa6813b5e4a"} Apr 16 15:09:27.546851 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:27.546825 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z" Apr 16 15:09:27.628930 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:27.628883 2568 generic.go:358] "Generic (PLEG): container finished" podID="7b3f0a7d-0e13-458c-89fe-bf27327688b2" containerID="4a2cfbf433a9918762e43033b64984ef71fef2541c6bf13e8f0c8ce9ea7c6d4c" exitCode=0 Apr 16 15:09:27.628930 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:27.628924 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr" event={"ID":"7b3f0a7d-0e13-458c-89fe-bf27327688b2","Type":"ContainerDied","Data":"4a2cfbf433a9918762e43033b64984ef71fef2541c6bf13e8f0c8ce9ea7c6d4c"} Apr 16 15:09:28.635587 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:28.635551 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr" event={"ID":"7b3f0a7d-0e13-458c-89fe-bf27327688b2","Type":"ContainerStarted","Data":"583115e7f8c012f5fc81ee6823d33b64f2740ebd4142e4b1ede22015af84445e"} Apr 16 15:09:28.635587 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:28.635589 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr" event={"ID":"7b3f0a7d-0e13-458c-89fe-bf27327688b2","Type":"ContainerStarted","Data":"d840a70f7eefb829105610ec6c520d815c4835b9c1fecea27741dbf925e4fda4"} Apr 16 15:09:28.636155 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:28.635702 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr" Apr 16 15:09:28.657393 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:28.657343 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr" podStartSLOduration=3.65733082 podStartE2EDuration="3.65733082s" podCreationTimestamp="2026-04-16 15:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:09:28.653798654 +0000 UTC m=+1007.417750770" watchObservedRunningTime="2026-04-16 15:09:28.65733082 +0000 UTC m=+1007.421282930" Apr 16 15:09:35.801840 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:35.801807 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr" Apr 16 15:09:35.802276 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:35.801859 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr" Apr 16 15:09:35.804821 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:35.804796 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr" Apr 16 15:09:36.679240 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:36.679209 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr" Apr 16 15:09:57.684166 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:57.684140 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr" Apr 16 15:09:58.718497 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:58.718467 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr"] Apr 16 15:09:58.718974 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:58.718837 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr" podUID="7b3f0a7d-0e13-458c-89fe-bf27327688b2" containerName="main" containerID="cri-o://d840a70f7eefb829105610ec6c520d815c4835b9c1fecea27741dbf925e4fda4" gracePeriod=30 Apr 16 15:09:58.718974 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:58.718865 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr" podUID="7b3f0a7d-0e13-458c-89fe-bf27327688b2" containerName="tokenizer" containerID="cri-o://583115e7f8c012f5fc81ee6823d33b64f2740ebd4142e4b1ede22015af84445e" gracePeriod=30 Apr 16 15:09:59.782133 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:59.782105 2568 generic.go:358] "Generic (PLEG): container finished" podID="7b3f0a7d-0e13-458c-89fe-bf27327688b2" containerID="583115e7f8c012f5fc81ee6823d33b64f2740ebd4142e4b1ede22015af84445e" exitCode=0 Apr 16 15:09:59.782133 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:59.782131 2568 generic.go:358] "Generic (PLEG): container finished" podID="7b3f0a7d-0e13-458c-89fe-bf27327688b2" containerID="d840a70f7eefb829105610ec6c520d815c4835b9c1fecea27741dbf925e4fda4" exitCode=0 Apr 16 15:09:59.782503 ip-10-0-140-83 
kubenswrapper[2568]: I0416 15:09:59.782172 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr" event={"ID":"7b3f0a7d-0e13-458c-89fe-bf27327688b2","Type":"ContainerDied","Data":"583115e7f8c012f5fc81ee6823d33b64f2740ebd4142e4b1ede22015af84445e"} Apr 16 15:09:59.782503 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:59.782205 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr" event={"ID":"7b3f0a7d-0e13-458c-89fe-bf27327688b2","Type":"ContainerDied","Data":"d840a70f7eefb829105610ec6c520d815c4835b9c1fecea27741dbf925e4fda4"} Apr 16 15:09:59.864062 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:59.864043 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr" Apr 16 15:09:59.899458 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:59.899435 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8hxf\" (UniqueName: \"kubernetes.io/projected/7b3f0a7d-0e13-458c-89fe-bf27327688b2-kube-api-access-x8hxf\") pod \"7b3f0a7d-0e13-458c-89fe-bf27327688b2\" (UID: \"7b3f0a7d-0e13-458c-89fe-bf27327688b2\") " Apr 16 15:09:59.899585 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:59.899487 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7b3f0a7d-0e13-458c-89fe-bf27327688b2-tls-certs\") pod \"7b3f0a7d-0e13-458c-89fe-bf27327688b2\" (UID: \"7b3f0a7d-0e13-458c-89fe-bf27327688b2\") " Apr 16 15:09:59.899585 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:59.899527 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/7b3f0a7d-0e13-458c-89fe-bf27327688b2-tokenizer-tmp\") pod 
\"7b3f0a7d-0e13-458c-89fe-bf27327688b2\" (UID: \"7b3f0a7d-0e13-458c-89fe-bf27327688b2\") " Apr 16 15:09:59.899585 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:59.899547 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7b3f0a7d-0e13-458c-89fe-bf27327688b2-kserve-provision-location\") pod \"7b3f0a7d-0e13-458c-89fe-bf27327688b2\" (UID: \"7b3f0a7d-0e13-458c-89fe-bf27327688b2\") " Apr 16 15:09:59.899585 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:59.899567 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/7b3f0a7d-0e13-458c-89fe-bf27327688b2-tokenizer-cache\") pod \"7b3f0a7d-0e13-458c-89fe-bf27327688b2\" (UID: \"7b3f0a7d-0e13-458c-89fe-bf27327688b2\") " Apr 16 15:09:59.899807 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:59.899643 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/7b3f0a7d-0e13-458c-89fe-bf27327688b2-tokenizer-uds\") pod \"7b3f0a7d-0e13-458c-89fe-bf27327688b2\" (UID: \"7b3f0a7d-0e13-458c-89fe-bf27327688b2\") " Apr 16 15:09:59.899864 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:59.899801 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b3f0a7d-0e13-458c-89fe-bf27327688b2-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "7b3f0a7d-0e13-458c-89fe-bf27327688b2" (UID: "7b3f0a7d-0e13-458c-89fe-bf27327688b2"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:09:59.900091 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:59.900046 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/7b3f0a7d-0e13-458c-89fe-bf27327688b2-tokenizer-tmp\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:09:59.900180 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:59.900135 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b3f0a7d-0e13-458c-89fe-bf27327688b2-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "7b3f0a7d-0e13-458c-89fe-bf27327688b2" (UID: "7b3f0a7d-0e13-458c-89fe-bf27327688b2"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:09:59.900243 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:59.900211 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b3f0a7d-0e13-458c-89fe-bf27327688b2-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "7b3f0a7d-0e13-458c-89fe-bf27327688b2" (UID: "7b3f0a7d-0e13-458c-89fe-bf27327688b2"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:09:59.900580 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:59.900551 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b3f0a7d-0e13-458c-89fe-bf27327688b2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7b3f0a7d-0e13-458c-89fe-bf27327688b2" (UID: "7b3f0a7d-0e13-458c-89fe-bf27327688b2"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:09:59.901619 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:59.901595 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b3f0a7d-0e13-458c-89fe-bf27327688b2-kube-api-access-x8hxf" (OuterVolumeSpecName: "kube-api-access-x8hxf") pod "7b3f0a7d-0e13-458c-89fe-bf27327688b2" (UID: "7b3f0a7d-0e13-458c-89fe-bf27327688b2"). InnerVolumeSpecName "kube-api-access-x8hxf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:09:59.901741 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:09:59.901720 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b3f0a7d-0e13-458c-89fe-bf27327688b2-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "7b3f0a7d-0e13-458c-89fe-bf27327688b2" (UID: "7b3f0a7d-0e13-458c-89fe-bf27327688b2"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:10:00.001120 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:00.001099 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/7b3f0a7d-0e13-458c-89fe-bf27327688b2-tokenizer-uds\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:10:00.001204 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:00.001122 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x8hxf\" (UniqueName: \"kubernetes.io/projected/7b3f0a7d-0e13-458c-89fe-bf27327688b2-kube-api-access-x8hxf\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:10:00.001204 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:00.001133 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7b3f0a7d-0e13-458c-89fe-bf27327688b2-tls-certs\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:10:00.001204 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:00.001144 2568 
reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7b3f0a7d-0e13-458c-89fe-bf27327688b2-kserve-provision-location\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:10:00.001204 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:00.001152 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/7b3f0a7d-0e13-458c-89fe-bf27327688b2-tokenizer-cache\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:10:00.788558 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:00.788517 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr" event={"ID":"7b3f0a7d-0e13-458c-89fe-bf27327688b2","Type":"ContainerDied","Data":"2c51cde4bd530d2fe624bd5e5d3878cb6abd5e7069f57dd932c23fa6813b5e4a"} Apr 16 15:10:00.788987 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:00.788578 2568 scope.go:117] "RemoveContainer" containerID="583115e7f8c012f5fc81ee6823d33b64f2740ebd4142e4b1ede22015af84445e" Apr 16 15:10:00.788987 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:00.788533 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr" Apr 16 15:10:00.799477 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:00.799460 2568 scope.go:117] "RemoveContainer" containerID="d840a70f7eefb829105610ec6c520d815c4835b9c1fecea27741dbf925e4fda4" Apr 16 15:10:00.815101 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:00.815060 2568 scope.go:117] "RemoveContainer" containerID="4a2cfbf433a9918762e43033b64984ef71fef2541c6bf13e8f0c8ce9ea7c6d4c" Apr 16 15:10:00.815180 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:00.815053 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr"] Apr 16 15:10:00.817935 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:00.817903 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-864cfb7hbhsr"] Apr 16 15:10:01.863515 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:01.863485 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b3f0a7d-0e13-458c-89fe-bf27327688b2" path="/var/lib/kubelet/pods/7b3f0a7d-0e13-458c-89fe-bf27327688b2/volumes" Apr 16 15:10:12.464607 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:12.464575 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb"] Apr 16 15:10:12.465040 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:12.465024 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7b3f0a7d-0e13-458c-89fe-bf27327688b2" containerName="storage-initializer" Apr 16 15:10:12.465040 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:12.465041 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b3f0a7d-0e13-458c-89fe-bf27327688b2" containerName="storage-initializer" Apr 16 15:10:12.465116 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:12.465054 2568 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7b3f0a7d-0e13-458c-89fe-bf27327688b2" containerName="main" Apr 16 15:10:12.465116 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:12.465062 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b3f0a7d-0e13-458c-89fe-bf27327688b2" containerName="main" Apr 16 15:10:12.465116 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:12.465075 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7b3f0a7d-0e13-458c-89fe-bf27327688b2" containerName="tokenizer" Apr 16 15:10:12.465116 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:12.465080 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b3f0a7d-0e13-458c-89fe-bf27327688b2" containerName="tokenizer" Apr 16 15:10:12.465239 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:12.465148 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="7b3f0a7d-0e13-458c-89fe-bf27327688b2" containerName="tokenizer" Apr 16 15:10:12.465239 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:12.465157 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="7b3f0a7d-0e13-458c-89fe-bf27327688b2" containerName="main" Apr 16 15:10:12.468609 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:12.468587 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb" Apr 16 15:10:12.470979 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:12.470956 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-epp-sa-dockercfg-hrswr\"" Apr 16 15:10:12.471777 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:12.471755 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 16 15:10:12.478268 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:12.478243 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb"] Apr 16 15:10:12.605040 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:12.605011 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c830f2e1-04f5-4cbd-986d-0e98d186ab27-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb\" (UID: \"c830f2e1-04f5-4cbd-986d-0e98d186ab27\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb" Apr 16 15:10:12.605198 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:12.605050 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c830f2e1-04f5-4cbd-986d-0e98d186ab27-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb\" (UID: \"c830f2e1-04f5-4cbd-986d-0e98d186ab27\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb" Apr 16 15:10:12.605198 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:12.605070 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c830f2e1-04f5-4cbd-986d-0e98d186ab27-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb\" (UID: \"c830f2e1-04f5-4cbd-986d-0e98d186ab27\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb" Apr 16 15:10:12.605198 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:12.605148 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c830f2e1-04f5-4cbd-986d-0e98d186ab27-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb\" (UID: \"c830f2e1-04f5-4cbd-986d-0e98d186ab27\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb" Apr 16 15:10:12.605198 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:12.605175 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c830f2e1-04f5-4cbd-986d-0e98d186ab27-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb\" (UID: \"c830f2e1-04f5-4cbd-986d-0e98d186ab27\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb" Apr 16 15:10:12.605353 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:12.605219 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87sfg\" (UniqueName: \"kubernetes.io/projected/c830f2e1-04f5-4cbd-986d-0e98d186ab27-kube-api-access-87sfg\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb\" (UID: \"c830f2e1-04f5-4cbd-986d-0e98d186ab27\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb" Apr 16 15:10:12.706018 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:12.705987 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c830f2e1-04f5-4cbd-986d-0e98d186ab27-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb\" (UID: \"c830f2e1-04f5-4cbd-986d-0e98d186ab27\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb" Apr 16 15:10:12.706172 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:12.706023 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c830f2e1-04f5-4cbd-986d-0e98d186ab27-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb\" (UID: \"c830f2e1-04f5-4cbd-986d-0e98d186ab27\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb" Apr 16 15:10:12.706172 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:12.706045 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c830f2e1-04f5-4cbd-986d-0e98d186ab27-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb\" (UID: \"c830f2e1-04f5-4cbd-986d-0e98d186ab27\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb" Apr 16 15:10:12.706172 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:12.706080 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c830f2e1-04f5-4cbd-986d-0e98d186ab27-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb\" (UID: \"c830f2e1-04f5-4cbd-986d-0e98d186ab27\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb" Apr 16 15:10:12.706172 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:12.706130 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c830f2e1-04f5-4cbd-986d-0e98d186ab27-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb\" (UID: \"c830f2e1-04f5-4cbd-986d-0e98d186ab27\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb" Apr 16 15:10:12.706381 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:12.706235 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-87sfg\" (UniqueName: \"kubernetes.io/projected/c830f2e1-04f5-4cbd-986d-0e98d186ab27-kube-api-access-87sfg\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb\" (UID: \"c830f2e1-04f5-4cbd-986d-0e98d186ab27\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb" Apr 16 15:10:12.706473 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:12.706450 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c830f2e1-04f5-4cbd-986d-0e98d186ab27-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb\" (UID: \"c830f2e1-04f5-4cbd-986d-0e98d186ab27\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb" Apr 16 15:10:12.706557 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:12.706513 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c830f2e1-04f5-4cbd-986d-0e98d186ab27-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb\" (UID: \"c830f2e1-04f5-4cbd-986d-0e98d186ab27\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb" Apr 16 15:10:12.706651 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:12.706615 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" 
(UniqueName: \"kubernetes.io/empty-dir/c830f2e1-04f5-4cbd-986d-0e98d186ab27-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb\" (UID: \"c830f2e1-04f5-4cbd-986d-0e98d186ab27\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb" Apr 16 15:10:12.706735 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:12.706557 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c830f2e1-04f5-4cbd-986d-0e98d186ab27-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb\" (UID: \"c830f2e1-04f5-4cbd-986d-0e98d186ab27\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb" Apr 16 15:10:12.709220 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:12.709195 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c830f2e1-04f5-4cbd-986d-0e98d186ab27-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb\" (UID: \"c830f2e1-04f5-4cbd-986d-0e98d186ab27\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb" Apr 16 15:10:12.713722 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:12.713700 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-87sfg\" (UniqueName: \"kubernetes.io/projected/c830f2e1-04f5-4cbd-986d-0e98d186ab27-kube-api-access-87sfg\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb\" (UID: \"c830f2e1-04f5-4cbd-986d-0e98d186ab27\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb" Apr 16 15:10:12.782139 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:12.782084 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb" Apr 16 15:10:12.911202 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:12.911179 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb"] Apr 16 15:10:12.912061 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:10:12.912036 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc830f2e1_04f5_4cbd_986d_0e98d186ab27.slice/crio-9a5725e1a23ebd6bfcc60d7008310f04526187ebba6731bb31d9167e7d15c44f WatchSource:0}: Error finding container 9a5725e1a23ebd6bfcc60d7008310f04526187ebba6731bb31d9167e7d15c44f: Status 404 returned error can't find the container with id 9a5725e1a23ebd6bfcc60d7008310f04526187ebba6731bb31d9167e7d15c44f Apr 16 15:10:13.845365 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:13.845329 2568 generic.go:358] "Generic (PLEG): container finished" podID="c830f2e1-04f5-4cbd-986d-0e98d186ab27" containerID="ac787c83a6b68a659a3ff71f62c4ec6e511685d72586065ee23c1ee72c68a0de" exitCode=0 Apr 16 15:10:13.845783 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:13.845372 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb" event={"ID":"c830f2e1-04f5-4cbd-986d-0e98d186ab27","Type":"ContainerDied","Data":"ac787c83a6b68a659a3ff71f62c4ec6e511685d72586065ee23c1ee72c68a0de"} Apr 16 15:10:13.845783 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:13.845404 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb" event={"ID":"c830f2e1-04f5-4cbd-986d-0e98d186ab27","Type":"ContainerStarted","Data":"9a5725e1a23ebd6bfcc60d7008310f04526187ebba6731bb31d9167e7d15c44f"} Apr 16 15:10:14.851503 ip-10-0-140-83 kubenswrapper[2568]: I0416 
15:10:14.851470 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb" event={"ID":"c830f2e1-04f5-4cbd-986d-0e98d186ab27","Type":"ContainerStarted","Data":"81237696ffb87eb94a24067baefd7f9a66148efea97340437aacf61dfe537f08"} Apr 16 15:10:14.851503 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:14.851505 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb" event={"ID":"c830f2e1-04f5-4cbd-986d-0e98d186ab27","Type":"ContainerStarted","Data":"0add4f6b74137b9db11ab147115e63b211bf3fab081715139392f69f537cb41e"} Apr 16 15:10:14.851942 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:14.851625 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb" Apr 16 15:10:14.876721 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:14.876669 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb" podStartSLOduration=2.876651322 podStartE2EDuration="2.876651322s" podCreationTimestamp="2026-04-16 15:10:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:10:14.871239267 +0000 UTC m=+1053.635191377" watchObservedRunningTime="2026-04-16 15:10:14.876651322 +0000 UTC m=+1053.640603434" Apr 16 15:10:22.783115 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:22.783072 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb" Apr 16 15:10:22.783115 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:22.783120 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb" Apr 16 15:10:22.784383 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:10:22.784356 2568 logging.go:55] [core] [Channel #97 SubChannel #98]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.56:9003", ServerName: "10.133.0.56:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.56:9003: connect: connection refused" Apr 16 15:10:22.785777 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:22.785743 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb" Apr 16 15:10:22.888237 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:22.888213 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb" Apr 16 15:10:23.783277 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:23.783236 2568 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb" podUID="c830f2e1-04f5-4cbd-986d-0e98d186ab27" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.56:9003\" within 1s: context deadline exceeded" Apr 16 15:10:24.896211 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:24.896183 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb_c830f2e1-04f5-4cbd-986d-0e98d186ab27/main/0.log" Apr 16 15:10:24.896629 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:24.896529 2568 generic.go:358] "Generic (PLEG): container finished" podID="c830f2e1-04f5-4cbd-986d-0e98d186ab27" containerID="0add4f6b74137b9db11ab147115e63b211bf3fab081715139392f69f537cb41e" exitCode=1 Apr 16 15:10:24.896629 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:24.896601 2568 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb" event={"ID":"c830f2e1-04f5-4cbd-986d-0e98d186ab27","Type":"ContainerDied","Data":"0add4f6b74137b9db11ab147115e63b211bf3fab081715139392f69f537cb41e"} Apr 16 15:10:24.897010 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:24.896993 2568 scope.go:117] "RemoveContainer" containerID="0add4f6b74137b9db11ab147115e63b211bf3fab081715139392f69f537cb41e" Apr 16 15:10:25.902844 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:25.902815 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb_c830f2e1-04f5-4cbd-986d-0e98d186ab27/main/0.log" Apr 16 15:10:25.903292 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:25.903251 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb" event={"ID":"c830f2e1-04f5-4cbd-986d-0e98d186ab27","Type":"ContainerStarted","Data":"113dda5f5e2c4393cdb3be006ba56c23ced0f696988eab1af5e5f62c51010384"} Apr 16 15:10:25.903585 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:25.903564 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb" Apr 16 15:10:32.782629 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:10:32.782550 2568 logging.go:55] [core] [Channel #105 SubChannel #106]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.56:9003", ServerName: "10.133.0.56:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.56:9003: connect: connection refused" Apr 16 15:10:33.783375 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:33.783330 2568 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb" podUID="c830f2e1-04f5-4cbd-986d-0e98d186ab27" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.56:9003\" within 1s: context deadline exceeded" Apr 16 15:10:33.783733 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:10:33.783386 2568 logging.go:55] [core] [Channel #105 SubChannel #106]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.56:9003", ServerName: "10.133.0.56:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.56:9003: connect: connection refused" Apr 16 15:10:56.910082 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:56.910055 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb" Apr 16 15:10:57.880667 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:57.880633 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb"] Apr 16 15:10:57.881020 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:57.880970 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb" podUID="c830f2e1-04f5-4cbd-986d-0e98d186ab27" containerName="tokenizer" containerID="cri-o://81237696ffb87eb94a24067baefd7f9a66148efea97340437aacf61dfe537f08" gracePeriod=30 Apr 16 15:10:57.881205 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:57.881023 2568 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb" podUID="c830f2e1-04f5-4cbd-986d-0e98d186ab27" containerName="main" containerID="cri-o://113dda5f5e2c4393cdb3be006ba56c23ced0f696988eab1af5e5f62c51010384" gracePeriod=30 Apr 16 15:10:58.038501 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:58.038473 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb_c830f2e1-04f5-4cbd-986d-0e98d186ab27/main/0.log" Apr 16 15:10:58.038860 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:58.038752 2568 generic.go:358] "Generic (PLEG): container finished" podID="c830f2e1-04f5-4cbd-986d-0e98d186ab27" containerID="113dda5f5e2c4393cdb3be006ba56c23ced0f696988eab1af5e5f62c51010384" exitCode=0 Apr 16 15:10:58.038860 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:58.038822 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb" event={"ID":"c830f2e1-04f5-4cbd-986d-0e98d186ab27","Type":"ContainerDied","Data":"113dda5f5e2c4393cdb3be006ba56c23ced0f696988eab1af5e5f62c51010384"} Apr 16 15:10:58.038963 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:58.038865 2568 scope.go:117] "RemoveContainer" containerID="0add4f6b74137b9db11ab147115e63b211bf3fab081715139392f69f537cb41e" Apr 16 15:10:59.238106 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:59.238088 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb" Apr 16 15:10:59.282120 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:59.282098 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c830f2e1-04f5-4cbd-986d-0e98d186ab27-kserve-provision-location\") pod \"c830f2e1-04f5-4cbd-986d-0e98d186ab27\" (UID: \"c830f2e1-04f5-4cbd-986d-0e98d186ab27\") " Apr 16 15:10:59.282234 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:59.282140 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c830f2e1-04f5-4cbd-986d-0e98d186ab27-tokenizer-tmp\") pod \"c830f2e1-04f5-4cbd-986d-0e98d186ab27\" (UID: \"c830f2e1-04f5-4cbd-986d-0e98d186ab27\") " Apr 16 15:10:59.282234 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:59.282162 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c830f2e1-04f5-4cbd-986d-0e98d186ab27-tls-certs\") pod \"c830f2e1-04f5-4cbd-986d-0e98d186ab27\" (UID: \"c830f2e1-04f5-4cbd-986d-0e98d186ab27\") " Apr 16 15:10:59.282234 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:59.282206 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87sfg\" (UniqueName: \"kubernetes.io/projected/c830f2e1-04f5-4cbd-986d-0e98d186ab27-kube-api-access-87sfg\") pod \"c830f2e1-04f5-4cbd-986d-0e98d186ab27\" (UID: \"c830f2e1-04f5-4cbd-986d-0e98d186ab27\") " Apr 16 15:10:59.282404 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:59.282260 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c830f2e1-04f5-4cbd-986d-0e98d186ab27-tokenizer-cache\") pod \"c830f2e1-04f5-4cbd-986d-0e98d186ab27\" (UID: \"c830f2e1-04f5-4cbd-986d-0e98d186ab27\") " 
Apr 16 15:10:59.282404 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:59.282276 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c830f2e1-04f5-4cbd-986d-0e98d186ab27-tokenizer-uds\") pod \"c830f2e1-04f5-4cbd-986d-0e98d186ab27\" (UID: \"c830f2e1-04f5-4cbd-986d-0e98d186ab27\") " Apr 16 15:10:59.282511 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:59.282478 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c830f2e1-04f5-4cbd-986d-0e98d186ab27-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "c830f2e1-04f5-4cbd-986d-0e98d186ab27" (UID: "c830f2e1-04f5-4cbd-986d-0e98d186ab27"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:10:59.282685 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:59.282655 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c830f2e1-04f5-4cbd-986d-0e98d186ab27-tokenizer-tmp\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:10:59.282801 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:59.282681 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c830f2e1-04f5-4cbd-986d-0e98d186ab27-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "c830f2e1-04f5-4cbd-986d-0e98d186ab27" (UID: "c830f2e1-04f5-4cbd-986d-0e98d186ab27"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:10:59.282801 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:59.282711 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c830f2e1-04f5-4cbd-986d-0e98d186ab27-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "c830f2e1-04f5-4cbd-986d-0e98d186ab27" (UID: "c830f2e1-04f5-4cbd-986d-0e98d186ab27"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:10:59.283069 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:59.283046 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c830f2e1-04f5-4cbd-986d-0e98d186ab27-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c830f2e1-04f5-4cbd-986d-0e98d186ab27" (UID: "c830f2e1-04f5-4cbd-986d-0e98d186ab27"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:10:59.284835 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:59.284815 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c830f2e1-04f5-4cbd-986d-0e98d186ab27-kube-api-access-87sfg" (OuterVolumeSpecName: "kube-api-access-87sfg") pod "c830f2e1-04f5-4cbd-986d-0e98d186ab27" (UID: "c830f2e1-04f5-4cbd-986d-0e98d186ab27"). InnerVolumeSpecName "kube-api-access-87sfg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:10:59.284835 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:59.284821 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c830f2e1-04f5-4cbd-986d-0e98d186ab27-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c830f2e1-04f5-4cbd-986d-0e98d186ab27" (UID: "c830f2e1-04f5-4cbd-986d-0e98d186ab27"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:10:59.383939 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:59.383917 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-87sfg\" (UniqueName: \"kubernetes.io/projected/c830f2e1-04f5-4cbd-986d-0e98d186ab27-kube-api-access-87sfg\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:10:59.384032 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:59.383944 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c830f2e1-04f5-4cbd-986d-0e98d186ab27-tokenizer-cache\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:10:59.384032 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:59.383959 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c830f2e1-04f5-4cbd-986d-0e98d186ab27-tokenizer-uds\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:10:59.384032 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:59.383969 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c830f2e1-04f5-4cbd-986d-0e98d186ab27-kserve-provision-location\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:10:59.384032 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:10:59.383978 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c830f2e1-04f5-4cbd-986d-0e98d186ab27-tls-certs\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:11:00.051569 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:00.051544 2568 generic.go:358] "Generic (PLEG): container finished" podID="c830f2e1-04f5-4cbd-986d-0e98d186ab27" containerID="81237696ffb87eb94a24067baefd7f9a66148efea97340437aacf61dfe537f08" exitCode=0 Apr 16 15:11:00.051681 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:00.051601 2568 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb" event={"ID":"c830f2e1-04f5-4cbd-986d-0e98d186ab27","Type":"ContainerDied","Data":"81237696ffb87eb94a24067baefd7f9a66148efea97340437aacf61dfe537f08"}
Apr 16 15:11:00.051681 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:00.051622 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb"
Apr 16 15:11:00.051681 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:00.051635 2568 scope.go:117] "RemoveContainer" containerID="113dda5f5e2c4393cdb3be006ba56c23ced0f696988eab1af5e5f62c51010384"
Apr 16 15:11:00.051835 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:00.051624 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb" event={"ID":"c830f2e1-04f5-4cbd-986d-0e98d186ab27","Type":"ContainerDied","Data":"9a5725e1a23ebd6bfcc60d7008310f04526187ebba6731bb31d9167e7d15c44f"}
Apr 16 15:11:00.060835 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:00.060819 2568 scope.go:117] "RemoveContainer" containerID="81237696ffb87eb94a24067baefd7f9a66148efea97340437aacf61dfe537f08"
Apr 16 15:11:00.068416 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:00.068398 2568 scope.go:117] "RemoveContainer" containerID="ac787c83a6b68a659a3ff71f62c4ec6e511685d72586065ee23c1ee72c68a0de"
Apr 16 15:11:00.072123 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:00.072104 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb"]
Apr 16 15:11:00.076327 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:00.076308 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6b56887cj9mgb"]
Apr 16 15:11:00.076811 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:00.076798 2568 scope.go:117] "RemoveContainer" containerID="113dda5f5e2c4393cdb3be006ba56c23ced0f696988eab1af5e5f62c51010384"
Apr 16 15:11:00.077074 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:11:00.077055 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"113dda5f5e2c4393cdb3be006ba56c23ced0f696988eab1af5e5f62c51010384\": container with ID starting with 113dda5f5e2c4393cdb3be006ba56c23ced0f696988eab1af5e5f62c51010384 not found: ID does not exist" containerID="113dda5f5e2c4393cdb3be006ba56c23ced0f696988eab1af5e5f62c51010384"
Apr 16 15:11:00.077130 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:00.077082 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"113dda5f5e2c4393cdb3be006ba56c23ced0f696988eab1af5e5f62c51010384"} err="failed to get container status \"113dda5f5e2c4393cdb3be006ba56c23ced0f696988eab1af5e5f62c51010384\": rpc error: code = NotFound desc = could not find container \"113dda5f5e2c4393cdb3be006ba56c23ced0f696988eab1af5e5f62c51010384\": container with ID starting with 113dda5f5e2c4393cdb3be006ba56c23ced0f696988eab1af5e5f62c51010384 not found: ID does not exist"
Apr 16 15:11:00.077130 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:00.077099 2568 scope.go:117] "RemoveContainer" containerID="81237696ffb87eb94a24067baefd7f9a66148efea97340437aacf61dfe537f08"
Apr 16 15:11:00.077353 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:11:00.077336 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81237696ffb87eb94a24067baefd7f9a66148efea97340437aacf61dfe537f08\": container with ID starting with 81237696ffb87eb94a24067baefd7f9a66148efea97340437aacf61dfe537f08 not found: ID does not exist" containerID="81237696ffb87eb94a24067baefd7f9a66148efea97340437aacf61dfe537f08"
Apr 16 15:11:00.077417 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:00.077363 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81237696ffb87eb94a24067baefd7f9a66148efea97340437aacf61dfe537f08"} err="failed to get container status \"81237696ffb87eb94a24067baefd7f9a66148efea97340437aacf61dfe537f08\": rpc error: code = NotFound desc = could not find container \"81237696ffb87eb94a24067baefd7f9a66148efea97340437aacf61dfe537f08\": container with ID starting with 81237696ffb87eb94a24067baefd7f9a66148efea97340437aacf61dfe537f08 not found: ID does not exist"
Apr 16 15:11:00.077417 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:00.077386 2568 scope.go:117] "RemoveContainer" containerID="ac787c83a6b68a659a3ff71f62c4ec6e511685d72586065ee23c1ee72c68a0de"
Apr 16 15:11:00.077613 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:11:00.077595 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac787c83a6b68a659a3ff71f62c4ec6e511685d72586065ee23c1ee72c68a0de\": container with ID starting with ac787c83a6b68a659a3ff71f62c4ec6e511685d72586065ee23c1ee72c68a0de not found: ID does not exist" containerID="ac787c83a6b68a659a3ff71f62c4ec6e511685d72586065ee23c1ee72c68a0de"
Apr 16 15:11:00.077655 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:00.077618 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac787c83a6b68a659a3ff71f62c4ec6e511685d72586065ee23c1ee72c68a0de"} err="failed to get container status \"ac787c83a6b68a659a3ff71f62c4ec6e511685d72586065ee23c1ee72c68a0de\": rpc error: code = NotFound desc = could not find container \"ac787c83a6b68a659a3ff71f62c4ec6e511685d72586065ee23c1ee72c68a0de\": container with ID starting with ac787c83a6b68a659a3ff71f62c4ec6e511685d72586065ee23c1ee72c68a0de not found: ID does not exist"
Apr 16 15:11:01.864495 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:01.864462 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c830f2e1-04f5-4cbd-986d-0e98d186ab27" path="/var/lib/kubelet/pods/c830f2e1-04f5-4cbd-986d-0e98d186ab27/volumes"
Apr 16 15:11:37.671933 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:37.671876 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z"]
Apr 16 15:11:37.672494 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:37.672289 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z" podUID="0b22b375-07c4-47fb-b722-68301b084f58" containerName="main" containerID="cri-o://35ed36f5be51ec7caa5dfd8e8da1a295b729f8057428b767361c097bd2d73c10" gracePeriod=30
Apr 16 15:11:37.672494 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:37.672389 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z" podUID="0b22b375-07c4-47fb-b722-68301b084f58" containerName="tokenizer" containerID="cri-o://697a3595a7f92f28c1b985a29636e8fad49ab15f6ec59aa755f17e332b6496b8" gracePeriod=30
Apr 16 15:11:38.210030 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:38.209995 2568 generic.go:358] "Generic (PLEG): container finished" podID="0b22b375-07c4-47fb-b722-68301b084f58" containerID="35ed36f5be51ec7caa5dfd8e8da1a295b729f8057428b767361c097bd2d73c10" exitCode=0
Apr 16 15:11:38.210213 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:38.210070 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z" event={"ID":"0b22b375-07c4-47fb-b722-68301b084f58","Type":"ContainerDied","Data":"35ed36f5be51ec7caa5dfd8e8da1a295b729f8057428b767361c097bd2d73c10"}
Apr 16 15:11:38.843044 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:38.843023 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z"
Apr 16 15:11:39.021594 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:39.021529 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8jj2\" (UniqueName: \"kubernetes.io/projected/0b22b375-07c4-47fb-b722-68301b084f58-kube-api-access-f8jj2\") pod \"0b22b375-07c4-47fb-b722-68301b084f58\" (UID: \"0b22b375-07c4-47fb-b722-68301b084f58\") "
Apr 16 15:11:39.021594 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:39.021566 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0b22b375-07c4-47fb-b722-68301b084f58-tokenizer-cache\") pod \"0b22b375-07c4-47fb-b722-68301b084f58\" (UID: \"0b22b375-07c4-47fb-b722-68301b084f58\") "
Apr 16 15:11:39.021594 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:39.021589 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0b22b375-07c4-47fb-b722-68301b084f58-tokenizer-uds\") pod \"0b22b375-07c4-47fb-b722-68301b084f58\" (UID: \"0b22b375-07c4-47fb-b722-68301b084f58\") "
Apr 16 15:11:39.021855 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:39.021642 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0b22b375-07c4-47fb-b722-68301b084f58-tls-certs\") pod \"0b22b375-07c4-47fb-b722-68301b084f58\" (UID: \"0b22b375-07c4-47fb-b722-68301b084f58\") "
Apr 16 15:11:39.021855 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:39.021666 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0b22b375-07c4-47fb-b722-68301b084f58-tokenizer-tmp\") pod \"0b22b375-07c4-47fb-b722-68301b084f58\" (UID: \"0b22b375-07c4-47fb-b722-68301b084f58\") "
Apr 16 15:11:39.021855 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:39.021707 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b22b375-07c4-47fb-b722-68301b084f58-kserve-provision-location\") pod \"0b22b375-07c4-47fb-b722-68301b084f58\" (UID: \"0b22b375-07c4-47fb-b722-68301b084f58\") "
Apr 16 15:11:39.021855 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:39.021814 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b22b375-07c4-47fb-b722-68301b084f58-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "0b22b375-07c4-47fb-b722-68301b084f58" (UID: "0b22b375-07c4-47fb-b722-68301b084f58"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:11:39.022052 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:39.021924 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b22b375-07c4-47fb-b722-68301b084f58-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "0b22b375-07c4-47fb-b722-68301b084f58" (UID: "0b22b375-07c4-47fb-b722-68301b084f58"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:11:39.022052 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:39.022035 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0b22b375-07c4-47fb-b722-68301b084f58-tokenizer-cache\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\""
Apr 16 15:11:39.022154 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:39.022055 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0b22b375-07c4-47fb-b722-68301b084f58-tokenizer-uds\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\""
Apr 16 15:11:39.022154 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:39.022049 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b22b375-07c4-47fb-b722-68301b084f58-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "0b22b375-07c4-47fb-b722-68301b084f58" (UID: "0b22b375-07c4-47fb-b722-68301b084f58"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:11:39.022436 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:39.022417 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b22b375-07c4-47fb-b722-68301b084f58-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0b22b375-07c4-47fb-b722-68301b084f58" (UID: "0b22b375-07c4-47fb-b722-68301b084f58"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:11:39.023621 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:39.023602 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b22b375-07c4-47fb-b722-68301b084f58-kube-api-access-f8jj2" (OuterVolumeSpecName: "kube-api-access-f8jj2") pod "0b22b375-07c4-47fb-b722-68301b084f58" (UID: "0b22b375-07c4-47fb-b722-68301b084f58"). InnerVolumeSpecName "kube-api-access-f8jj2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 15:11:39.023688 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:39.023675 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b22b375-07c4-47fb-b722-68301b084f58-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "0b22b375-07c4-47fb-b722-68301b084f58" (UID: "0b22b375-07c4-47fb-b722-68301b084f58"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 15:11:39.123021 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:39.122999 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0b22b375-07c4-47fb-b722-68301b084f58-tls-certs\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\""
Apr 16 15:11:39.123021 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:39.123021 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0b22b375-07c4-47fb-b722-68301b084f58-tokenizer-tmp\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\""
Apr 16 15:11:39.123144 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:39.123031 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b22b375-07c4-47fb-b722-68301b084f58-kserve-provision-location\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\""
Apr 16 15:11:39.123144 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:39.123040 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f8jj2\" (UniqueName: \"kubernetes.io/projected/0b22b375-07c4-47fb-b722-68301b084f58-kube-api-access-f8jj2\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\""
Apr 16 15:11:39.215559 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:39.215535 2568 generic.go:358] "Generic (PLEG): container finished" podID="0b22b375-07c4-47fb-b722-68301b084f58" containerID="697a3595a7f92f28c1b985a29636e8fad49ab15f6ec59aa755f17e332b6496b8" exitCode=0
Apr 16 15:11:39.215656 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:39.215602 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z"
Apr 16 15:11:39.215656 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:39.215625 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z" event={"ID":"0b22b375-07c4-47fb-b722-68301b084f58","Type":"ContainerDied","Data":"697a3595a7f92f28c1b985a29636e8fad49ab15f6ec59aa755f17e332b6496b8"}
Apr 16 15:11:39.215776 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:39.215669 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z" event={"ID":"0b22b375-07c4-47fb-b722-68301b084f58","Type":"ContainerDied","Data":"322b9904a1ca9da73f8df3b2a8b7ed6ef71af2e6568a14551710a0ee60000328"}
Apr 16 15:11:39.215776 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:39.215691 2568 scope.go:117] "RemoveContainer" containerID="697a3595a7f92f28c1b985a29636e8fad49ab15f6ec59aa755f17e332b6496b8"
Apr 16 15:11:39.224671 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:39.224656 2568 scope.go:117] "RemoveContainer" containerID="35ed36f5be51ec7caa5dfd8e8da1a295b729f8057428b767361c097bd2d73c10"
Apr 16 15:11:39.232808 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:39.232785 2568 scope.go:117] "RemoveContainer" containerID="89aae50615f09e8a10a0c692c61103be139374d13a53d2fdd225f01663b564f8"
Apr 16 15:11:39.237336 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:39.237263 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z"]
Apr 16 15:11:39.242464 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:39.242436 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schejw42z"]
Apr 16 15:11:39.243197 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:39.243178 2568 scope.go:117] "RemoveContainer" containerID="697a3595a7f92f28c1b985a29636e8fad49ab15f6ec59aa755f17e332b6496b8"
Apr 16 15:11:39.243434 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:11:39.243417 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"697a3595a7f92f28c1b985a29636e8fad49ab15f6ec59aa755f17e332b6496b8\": container with ID starting with 697a3595a7f92f28c1b985a29636e8fad49ab15f6ec59aa755f17e332b6496b8 not found: ID does not exist" containerID="697a3595a7f92f28c1b985a29636e8fad49ab15f6ec59aa755f17e332b6496b8"
Apr 16 15:11:39.243594 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:39.243443 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"697a3595a7f92f28c1b985a29636e8fad49ab15f6ec59aa755f17e332b6496b8"} err="failed to get container status \"697a3595a7f92f28c1b985a29636e8fad49ab15f6ec59aa755f17e332b6496b8\": rpc error: code = NotFound desc = could not find container \"697a3595a7f92f28c1b985a29636e8fad49ab15f6ec59aa755f17e332b6496b8\": container with ID starting with 697a3595a7f92f28c1b985a29636e8fad49ab15f6ec59aa755f17e332b6496b8 not found: ID does not exist"
Apr 16 15:11:39.243594 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:39.243459 2568 scope.go:117] "RemoveContainer" containerID="35ed36f5be51ec7caa5dfd8e8da1a295b729f8057428b767361c097bd2d73c10"
Apr 16 15:11:39.243723 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:11:39.243681 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35ed36f5be51ec7caa5dfd8e8da1a295b729f8057428b767361c097bd2d73c10\": container with ID starting with 35ed36f5be51ec7caa5dfd8e8da1a295b729f8057428b767361c097bd2d73c10 not found: ID does not exist" containerID="35ed36f5be51ec7caa5dfd8e8da1a295b729f8057428b767361c097bd2d73c10"
Apr 16 15:11:39.243723 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:39.243704 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35ed36f5be51ec7caa5dfd8e8da1a295b729f8057428b767361c097bd2d73c10"} err="failed to get container status \"35ed36f5be51ec7caa5dfd8e8da1a295b729f8057428b767361c097bd2d73c10\": rpc error: code = NotFound desc = could not find container \"35ed36f5be51ec7caa5dfd8e8da1a295b729f8057428b767361c097bd2d73c10\": container with ID starting with 35ed36f5be51ec7caa5dfd8e8da1a295b729f8057428b767361c097bd2d73c10 not found: ID does not exist"
Apr 16 15:11:39.243723 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:39.243720 2568 scope.go:117] "RemoveContainer" containerID="89aae50615f09e8a10a0c692c61103be139374d13a53d2fdd225f01663b564f8"
Apr 16 15:11:39.244224 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:11:39.243959 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89aae50615f09e8a10a0c692c61103be139374d13a53d2fdd225f01663b564f8\": container with ID starting with 89aae50615f09e8a10a0c692c61103be139374d13a53d2fdd225f01663b564f8 not found: ID does not exist" containerID="89aae50615f09e8a10a0c692c61103be139374d13a53d2fdd225f01663b564f8"
Apr 16 15:11:39.244299 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:39.244232 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89aae50615f09e8a10a0c692c61103be139374d13a53d2fdd225f01663b564f8"} err="failed to get container status \"89aae50615f09e8a10a0c692c61103be139374d13a53d2fdd225f01663b564f8\": rpc error: code = NotFound desc = could not find container \"89aae50615f09e8a10a0c692c61103be139374d13a53d2fdd225f01663b564f8\": container with ID starting with 89aae50615f09e8a10a0c692c61103be139374d13a53d2fdd225f01663b564f8 not found: ID does not exist"
Apr 16 15:11:39.864524 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:39.864494 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b22b375-07c4-47fb-b722-68301b084f58" path="/var/lib/kubelet/pods/0b22b375-07c4-47fb-b722-68301b084f58/volumes"
Apr 16 15:11:52.173135 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.173098 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl"]
Apr 16 15:11:52.173575 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.173509 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c830f2e1-04f5-4cbd-986d-0e98d186ab27" containerName="storage-initializer"
Apr 16 15:11:52.173575 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.173520 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="c830f2e1-04f5-4cbd-986d-0e98d186ab27" containerName="storage-initializer"
Apr 16 15:11:52.173575 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.173541 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b22b375-07c4-47fb-b722-68301b084f58" containerName="tokenizer"
Apr 16 15:11:52.173575 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.173549 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b22b375-07c4-47fb-b722-68301b084f58" containerName="tokenizer"
Apr 16 15:11:52.173575 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.173560 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b22b375-07c4-47fb-b722-68301b084f58" containerName="storage-initializer"
Apr 16 15:11:52.173575 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.173566 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b22b375-07c4-47fb-b722-68301b084f58" containerName="storage-initializer"
Apr 16 15:11:52.173575 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.173572 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b22b375-07c4-47fb-b722-68301b084f58" containerName="main"
Apr 16 15:11:52.173575 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.173577 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b22b375-07c4-47fb-b722-68301b084f58" containerName="main"
Apr 16 15:11:52.173833 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.173587 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c830f2e1-04f5-4cbd-986d-0e98d186ab27" containerName="tokenizer"
Apr 16 15:11:52.173833 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.173592 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="c830f2e1-04f5-4cbd-986d-0e98d186ab27" containerName="tokenizer"
Apr 16 15:11:52.173833 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.173599 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c830f2e1-04f5-4cbd-986d-0e98d186ab27" containerName="main"
Apr 16 15:11:52.173833 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.173604 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="c830f2e1-04f5-4cbd-986d-0e98d186ab27" containerName="main"
Apr 16 15:11:52.173833 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.173624 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c830f2e1-04f5-4cbd-986d-0e98d186ab27" containerName="main"
Apr 16 15:11:52.173833 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.173630 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="c830f2e1-04f5-4cbd-986d-0e98d186ab27" containerName="main"
Apr 16 15:11:52.173833 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.173688 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="0b22b375-07c4-47fb-b722-68301b084f58" containerName="main"
Apr 16 15:11:52.173833 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.173699 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="c830f2e1-04f5-4cbd-986d-0e98d186ab27" containerName="tokenizer"
Apr 16 15:11:52.173833 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.173710 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="c830f2e1-04f5-4cbd-986d-0e98d186ab27" containerName="main"
Apr 16 15:11:52.173833 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.173718 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="0b22b375-07c4-47fb-b722-68301b084f58" containerName="tokenizer"
Apr 16 15:11:52.174147 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.173846 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="c830f2e1-04f5-4cbd-986d-0e98d186ab27" containerName="main"
Apr 16 15:11:52.178627 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.178611 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl"
Apr 16 15:11:52.181991 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.181967 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\""
Apr 16 15:11:52.181991 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.181979 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 16 15:11:52.182855 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.182834 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-dmzth\""
Apr 16 15:11:52.183000 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.182858 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 15:11:52.183000 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.182841 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-epp-sa-dockercfg-vlrwv\""
Apr 16 15:11:52.189753 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.189730 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl"]
Apr 16 15:11:52.318573 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.318537 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/01e92a1a-8568-4c4f-8889-b4278173f6a9-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl\" (UID: \"01e92a1a-8568-4c4f-8889-b4278173f6a9\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl"
Apr 16 15:11:52.318723 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.318596 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/01e92a1a-8568-4c4f-8889-b4278173f6a9-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl\" (UID: \"01e92a1a-8568-4c4f-8889-b4278173f6a9\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl"
Apr 16 15:11:52.318723 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.318678 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/01e92a1a-8568-4c4f-8889-b4278173f6a9-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl\" (UID: \"01e92a1a-8568-4c4f-8889-b4278173f6a9\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl"
Apr 16 15:11:52.318844 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.318727 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/01e92a1a-8568-4c4f-8889-b4278173f6a9-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl\" (UID: \"01e92a1a-8568-4c4f-8889-b4278173f6a9\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl"
Apr 16 15:11:52.318844 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.318748 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/01e92a1a-8568-4c4f-8889-b4278173f6a9-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl\" (UID: \"01e92a1a-8568-4c4f-8889-b4278173f6a9\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl"
Apr 16 15:11:52.318844 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.318806 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv7zf\" (UniqueName: \"kubernetes.io/projected/01e92a1a-8568-4c4f-8889-b4278173f6a9-kube-api-access-lv7zf\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl\" (UID: \"01e92a1a-8568-4c4f-8889-b4278173f6a9\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl"
Apr 16 15:11:52.420002 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.419968 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/01e92a1a-8568-4c4f-8889-b4278173f6a9-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl\" (UID: \"01e92a1a-8568-4c4f-8889-b4278173f6a9\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl"
Apr 16 15:11:52.420172 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.420045 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/01e92a1a-8568-4c4f-8889-b4278173f6a9-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl\" (UID: \"01e92a1a-8568-4c4f-8889-b4278173f6a9\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl"
Apr 16 15:11:52.420172 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.420103 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/01e92a1a-8568-4c4f-8889-b4278173f6a9-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl\" (UID: \"01e92a1a-8568-4c4f-8889-b4278173f6a9\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl"
Apr 16 15:11:52.420172 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.420128 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/01e92a1a-8568-4c4f-8889-b4278173f6a9-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl\" (UID: \"01e92a1a-8568-4c4f-8889-b4278173f6a9\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl"
Apr 16 15:11:52.420172 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.420170 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lv7zf\" (UniqueName: \"kubernetes.io/projected/01e92a1a-8568-4c4f-8889-b4278173f6a9-kube-api-access-lv7zf\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl\" (UID: \"01e92a1a-8568-4c4f-8889-b4278173f6a9\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl"
Apr 16 15:11:52.420405 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.420215 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/01e92a1a-8568-4c4f-8889-b4278173f6a9-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl\" (UID: \"01e92a1a-8568-4c4f-8889-b4278173f6a9\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl"
Apr 16 15:11:52.420465 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.420427 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/01e92a1a-8568-4c4f-8889-b4278173f6a9-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl\" (UID: \"01e92a1a-8568-4c4f-8889-b4278173f6a9\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl"
Apr 16 15:11:52.420509 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.420463 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/01e92a1a-8568-4c4f-8889-b4278173f6a9-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl\" (UID: \"01e92a1a-8568-4c4f-8889-b4278173f6a9\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl"
Apr 16 15:11:52.420556 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.420501 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/01e92a1a-8568-4c4f-8889-b4278173f6a9-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl\" (UID: \"01e92a1a-8568-4c4f-8889-b4278173f6a9\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl"
Apr 16 15:11:52.420596 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.420579 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/01e92a1a-8568-4c4f-8889-b4278173f6a9-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl\" (UID: \"01e92a1a-8568-4c4f-8889-b4278173f6a9\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl"
Apr 16 15:11:52.423018 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.422993 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/01e92a1a-8568-4c4f-8889-b4278173f6a9-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl\" (UID: \"01e92a1a-8568-4c4f-8889-b4278173f6a9\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl"
Apr 16 15:11:52.429132 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.429082 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv7zf\" (UniqueName: \"kubernetes.io/projected/01e92a1a-8568-4c4f-8889-b4278173f6a9-kube-api-access-lv7zf\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl\" (UID: \"01e92a1a-8568-4c4f-8889-b4278173f6a9\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl"
Apr 16 15:11:52.489227 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.489201 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl"
Apr 16 15:11:52.627448 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:52.627408 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl"]
Apr 16 15:11:52.628645 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:11:52.628620 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01e92a1a_8568_4c4f_8889_b4278173f6a9.slice/crio-164e7d6de6647a3ef29c7941f811af9dd8be3352e21e2740c99392b7f385378a WatchSource:0}: Error finding container 164e7d6de6647a3ef29c7941f811af9dd8be3352e21e2740c99392b7f385378a: Status 404 returned error can't find the container with id 164e7d6de6647a3ef29c7941f811af9dd8be3352e21e2740c99392b7f385378a
Apr 16 15:11:53.274334 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:53.274296 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl" event={"ID":"01e92a1a-8568-4c4f-8889-b4278173f6a9","Type":"ContainerStarted","Data":"166647ac2ac1463882c30aa2c26747b8aac8226a98ec279ca49efc4b1fc43195"}
Apr 16 15:11:53.274334 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:53.274333 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl" event={"ID":"01e92a1a-8568-4c4f-8889-b4278173f6a9","Type":"ContainerStarted","Data":"164e7d6de6647a3ef29c7941f811af9dd8be3352e21e2740c99392b7f385378a"}
Apr 16 15:11:54.279934 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:54.279901 2568 generic.go:358] "Generic (PLEG): container finished" podID="01e92a1a-8568-4c4f-8889-b4278173f6a9" containerID="166647ac2ac1463882c30aa2c26747b8aac8226a98ec279ca49efc4b1fc43195" exitCode=0
Apr 16 15:11:54.280285 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:54.279992 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl" event={"ID":"01e92a1a-8568-4c4f-8889-b4278173f6a9","Type":"ContainerDied","Data":"166647ac2ac1463882c30aa2c26747b8aac8226a98ec279ca49efc4b1fc43195"}
Apr 16 15:11:55.286459 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:55.286425 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl" event={"ID":"01e92a1a-8568-4c4f-8889-b4278173f6a9","Type":"ContainerStarted","Data":"9062233b3f3a041f22715b09f8e0e1737d40aa8d9d8c73dca2f1ea98bf2938d1"}
Apr 16 15:11:55.286459 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:55.286462 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl" event={"ID":"01e92a1a-8568-4c4f-8889-b4278173f6a9","Type":"ContainerStarted","Data":"e6132ec10e81a358b30356d3d5d339cacb59ba4be985b27454197ebfc165d2ec"}
Apr 16 15:11:55.286866 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:55.286551 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl"
Apr 16 15:11:55.310344 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:11:55.310306 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl" podStartSLOduration=3.310293366 podStartE2EDuration="3.310293366s" podCreationTimestamp="2026-04-16 15:11:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:11:55.3063746 +0000 UTC m=+1154.070326710" watchObservedRunningTime="2026-04-16 15:11:55.310293366 +0000 UTC m=+1154.074245476"
Apr 16 15:12:02.489779 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:12:02.489692 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl"
Apr 16 15:12:02.490238 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:12:02.489845 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl"
Apr 16 15:12:02.492473 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:12:02.492453 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl"
Apr 16 15:12:03.321228 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:12:03.321206 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl"
Apr 16 15:12:25.329608 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:12:25.329570 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl"
Apr 16 15:12:41.850460 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:12:41.850429 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bff4l_4e495dc8-7e20-4024-b33a-c9b6c6c1291f/ovn-acl-logging/0.log"
Apr 16 15:12:41.858293 ip-10-0-140-83
kubenswrapper[2568]: I0416 15:12:41.858271 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bff4l_4e495dc8-7e20-4024-b33a-c9b6c6c1291f/ovn-acl-logging/0.log" Apr 16 15:13:15.367790 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:15.367752 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b"] Apr 16 15:13:15.377165 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:15.377139 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b" Apr 16 15:13:15.379712 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:15.379577 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-tpqv5\"" Apr 16 15:13:15.379712 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:15.379600 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 16 15:13:15.382853 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:15.382829 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b"] Apr 16 15:13:15.492969 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:15.492946 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8adce857-bb39-4ce5-a753-be8fed9bc564-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b\" (UID: \"8adce857-bb39-4ce5-a753-be8fed9bc564\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b" Apr 16 15:13:15.493091 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:15.492978 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-px644\" (UniqueName: \"kubernetes.io/projected/8adce857-bb39-4ce5-a753-be8fed9bc564-kube-api-access-px644\") pod \"stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b\" (UID: \"8adce857-bb39-4ce5-a753-be8fed9bc564\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b" Apr 16 15:13:15.493091 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:15.493013 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8adce857-bb39-4ce5-a753-be8fed9bc564-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b\" (UID: \"8adce857-bb39-4ce5-a753-be8fed9bc564\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b" Apr 16 15:13:15.493091 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:15.493065 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8adce857-bb39-4ce5-a753-be8fed9bc564-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b\" (UID: \"8adce857-bb39-4ce5-a753-be8fed9bc564\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b" Apr 16 15:13:15.493091 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:15.493090 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8adce857-bb39-4ce5-a753-be8fed9bc564-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b\" (UID: \"8adce857-bb39-4ce5-a753-be8fed9bc564\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b" Apr 16 15:13:15.493230 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:15.493124 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8adce857-bb39-4ce5-a753-be8fed9bc564-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b\" (UID: \"8adce857-bb39-4ce5-a753-be8fed9bc564\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b" Apr 16 15:13:15.593708 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:15.593681 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8adce857-bb39-4ce5-a753-be8fed9bc564-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b\" (UID: \"8adce857-bb39-4ce5-a753-be8fed9bc564\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b" Apr 16 15:13:15.593869 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:15.593731 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8adce857-bb39-4ce5-a753-be8fed9bc564-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b\" (UID: \"8adce857-bb39-4ce5-a753-be8fed9bc564\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b" Apr 16 15:13:15.593869 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:15.593755 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-px644\" (UniqueName: \"kubernetes.io/projected/8adce857-bb39-4ce5-a753-be8fed9bc564-kube-api-access-px644\") pod \"stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b\" (UID: \"8adce857-bb39-4ce5-a753-be8fed9bc564\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b" Apr 16 15:13:15.593869 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:15.593807 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/8adce857-bb39-4ce5-a753-be8fed9bc564-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b\" (UID: \"8adce857-bb39-4ce5-a753-be8fed9bc564\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b" Apr 16 15:13:15.593869 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:15.593849 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8adce857-bb39-4ce5-a753-be8fed9bc564-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b\" (UID: \"8adce857-bb39-4ce5-a753-be8fed9bc564\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b" Apr 16 15:13:15.594276 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:15.593885 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8adce857-bb39-4ce5-a753-be8fed9bc564-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b\" (UID: \"8adce857-bb39-4ce5-a753-be8fed9bc564\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b" Apr 16 15:13:15.594276 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:15.594223 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8adce857-bb39-4ce5-a753-be8fed9bc564-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b\" (UID: \"8adce857-bb39-4ce5-a753-be8fed9bc564\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b" Apr 16 15:13:15.594378 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:15.594314 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8adce857-bb39-4ce5-a753-be8fed9bc564-tokenizer-tmp\") pod 
\"stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b\" (UID: \"8adce857-bb39-4ce5-a753-be8fed9bc564\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b" Apr 16 15:13:15.594378 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:15.594365 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8adce857-bb39-4ce5-a753-be8fed9bc564-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b\" (UID: \"8adce857-bb39-4ce5-a753-be8fed9bc564\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b" Apr 16 15:13:15.594616 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:15.594557 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8adce857-bb39-4ce5-a753-be8fed9bc564-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b\" (UID: \"8adce857-bb39-4ce5-a753-be8fed9bc564\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b" Apr 16 15:13:15.596673 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:15.596649 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8adce857-bb39-4ce5-a753-be8fed9bc564-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b\" (UID: \"8adce857-bb39-4ce5-a753-be8fed9bc564\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b" Apr 16 15:13:15.601336 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:15.601308 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-px644\" (UniqueName: \"kubernetes.io/projected/8adce857-bb39-4ce5-a753-be8fed9bc564-kube-api-access-px644\") pod \"stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b\" (UID: \"8adce857-bb39-4ce5-a753-be8fed9bc564\") " 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b" Apr 16 15:13:15.689293 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:15.689232 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b" Apr 16 15:13:15.827108 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:13:15.827066 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8adce857_bb39_4ce5_a753_be8fed9bc564.slice/crio-23ecbe13c27acad5d3ceac3c8b727540ad9f977c56bf0e3d8ef7eb8fc3f1c858 WatchSource:0}: Error finding container 23ecbe13c27acad5d3ceac3c8b727540ad9f977c56bf0e3d8ef7eb8fc3f1c858: Status 404 returned error can't find the container with id 23ecbe13c27acad5d3ceac3c8b727540ad9f977c56bf0e3d8ef7eb8fc3f1c858 Apr 16 15:13:15.834833 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:15.834804 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b"] Apr 16 15:13:16.623967 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:16.623919 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b" event={"ID":"8adce857-bb39-4ce5-a753-be8fed9bc564","Type":"ContainerStarted","Data":"79cd470d027e199f0f5982d50909ae89c375d3c8d15e6ad65b5bfb35f1b546db"} Apr 16 15:13:16.623967 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:16.623976 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b" event={"ID":"8adce857-bb39-4ce5-a753-be8fed9bc564","Type":"ContainerStarted","Data":"23ecbe13c27acad5d3ceac3c8b727540ad9f977c56bf0e3d8ef7eb8fc3f1c858"} Apr 16 15:13:17.628974 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:17.628939 2568 generic.go:358] "Generic (PLEG): container finished" 
podID="8adce857-bb39-4ce5-a753-be8fed9bc564" containerID="79cd470d027e199f0f5982d50909ae89c375d3c8d15e6ad65b5bfb35f1b546db" exitCode=0 Apr 16 15:13:17.629338 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:17.629001 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b" event={"ID":"8adce857-bb39-4ce5-a753-be8fed9bc564","Type":"ContainerDied","Data":"79cd470d027e199f0f5982d50909ae89c375d3c8d15e6ad65b5bfb35f1b546db"} Apr 16 15:13:18.635180 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:18.635100 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b" event={"ID":"8adce857-bb39-4ce5-a753-be8fed9bc564","Type":"ContainerStarted","Data":"5582e0c1b4ff7a25e0eb11ffd7397543f6a836c882bf4a1934534a21b54ae666"} Apr 16 15:13:18.635180 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:18.635137 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b" event={"ID":"8adce857-bb39-4ce5-a753-be8fed9bc564","Type":"ContainerStarted","Data":"0d0bdf5426a9c6800eac8022cf3bea6c25c45b5b17b6f7bf68b582babd4e02e4"} Apr 16 15:13:18.635540 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:18.635231 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b" Apr 16 15:13:18.655489 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:18.655453 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b" podStartSLOduration=3.655437291 podStartE2EDuration="3.655437291s" podCreationTimestamp="2026-04-16 15:13:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:13:18.653767025 
+0000 UTC m=+1237.417719136" watchObservedRunningTime="2026-04-16 15:13:18.655437291 +0000 UTC m=+1237.419389403" Apr 16 15:13:25.690125 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:25.690091 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b" Apr 16 15:13:25.690553 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:25.690142 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b" Apr 16 15:13:25.693020 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:25.692998 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b" Apr 16 15:13:26.668581 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:26.668550 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b" Apr 16 15:13:45.888026 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:45.887935 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl"] Apr 16 15:13:45.888950 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:45.888861 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl" podUID="01e92a1a-8568-4c4f-8889-b4278173f6a9" containerName="main" containerID="cri-o://e6132ec10e81a358b30356d3d5d339cacb59ba4be985b27454197ebfc165d2ec" gracePeriod=30 Apr 16 15:13:45.889080 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:45.888957 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl" 
podUID="01e92a1a-8568-4c4f-8889-b4278173f6a9" containerName="tokenizer" containerID="cri-o://9062233b3f3a041f22715b09f8e0e1737d40aa8d9d8c73dca2f1ea98bf2938d1" gracePeriod=30 Apr 16 15:13:46.749636 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:46.749603 2568 generic.go:358] "Generic (PLEG): container finished" podID="01e92a1a-8568-4c4f-8889-b4278173f6a9" containerID="e6132ec10e81a358b30356d3d5d339cacb59ba4be985b27454197ebfc165d2ec" exitCode=0 Apr 16 15:13:46.749907 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:46.749658 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl" event={"ID":"01e92a1a-8568-4c4f-8889-b4278173f6a9","Type":"ContainerDied","Data":"e6132ec10e81a358b30356d3d5d339cacb59ba4be985b27454197ebfc165d2ec"} Apr 16 15:13:47.037122 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:47.037099 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl" Apr 16 15:13:47.156378 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:47.156353 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/01e92a1a-8568-4c4f-8889-b4278173f6a9-tokenizer-uds\") pod \"01e92a1a-8568-4c4f-8889-b4278173f6a9\" (UID: \"01e92a1a-8568-4c4f-8889-b4278173f6a9\") " Apr 16 15:13:47.156492 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:47.156425 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/01e92a1a-8568-4c4f-8889-b4278173f6a9-kserve-provision-location\") pod \"01e92a1a-8568-4c4f-8889-b4278173f6a9\" (UID: \"01e92a1a-8568-4c4f-8889-b4278173f6a9\") " Apr 16 15:13:47.156492 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:47.156444 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-lv7zf\" (UniqueName: \"kubernetes.io/projected/01e92a1a-8568-4c4f-8889-b4278173f6a9-kube-api-access-lv7zf\") pod \"01e92a1a-8568-4c4f-8889-b4278173f6a9\" (UID: \"01e92a1a-8568-4c4f-8889-b4278173f6a9\") " Apr 16 15:13:47.156492 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:47.156466 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/01e92a1a-8568-4c4f-8889-b4278173f6a9-tokenizer-tmp\") pod \"01e92a1a-8568-4c4f-8889-b4278173f6a9\" (UID: \"01e92a1a-8568-4c4f-8889-b4278173f6a9\") " Apr 16 15:13:47.156642 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:47.156496 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/01e92a1a-8568-4c4f-8889-b4278173f6a9-tokenizer-cache\") pod \"01e92a1a-8568-4c4f-8889-b4278173f6a9\" (UID: \"01e92a1a-8568-4c4f-8889-b4278173f6a9\") " Apr 16 15:13:47.156642 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:47.156518 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/01e92a1a-8568-4c4f-8889-b4278173f6a9-tls-certs\") pod \"01e92a1a-8568-4c4f-8889-b4278173f6a9\" (UID: \"01e92a1a-8568-4c4f-8889-b4278173f6a9\") " Apr 16 15:13:47.156642 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:47.156624 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01e92a1a-8568-4c4f-8889-b4278173f6a9-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "01e92a1a-8568-4c4f-8889-b4278173f6a9" (UID: "01e92a1a-8568-4c4f-8889-b4278173f6a9"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:13:47.156805 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:47.156763 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01e92a1a-8568-4c4f-8889-b4278173f6a9-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "01e92a1a-8568-4c4f-8889-b4278173f6a9" (UID: "01e92a1a-8568-4c4f-8889-b4278173f6a9"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:13:47.156805 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:47.156790 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01e92a1a-8568-4c4f-8889-b4278173f6a9-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "01e92a1a-8568-4c4f-8889-b4278173f6a9" (UID: "01e92a1a-8568-4c4f-8889-b4278173f6a9"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:13:47.156936 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:47.156921 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/01e92a1a-8568-4c4f-8889-b4278173f6a9-tokenizer-cache\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:13:47.156991 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:47.156942 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/01e92a1a-8568-4c4f-8889-b4278173f6a9-tokenizer-uds\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:13:47.156991 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:47.156957 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/01e92a1a-8568-4c4f-8889-b4278173f6a9-tokenizer-tmp\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:13:47.157199 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:47.157176 2568 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01e92a1a-8568-4c4f-8889-b4278173f6a9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "01e92a1a-8568-4c4f-8889-b4278173f6a9" (UID: "01e92a1a-8568-4c4f-8889-b4278173f6a9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:13:47.158514 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:47.158488 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01e92a1a-8568-4c4f-8889-b4278173f6a9-kube-api-access-lv7zf" (OuterVolumeSpecName: "kube-api-access-lv7zf") pod "01e92a1a-8568-4c4f-8889-b4278173f6a9" (UID: "01e92a1a-8568-4c4f-8889-b4278173f6a9"). InnerVolumeSpecName "kube-api-access-lv7zf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:13:47.158630 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:47.158597 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01e92a1a-8568-4c4f-8889-b4278173f6a9-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "01e92a1a-8568-4c4f-8889-b4278173f6a9" (UID: "01e92a1a-8568-4c4f-8889-b4278173f6a9"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:13:47.257909 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:47.257871 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/01e92a1a-8568-4c4f-8889-b4278173f6a9-tls-certs\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:13:47.258007 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:47.257914 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/01e92a1a-8568-4c4f-8889-b4278173f6a9-kserve-provision-location\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:13:47.258007 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:47.257926 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lv7zf\" (UniqueName: \"kubernetes.io/projected/01e92a1a-8568-4c4f-8889-b4278173f6a9-kube-api-access-lv7zf\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:13:47.673228 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:47.673205 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b" Apr 16 15:13:47.757147 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:47.757119 2568 generic.go:358] "Generic (PLEG): container finished" podID="01e92a1a-8568-4c4f-8889-b4278173f6a9" containerID="9062233b3f3a041f22715b09f8e0e1737d40aa8d9d8c73dca2f1ea98bf2938d1" exitCode=0 Apr 16 15:13:47.757301 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:47.757183 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl" event={"ID":"01e92a1a-8568-4c4f-8889-b4278173f6a9","Type":"ContainerDied","Data":"9062233b3f3a041f22715b09f8e0e1737d40aa8d9d8c73dca2f1ea98bf2938d1"} Apr 16 15:13:47.757301 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:47.757199 2568 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl" Apr 16 15:13:47.757301 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:47.757208 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl" event={"ID":"01e92a1a-8568-4c4f-8889-b4278173f6a9","Type":"ContainerDied","Data":"164e7d6de6647a3ef29c7941f811af9dd8be3352e21e2740c99392b7f385378a"} Apr 16 15:13:47.757301 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:47.757223 2568 scope.go:117] "RemoveContainer" containerID="9062233b3f3a041f22715b09f8e0e1737d40aa8d9d8c73dca2f1ea98bf2938d1" Apr 16 15:13:47.770124 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:47.770103 2568 scope.go:117] "RemoveContainer" containerID="e6132ec10e81a358b30356d3d5d339cacb59ba4be985b27454197ebfc165d2ec" Apr 16 15:13:47.779367 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:47.779330 2568 scope.go:117] "RemoveContainer" containerID="166647ac2ac1463882c30aa2c26747b8aac8226a98ec279ca49efc4b1fc43195" Apr 16 15:13:47.788694 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:47.788671 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl"] Apr 16 15:13:47.789286 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:47.789270 2568 scope.go:117] "RemoveContainer" containerID="9062233b3f3a041f22715b09f8e0e1737d40aa8d9d8c73dca2f1ea98bf2938d1" Apr 16 15:13:47.789508 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:13:47.789490 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9062233b3f3a041f22715b09f8e0e1737d40aa8d9d8c73dca2f1ea98bf2938d1\": container with ID starting with 9062233b3f3a041f22715b09f8e0e1737d40aa8d9d8c73dca2f1ea98bf2938d1 not found: ID does not exist" 
containerID="9062233b3f3a041f22715b09f8e0e1737d40aa8d9d8c73dca2f1ea98bf2938d1" Apr 16 15:13:47.789559 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:47.789513 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9062233b3f3a041f22715b09f8e0e1737d40aa8d9d8c73dca2f1ea98bf2938d1"} err="failed to get container status \"9062233b3f3a041f22715b09f8e0e1737d40aa8d9d8c73dca2f1ea98bf2938d1\": rpc error: code = NotFound desc = could not find container \"9062233b3f3a041f22715b09f8e0e1737d40aa8d9d8c73dca2f1ea98bf2938d1\": container with ID starting with 9062233b3f3a041f22715b09f8e0e1737d40aa8d9d8c73dca2f1ea98bf2938d1 not found: ID does not exist" Apr 16 15:13:47.789559 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:47.789530 2568 scope.go:117] "RemoveContainer" containerID="e6132ec10e81a358b30356d3d5d339cacb59ba4be985b27454197ebfc165d2ec" Apr 16 15:13:47.789759 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:13:47.789743 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6132ec10e81a358b30356d3d5d339cacb59ba4be985b27454197ebfc165d2ec\": container with ID starting with e6132ec10e81a358b30356d3d5d339cacb59ba4be985b27454197ebfc165d2ec not found: ID does not exist" containerID="e6132ec10e81a358b30356d3d5d339cacb59ba4be985b27454197ebfc165d2ec" Apr 16 15:13:47.789810 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:47.789762 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6132ec10e81a358b30356d3d5d339cacb59ba4be985b27454197ebfc165d2ec"} err="failed to get container status \"e6132ec10e81a358b30356d3d5d339cacb59ba4be985b27454197ebfc165d2ec\": rpc error: code = NotFound desc = could not find container \"e6132ec10e81a358b30356d3d5d339cacb59ba4be985b27454197ebfc165d2ec\": container with ID starting with e6132ec10e81a358b30356d3d5d339cacb59ba4be985b27454197ebfc165d2ec not found: ID does not exist" Apr 16 
15:13:47.789810 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:47.789773 2568 scope.go:117] "RemoveContainer" containerID="166647ac2ac1463882c30aa2c26747b8aac8226a98ec279ca49efc4b1fc43195" Apr 16 15:13:47.789970 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:13:47.789957 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"166647ac2ac1463882c30aa2c26747b8aac8226a98ec279ca49efc4b1fc43195\": container with ID starting with 166647ac2ac1463882c30aa2c26747b8aac8226a98ec279ca49efc4b1fc43195 not found: ID does not exist" containerID="166647ac2ac1463882c30aa2c26747b8aac8226a98ec279ca49efc4b1fc43195" Apr 16 15:13:47.790022 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:47.789973 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"166647ac2ac1463882c30aa2c26747b8aac8226a98ec279ca49efc4b1fc43195"} err="failed to get container status \"166647ac2ac1463882c30aa2c26747b8aac8226a98ec279ca49efc4b1fc43195\": rpc error: code = NotFound desc = could not find container \"166647ac2ac1463882c30aa2c26747b8aac8226a98ec279ca49efc4b1fc43195\": container with ID starting with 166647ac2ac1463882c30aa2c26747b8aac8226a98ec279ca49efc4b1fc43195 not found: ID does not exist" Apr 16 15:13:47.794116 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:47.794092 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c96544dkh2gl"] Apr 16 15:13:47.864525 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:13:47.864489 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01e92a1a-8568-4c4f-8889-b4278173f6a9" path="/var/lib/kubelet/pods/01e92a1a-8568-4c4f-8889-b4278173f6a9/volumes" Apr 16 15:14:00.652413 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:00.652377 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn"] 
Apr 16 15:14:00.652786 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:00.652773 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01e92a1a-8568-4c4f-8889-b4278173f6a9" containerName="tokenizer" Apr 16 15:14:00.652786 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:00.652786 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e92a1a-8568-4c4f-8889-b4278173f6a9" containerName="tokenizer" Apr 16 15:14:00.652865 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:00.652813 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01e92a1a-8568-4c4f-8889-b4278173f6a9" containerName="storage-initializer" Apr 16 15:14:00.652865 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:00.652819 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e92a1a-8568-4c4f-8889-b4278173f6a9" containerName="storage-initializer" Apr 16 15:14:00.652865 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:00.652835 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01e92a1a-8568-4c4f-8889-b4278173f6a9" containerName="main" Apr 16 15:14:00.652865 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:00.652840 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e92a1a-8568-4c4f-8889-b4278173f6a9" containerName="main" Apr 16 15:14:00.653015 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:00.652918 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="01e92a1a-8568-4c4f-8889-b4278173f6a9" containerName="main" Apr 16 15:14:00.653015 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:00.652932 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="01e92a1a-8568-4c4f-8889-b4278173f6a9" containerName="tokenizer" Apr 16 15:14:00.656653 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:00.656638 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn" Apr 16 15:14:00.659233 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:00.659205 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\"" Apr 16 15:14:00.659351 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:00.659205 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-epp-sa-dockercfg-rrvcl\"" Apr 16 15:14:00.664909 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:00.664873 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn"] Apr 16 15:14:00.770690 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:00.770665 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/36a06bc0-66b9-4415-ba14-d64fa8c49c42-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn\" (UID: \"36a06bc0-66b9-4415-ba14-d64fa8c49c42\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn" Apr 16 15:14:00.770821 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:00.770701 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/36a06bc0-66b9-4415-ba14-d64fa8c49c42-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn\" (UID: \"36a06bc0-66b9-4415-ba14-d64fa8c49c42\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn" Apr 16 15:14:00.770821 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:00.770726 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/36a06bc0-66b9-4415-ba14-d64fa8c49c42-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn\" (UID: \"36a06bc0-66b9-4415-ba14-d64fa8c49c42\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn" Apr 16 15:14:00.770924 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:00.770842 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/36a06bc0-66b9-4415-ba14-d64fa8c49c42-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn\" (UID: \"36a06bc0-66b9-4415-ba14-d64fa8c49c42\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn" Apr 16 15:14:00.770924 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:00.770913 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/36a06bc0-66b9-4415-ba14-d64fa8c49c42-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn\" (UID: \"36a06bc0-66b9-4415-ba14-d64fa8c49c42\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn" Apr 16 15:14:00.770998 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:00.770959 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rtdp\" (UniqueName: \"kubernetes.io/projected/36a06bc0-66b9-4415-ba14-d64fa8c49c42-kube-api-access-2rtdp\") pod \"router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn\" (UID: \"36a06bc0-66b9-4415-ba14-d64fa8c49c42\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn" Apr 16 15:14:00.871715 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:00.871691 2568 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kube-api-access-2rtdp\" (UniqueName: \"kubernetes.io/projected/36a06bc0-66b9-4415-ba14-d64fa8c49c42-kube-api-access-2rtdp\") pod \"router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn\" (UID: \"36a06bc0-66b9-4415-ba14-d64fa8c49c42\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn" Apr 16 15:14:00.871840 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:00.871734 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/36a06bc0-66b9-4415-ba14-d64fa8c49c42-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn\" (UID: \"36a06bc0-66b9-4415-ba14-d64fa8c49c42\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn" Apr 16 15:14:00.871840 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:00.871755 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/36a06bc0-66b9-4415-ba14-d64fa8c49c42-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn\" (UID: \"36a06bc0-66b9-4415-ba14-d64fa8c49c42\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn" Apr 16 15:14:00.871840 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:00.871781 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/36a06bc0-66b9-4415-ba14-d64fa8c49c42-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn\" (UID: \"36a06bc0-66b9-4415-ba14-d64fa8c49c42\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn" Apr 16 15:14:00.871840 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:00.871811 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" 
(UniqueName: \"kubernetes.io/secret/36a06bc0-66b9-4415-ba14-d64fa8c49c42-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn\" (UID: \"36a06bc0-66b9-4415-ba14-d64fa8c49c42\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn" Apr 16 15:14:00.872097 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:00.871840 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/36a06bc0-66b9-4415-ba14-d64fa8c49c42-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn\" (UID: \"36a06bc0-66b9-4415-ba14-d64fa8c49c42\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn" Apr 16 15:14:00.872159 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:00.872142 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/36a06bc0-66b9-4415-ba14-d64fa8c49c42-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn\" (UID: \"36a06bc0-66b9-4415-ba14-d64fa8c49c42\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn" Apr 16 15:14:00.872215 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:00.872174 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/36a06bc0-66b9-4415-ba14-d64fa8c49c42-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn\" (UID: \"36a06bc0-66b9-4415-ba14-d64fa8c49c42\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn" Apr 16 15:14:00.872275 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:00.872213 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/36a06bc0-66b9-4415-ba14-d64fa8c49c42-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn\" (UID: \"36a06bc0-66b9-4415-ba14-d64fa8c49c42\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn" Apr 16 15:14:00.872275 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:00.872247 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/36a06bc0-66b9-4415-ba14-d64fa8c49c42-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn\" (UID: \"36a06bc0-66b9-4415-ba14-d64fa8c49c42\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn" Apr 16 15:14:00.874397 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:00.874372 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/36a06bc0-66b9-4415-ba14-d64fa8c49c42-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn\" (UID: \"36a06bc0-66b9-4415-ba14-d64fa8c49c42\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn" Apr 16 15:14:00.879383 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:00.879362 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rtdp\" (UniqueName: \"kubernetes.io/projected/36a06bc0-66b9-4415-ba14-d64fa8c49c42-kube-api-access-2rtdp\") pod \"router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn\" (UID: \"36a06bc0-66b9-4415-ba14-d64fa8c49c42\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn" Apr 16 15:14:00.967352 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:00.967293 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn" Apr 16 15:14:01.100180 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:01.100153 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn"] Apr 16 15:14:01.100671 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:14:01.100643 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36a06bc0_66b9_4415_ba14_d64fa8c49c42.slice/crio-63d10006b9e1ff295a1d567b6aecfec24c7978217d8dcc606eb330d19f243030 WatchSource:0}: Error finding container 63d10006b9e1ff295a1d567b6aecfec24c7978217d8dcc606eb330d19f243030: Status 404 returned error can't find the container with id 63d10006b9e1ff295a1d567b6aecfec24c7978217d8dcc606eb330d19f243030 Apr 16 15:14:01.102560 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:01.102545 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:14:01.817740 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:01.817709 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn" event={"ID":"36a06bc0-66b9-4415-ba14-d64fa8c49c42","Type":"ContainerStarted","Data":"da9365dacb3034d136f79bc511483e11cb3b5d0bac917c8c29d4e80c479eba0c"} Apr 16 15:14:01.817740 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:01.817745 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn" event={"ID":"36a06bc0-66b9-4415-ba14-d64fa8c49c42","Type":"ContainerStarted","Data":"63d10006b9e1ff295a1d567b6aecfec24c7978217d8dcc606eb330d19f243030"} Apr 16 15:14:02.828179 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:02.828136 2568 generic.go:358] "Generic (PLEG): container finished" 
podID="36a06bc0-66b9-4415-ba14-d64fa8c49c42" containerID="da9365dacb3034d136f79bc511483e11cb3b5d0bac917c8c29d4e80c479eba0c" exitCode=0 Apr 16 15:14:02.828630 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:02.828225 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn" event={"ID":"36a06bc0-66b9-4415-ba14-d64fa8c49c42","Type":"ContainerDied","Data":"da9365dacb3034d136f79bc511483e11cb3b5d0bac917c8c29d4e80c479eba0c"} Apr 16 15:14:03.835640 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:03.835602 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn" event={"ID":"36a06bc0-66b9-4415-ba14-d64fa8c49c42","Type":"ContainerStarted","Data":"ebecaf080dd919f6e0a3d8a409f6e86a2f5a63dcfee500167f9487dae1dc50b6"} Apr 16 15:14:03.835640 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:03.835644 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn" event={"ID":"36a06bc0-66b9-4415-ba14-d64fa8c49c42","Type":"ContainerStarted","Data":"1c447788d74b7da0802396dfe5f3f2f4a8236049b77d36e1416aeb78a1738713"} Apr 16 15:14:03.836269 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:03.835700 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn" Apr 16 15:14:03.857383 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:03.857345 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn" podStartSLOduration=3.857333586 podStartE2EDuration="3.857333586s" podCreationTimestamp="2026-04-16 15:14:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-16 15:14:03.853396517 +0000 UTC m=+1282.617348630" watchObservedRunningTime="2026-04-16 15:14:03.857333586 +0000 UTC m=+1282.621285696" Apr 16 15:14:10.968328 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:10.968293 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn" Apr 16 15:14:10.968328 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:10.968331 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn" Apr 16 15:14:10.971112 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:10.971082 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn" Apr 16 15:14:11.884322 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:11.884288 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn" Apr 16 15:14:32.889074 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:14:32.889045 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn" Apr 16 15:15:06.847518 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:06.847444 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b"] Apr 16 15:15:06.848001 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:06.847803 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b" podUID="8adce857-bb39-4ce5-a753-be8fed9bc564" containerName="main" containerID="cri-o://0d0bdf5426a9c6800eac8022cf3bea6c25c45b5b17b6f7bf68b582babd4e02e4" 
gracePeriod=30 Apr 16 15:15:06.848001 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:06.847918 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b" podUID="8adce857-bb39-4ce5-a753-be8fed9bc564" containerName="tokenizer" containerID="cri-o://5582e0c1b4ff7a25e0eb11ffd7397543f6a836c882bf4a1934534a21b54ae666" gracePeriod=30 Apr 16 15:15:07.120579 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:07.120494 2568 generic.go:358] "Generic (PLEG): container finished" podID="8adce857-bb39-4ce5-a753-be8fed9bc564" containerID="0d0bdf5426a9c6800eac8022cf3bea6c25c45b5b17b6f7bf68b582babd4e02e4" exitCode=0 Apr 16 15:15:07.120579 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:07.120542 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b" event={"ID":"8adce857-bb39-4ce5-a753-be8fed9bc564","Type":"ContainerDied","Data":"0d0bdf5426a9c6800eac8022cf3bea6c25c45b5b17b6f7bf68b582babd4e02e4"} Apr 16 15:15:07.672864 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:15:07.672833 2568 logging.go:55] [core] [Channel #305 SubChannel #306]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.58:9003", ServerName: "10.133.0.58:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.58:9003: connect: connection refused" Apr 16 15:15:08.023442 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:08.023418 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b" Apr 16 15:15:08.121652 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:08.121629 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8adce857-bb39-4ce5-a753-be8fed9bc564-tls-certs\") pod \"8adce857-bb39-4ce5-a753-be8fed9bc564\" (UID: \"8adce857-bb39-4ce5-a753-be8fed9bc564\") " Apr 16 15:15:08.121769 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:08.121662 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8adce857-bb39-4ce5-a753-be8fed9bc564-tokenizer-uds\") pod \"8adce857-bb39-4ce5-a753-be8fed9bc564\" (UID: \"8adce857-bb39-4ce5-a753-be8fed9bc564\") " Apr 16 15:15:08.121769 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:08.121722 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-px644\" (UniqueName: \"kubernetes.io/projected/8adce857-bb39-4ce5-a753-be8fed9bc564-kube-api-access-px644\") pod \"8adce857-bb39-4ce5-a753-be8fed9bc564\" (UID: \"8adce857-bb39-4ce5-a753-be8fed9bc564\") " Apr 16 15:15:08.121769 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:08.121757 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8adce857-bb39-4ce5-a753-be8fed9bc564-tokenizer-cache\") pod \"8adce857-bb39-4ce5-a753-be8fed9bc564\" (UID: \"8adce857-bb39-4ce5-a753-be8fed9bc564\") " Apr 16 15:15:08.121955 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:08.121798 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8adce857-bb39-4ce5-a753-be8fed9bc564-tokenizer-tmp\") pod \"8adce857-bb39-4ce5-a753-be8fed9bc564\" (UID: \"8adce857-bb39-4ce5-a753-be8fed9bc564\") " Apr 16 15:15:08.121955 
ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:08.121862 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8adce857-bb39-4ce5-a753-be8fed9bc564-kserve-provision-location\") pod \"8adce857-bb39-4ce5-a753-be8fed9bc564\" (UID: \"8adce857-bb39-4ce5-a753-be8fed9bc564\") " Apr 16 15:15:08.122185 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:08.122117 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8adce857-bb39-4ce5-a753-be8fed9bc564-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "8adce857-bb39-4ce5-a753-be8fed9bc564" (UID: "8adce857-bb39-4ce5-a753-be8fed9bc564"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:15:08.122944 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:08.122194 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8adce857-bb39-4ce5-a753-be8fed9bc564-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "8adce857-bb39-4ce5-a753-be8fed9bc564" (UID: "8adce857-bb39-4ce5-a753-be8fed9bc564"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:15:08.122944 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:08.122742 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8adce857-bb39-4ce5-a753-be8fed9bc564-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "8adce857-bb39-4ce5-a753-be8fed9bc564" (UID: "8adce857-bb39-4ce5-a753-be8fed9bc564"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:15:08.122944 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:08.122780 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8adce857-bb39-4ce5-a753-be8fed9bc564-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8adce857-bb39-4ce5-a753-be8fed9bc564" (UID: "8adce857-bb39-4ce5-a753-be8fed9bc564"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:15:08.125244 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:08.125074 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8adce857-bb39-4ce5-a753-be8fed9bc564-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "8adce857-bb39-4ce5-a753-be8fed9bc564" (UID: "8adce857-bb39-4ce5-a753-be8fed9bc564"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:15:08.125244 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:08.125141 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8adce857-bb39-4ce5-a753-be8fed9bc564-kube-api-access-px644" (OuterVolumeSpecName: "kube-api-access-px644") pod "8adce857-bb39-4ce5-a753-be8fed9bc564" (UID: "8adce857-bb39-4ce5-a753-be8fed9bc564"). InnerVolumeSpecName "kube-api-access-px644". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:15:08.127422 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:08.127392 2568 generic.go:358] "Generic (PLEG): container finished" podID="8adce857-bb39-4ce5-a753-be8fed9bc564" containerID="5582e0c1b4ff7a25e0eb11ffd7397543f6a836c882bf4a1934534a21b54ae666" exitCode=0 Apr 16 15:15:08.127534 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:08.127440 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b" event={"ID":"8adce857-bb39-4ce5-a753-be8fed9bc564","Type":"ContainerDied","Data":"5582e0c1b4ff7a25e0eb11ffd7397543f6a836c882bf4a1934534a21b54ae666"} Apr 16 15:15:08.127534 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:08.127469 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b" event={"ID":"8adce857-bb39-4ce5-a753-be8fed9bc564","Type":"ContainerDied","Data":"23ecbe13c27acad5d3ceac3c8b727540ad9f977c56bf0e3d8ef7eb8fc3f1c858"} Apr 16 15:15:08.127534 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:08.127478 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b" Apr 16 15:15:08.127534 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:08.127489 2568 scope.go:117] "RemoveContainer" containerID="5582e0c1b4ff7a25e0eb11ffd7397543f6a836c882bf4a1934534a21b54ae666" Apr 16 15:15:08.145643 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:08.145619 2568 scope.go:117] "RemoveContainer" containerID="0d0bdf5426a9c6800eac8022cf3bea6c25c45b5b17b6f7bf68b582babd4e02e4" Apr 16 15:15:08.157089 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:08.157069 2568 scope.go:117] "RemoveContainer" containerID="79cd470d027e199f0f5982d50909ae89c375d3c8d15e6ad65b5bfb35f1b546db" Apr 16 15:15:08.158763 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:08.158334 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b"] Apr 16 15:15:08.162589 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:08.162569 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b"] Apr 16 15:15:08.165298 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:08.165273 2568 scope.go:117] "RemoveContainer" containerID="5582e0c1b4ff7a25e0eb11ffd7397543f6a836c882bf4a1934534a21b54ae666" Apr 16 15:15:08.165545 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:15:08.165525 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5582e0c1b4ff7a25e0eb11ffd7397543f6a836c882bf4a1934534a21b54ae666\": container with ID starting with 5582e0c1b4ff7a25e0eb11ffd7397543f6a836c882bf4a1934534a21b54ae666 not found: ID does not exist" containerID="5582e0c1b4ff7a25e0eb11ffd7397543f6a836c882bf4a1934534a21b54ae666" Apr 16 15:15:08.165592 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:08.165555 2568 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5582e0c1b4ff7a25e0eb11ffd7397543f6a836c882bf4a1934534a21b54ae666"} err="failed to get container status \"5582e0c1b4ff7a25e0eb11ffd7397543f6a836c882bf4a1934534a21b54ae666\": rpc error: code = NotFound desc = could not find container \"5582e0c1b4ff7a25e0eb11ffd7397543f6a836c882bf4a1934534a21b54ae666\": container with ID starting with 5582e0c1b4ff7a25e0eb11ffd7397543f6a836c882bf4a1934534a21b54ae666 not found: ID does not exist"
Apr 16 15:15:08.165592 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:08.165573 2568 scope.go:117] "RemoveContainer" containerID="0d0bdf5426a9c6800eac8022cf3bea6c25c45b5b17b6f7bf68b582babd4e02e4"
Apr 16 15:15:08.165812 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:15:08.165787 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d0bdf5426a9c6800eac8022cf3bea6c25c45b5b17b6f7bf68b582babd4e02e4\": container with ID starting with 0d0bdf5426a9c6800eac8022cf3bea6c25c45b5b17b6f7bf68b582babd4e02e4 not found: ID does not exist" containerID="0d0bdf5426a9c6800eac8022cf3bea6c25c45b5b17b6f7bf68b582babd4e02e4"
Apr 16 15:15:08.165863 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:08.165816 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d0bdf5426a9c6800eac8022cf3bea6c25c45b5b17b6f7bf68b582babd4e02e4"} err="failed to get container status \"0d0bdf5426a9c6800eac8022cf3bea6c25c45b5b17b6f7bf68b582babd4e02e4\": rpc error: code = NotFound desc = could not find container \"0d0bdf5426a9c6800eac8022cf3bea6c25c45b5b17b6f7bf68b582babd4e02e4\": container with ID starting with 0d0bdf5426a9c6800eac8022cf3bea6c25c45b5b17b6f7bf68b582babd4e02e4 not found: ID does not exist"
Apr 16 15:15:08.165863 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:08.165832 2568 scope.go:117] "RemoveContainer" containerID="79cd470d027e199f0f5982d50909ae89c375d3c8d15e6ad65b5bfb35f1b546db"
Apr 16 15:15:08.166072 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:15:08.166057 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79cd470d027e199f0f5982d50909ae89c375d3c8d15e6ad65b5bfb35f1b546db\": container with ID starting with 79cd470d027e199f0f5982d50909ae89c375d3c8d15e6ad65b5bfb35f1b546db not found: ID does not exist" containerID="79cd470d027e199f0f5982d50909ae89c375d3c8d15e6ad65b5bfb35f1b546db"
Apr 16 15:15:08.166116 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:08.166076 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79cd470d027e199f0f5982d50909ae89c375d3c8d15e6ad65b5bfb35f1b546db"} err="failed to get container status \"79cd470d027e199f0f5982d50909ae89c375d3c8d15e6ad65b5bfb35f1b546db\": rpc error: code = NotFound desc = could not find container \"79cd470d027e199f0f5982d50909ae89c375d3c8d15e6ad65b5bfb35f1b546db\": container with ID starting with 79cd470d027e199f0f5982d50909ae89c375d3c8d15e6ad65b5bfb35f1b546db not found: ID does not exist"
Apr 16 15:15:08.222785 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:08.222753 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8adce857-bb39-4ce5-a753-be8fed9bc564-tokenizer-cache\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\""
Apr 16 15:15:08.222785 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:08.222783 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8adce857-bb39-4ce5-a753-be8fed9bc564-tokenizer-tmp\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\""
Apr 16 15:15:08.222785 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:08.222793 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8adce857-bb39-4ce5-a753-be8fed9bc564-kserve-provision-location\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\""
Apr 16 15:15:08.222973 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:08.222802 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8adce857-bb39-4ce5-a753-be8fed9bc564-tls-certs\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\""
Apr 16 15:15:08.222973 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:08.222811 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8adce857-bb39-4ce5-a753-be8fed9bc564-tokenizer-uds\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\""
Apr 16 15:15:08.222973 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:08.222819 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-px644\" (UniqueName: \"kubernetes.io/projected/8adce857-bb39-4ce5-a753-be8fed9bc564-kube-api-access-px644\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\""
Apr 16 15:15:08.672964 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:08.672925 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5fdcbc6c9-tkm2b" podUID="8adce857-bb39-4ce5-a753-be8fed9bc564" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.58:9003\" within 1s: context deadline exceeded"
Apr 16 15:15:09.864800 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:09.864766 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8adce857-bb39-4ce5-a753-be8fed9bc564" path="/var/lib/kubelet/pods/8adce857-bb39-4ce5-a753-be8fed9bc564/volumes"
Apr 16 15:15:44.262376 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:44.262343 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-79498b55f8-rdp2j"]
Apr 16 15:15:44.262788 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:44.262656 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/llmisvc-controller-manager-79498b55f8-rdp2j" podUID="8b950dd9-f975-496f-a8b7-00ad2c5c54b2" containerName="manager" containerID="cri-o://e1e2de648d9c4918e56d85d9f06ded0ac88a47e09efc753e27c4943c6da8a8c9" gracePeriod=30
Apr 16 15:15:44.518750 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:44.518696 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-79498b55f8-rdp2j"
Apr 16 15:15:44.612980 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:44.612951 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vrxk\" (UniqueName: \"kubernetes.io/projected/8b950dd9-f975-496f-a8b7-00ad2c5c54b2-kube-api-access-6vrxk\") pod \"8b950dd9-f975-496f-a8b7-00ad2c5c54b2\" (UID: \"8b950dd9-f975-496f-a8b7-00ad2c5c54b2\") "
Apr 16 15:15:44.613123 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:44.613011 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b950dd9-f975-496f-a8b7-00ad2c5c54b2-cert\") pod \"8b950dd9-f975-496f-a8b7-00ad2c5c54b2\" (UID: \"8b950dd9-f975-496f-a8b7-00ad2c5c54b2\") "
Apr 16 15:15:44.615855 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:44.615816 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b950dd9-f975-496f-a8b7-00ad2c5c54b2-cert" (OuterVolumeSpecName: "cert") pod "8b950dd9-f975-496f-a8b7-00ad2c5c54b2" (UID: "8b950dd9-f975-496f-a8b7-00ad2c5c54b2"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 15:15:44.615855 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:44.615824 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b950dd9-f975-496f-a8b7-00ad2c5c54b2-kube-api-access-6vrxk" (OuterVolumeSpecName: "kube-api-access-6vrxk") pod "8b950dd9-f975-496f-a8b7-00ad2c5c54b2" (UID: "8b950dd9-f975-496f-a8b7-00ad2c5c54b2"). InnerVolumeSpecName "kube-api-access-6vrxk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 15:15:44.714107 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:44.714079 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6vrxk\" (UniqueName: \"kubernetes.io/projected/8b950dd9-f975-496f-a8b7-00ad2c5c54b2-kube-api-access-6vrxk\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\""
Apr 16 15:15:44.714107 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:44.714105 2568 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b950dd9-f975-496f-a8b7-00ad2c5c54b2-cert\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\""
Apr 16 15:15:45.280257 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:45.280224 2568 generic.go:358] "Generic (PLEG): container finished" podID="8b950dd9-f975-496f-a8b7-00ad2c5c54b2" containerID="e1e2de648d9c4918e56d85d9f06ded0ac88a47e09efc753e27c4943c6da8a8c9" exitCode=0
Apr 16 15:15:45.280693 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:45.280288 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-79498b55f8-rdp2j"
Apr 16 15:15:45.280693 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:45.280309 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-79498b55f8-rdp2j" event={"ID":"8b950dd9-f975-496f-a8b7-00ad2c5c54b2","Type":"ContainerDied","Data":"e1e2de648d9c4918e56d85d9f06ded0ac88a47e09efc753e27c4943c6da8a8c9"}
Apr 16 15:15:45.280693 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:45.280350 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-79498b55f8-rdp2j" event={"ID":"8b950dd9-f975-496f-a8b7-00ad2c5c54b2","Type":"ContainerDied","Data":"362553bd14b06d93b8195830690169dcfb3a4325e76276552d2470ca965fc188"}
Apr 16 15:15:45.280693 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:45.280367 2568 scope.go:117] "RemoveContainer" containerID="e1e2de648d9c4918e56d85d9f06ded0ac88a47e09efc753e27c4943c6da8a8c9"
Apr 16 15:15:45.290556 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:45.290538 2568 scope.go:117] "RemoveContainer" containerID="e1e2de648d9c4918e56d85d9f06ded0ac88a47e09efc753e27c4943c6da8a8c9"
Apr 16 15:15:45.290835 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:15:45.290809 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1e2de648d9c4918e56d85d9f06ded0ac88a47e09efc753e27c4943c6da8a8c9\": container with ID starting with e1e2de648d9c4918e56d85d9f06ded0ac88a47e09efc753e27c4943c6da8a8c9 not found: ID does not exist" containerID="e1e2de648d9c4918e56d85d9f06ded0ac88a47e09efc753e27c4943c6da8a8c9"
Apr 16 15:15:45.290883 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:45.290848 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1e2de648d9c4918e56d85d9f06ded0ac88a47e09efc753e27c4943c6da8a8c9"} err="failed to get container status \"e1e2de648d9c4918e56d85d9f06ded0ac88a47e09efc753e27c4943c6da8a8c9\": rpc error: code = NotFound desc = could not find container \"e1e2de648d9c4918e56d85d9f06ded0ac88a47e09efc753e27c4943c6da8a8c9\": container with ID starting with e1e2de648d9c4918e56d85d9f06ded0ac88a47e09efc753e27c4943c6da8a8c9 not found: ID does not exist"
Apr 16 15:15:45.301575 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:45.301551 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-79498b55f8-rdp2j"]
Apr 16 15:15:45.304592 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:45.304572 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/llmisvc-controller-manager-79498b55f8-rdp2j"]
Apr 16 15:15:45.863990 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:15:45.863955 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b950dd9-f975-496f-a8b7-00ad2c5c54b2" path="/var/lib/kubelet/pods/8b950dd9-f975-496f-a8b7-00ad2c5c54b2/volumes"
Apr 16 15:16:10.369231 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:10.369196 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm"]
Apr 16 15:16:10.369672 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:10.369656 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8adce857-bb39-4ce5-a753-be8fed9bc564" containerName="tokenizer"
Apr 16 15:16:10.369729 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:10.369674 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="8adce857-bb39-4ce5-a753-be8fed9bc564" containerName="tokenizer"
Apr 16 15:16:10.369729 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:10.369685 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8b950dd9-f975-496f-a8b7-00ad2c5c54b2" containerName="manager"
Apr 16 15:16:10.369729 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:10.369693 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b950dd9-f975-496f-a8b7-00ad2c5c54b2" containerName="manager"
Apr 16 15:16:10.369729 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:10.369712 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8adce857-bb39-4ce5-a753-be8fed9bc564" containerName="storage-initializer"
Apr 16 15:16:10.369729 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:10.369718 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="8adce857-bb39-4ce5-a753-be8fed9bc564" containerName="storage-initializer"
Apr 16 15:16:10.369729 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:10.369724 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8adce857-bb39-4ce5-a753-be8fed9bc564" containerName="main"
Apr 16 15:16:10.369729 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:10.369729 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="8adce857-bb39-4ce5-a753-be8fed9bc564" containerName="main"
Apr 16 15:16:10.370026 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:10.369810 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="8b950dd9-f975-496f-a8b7-00ad2c5c54b2" containerName="manager"
Apr 16 15:16:10.370026 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:10.369820 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="8adce857-bb39-4ce5-a753-be8fed9bc564" containerName="tokenizer"
Apr 16 15:16:10.370026 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:10.369828 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="8adce857-bb39-4ce5-a753-be8fed9bc564" containerName="main"
Apr 16 15:16:10.373336 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:10.373313 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm"
Apr 16 15:16:10.375908 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:10.375872 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-epp-sa-dockercfg-sjcw7\""
Apr 16 15:16:10.376017 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:10.375965 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\""
Apr 16 15:16:10.383241 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:10.383102 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm"]
Apr 16 15:16:10.519365 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:10.519340 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrds4\" (UniqueName: \"kubernetes.io/projected/610244a3-15e9-45c2-b533-2a700959d598-kube-api-access-nrds4\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm\" (UID: \"610244a3-15e9-45c2-b533-2a700959d598\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm"
Apr 16 15:16:10.519534 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:10.519389 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/610244a3-15e9-45c2-b533-2a700959d598-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm\" (UID: \"610244a3-15e9-45c2-b533-2a700959d598\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm"
Apr 16 15:16:10.519534 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:10.519456 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/610244a3-15e9-45c2-b533-2a700959d598-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm\" (UID: \"610244a3-15e9-45c2-b533-2a700959d598\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm"
Apr 16 15:16:10.519534 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:10.519480 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/610244a3-15e9-45c2-b533-2a700959d598-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm\" (UID: \"610244a3-15e9-45c2-b533-2a700959d598\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm"
Apr 16 15:16:10.519534 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:10.519509 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/610244a3-15e9-45c2-b533-2a700959d598-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm\" (UID: \"610244a3-15e9-45c2-b533-2a700959d598\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm"
Apr 16 15:16:10.519534 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:10.519533 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/610244a3-15e9-45c2-b533-2a700959d598-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm\" (UID: \"610244a3-15e9-45c2-b533-2a700959d598\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm"
Apr 16 15:16:10.619966 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:10.619885 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/610244a3-15e9-45c2-b533-2a700959d598-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm\" (UID: \"610244a3-15e9-45c2-b533-2a700959d598\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm"
Apr 16 15:16:10.620075 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:10.620052 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/610244a3-15e9-45c2-b533-2a700959d598-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm\" (UID: \"610244a3-15e9-45c2-b533-2a700959d598\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm"
Apr 16 15:16:10.620125 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:10.620103 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/610244a3-15e9-45c2-b533-2a700959d598-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm\" (UID: \"610244a3-15e9-45c2-b533-2a700959d598\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm"
Apr 16 15:16:10.620192 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:10.620152 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/610244a3-15e9-45c2-b533-2a700959d598-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm\" (UID: \"610244a3-15e9-45c2-b533-2a700959d598\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm"
Apr 16 15:16:10.620249 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:10.620198 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/610244a3-15e9-45c2-b533-2a700959d598-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm\" (UID: \"610244a3-15e9-45c2-b533-2a700959d598\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm"
Apr 16 15:16:10.620249 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:10.620231 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nrds4\" (UniqueName: \"kubernetes.io/projected/610244a3-15e9-45c2-b533-2a700959d598-kube-api-access-nrds4\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm\" (UID: \"610244a3-15e9-45c2-b533-2a700959d598\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm"
Apr 16 15:16:10.620449 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:10.620425 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/610244a3-15e9-45c2-b533-2a700959d598-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm\" (UID: \"610244a3-15e9-45c2-b533-2a700959d598\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm"
Apr 16 15:16:10.620449 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:10.620439 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/610244a3-15e9-45c2-b533-2a700959d598-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm\" (UID: \"610244a3-15e9-45c2-b533-2a700959d598\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm"
Apr 16 15:16:10.620548 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:10.620507 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/610244a3-15e9-45c2-b533-2a700959d598-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm\" (UID: \"610244a3-15e9-45c2-b533-2a700959d598\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm"
Apr 16 15:16:10.620587 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:10.620543 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/610244a3-15e9-45c2-b533-2a700959d598-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm\" (UID: \"610244a3-15e9-45c2-b533-2a700959d598\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm"
Apr 16 15:16:10.622381 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:10.622357 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/610244a3-15e9-45c2-b533-2a700959d598-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm\" (UID: \"610244a3-15e9-45c2-b533-2a700959d598\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm"
Apr 16 15:16:10.628828 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:10.628807 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrds4\" (UniqueName: \"kubernetes.io/projected/610244a3-15e9-45c2-b533-2a700959d598-kube-api-access-nrds4\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm\" (UID: \"610244a3-15e9-45c2-b533-2a700959d598\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm"
Apr 16 15:16:10.685883 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:10.685862 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm"
Apr 16 15:16:11.020762 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:11.020736 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm"]
Apr 16 15:16:11.022746 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:16:11.022713 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod610244a3_15e9_45c2_b533_2a700959d598.slice/crio-d4f60e244400a864e099af98b7c5c747276f787057575764955a48c7623b32e6 WatchSource:0}: Error finding container d4f60e244400a864e099af98b7c5c747276f787057575764955a48c7623b32e6: Status 404 returned error can't find the container with id d4f60e244400a864e099af98b7c5c747276f787057575764955a48c7623b32e6
Apr 16 15:16:11.399547 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:11.398921 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm" event={"ID":"610244a3-15e9-45c2-b533-2a700959d598","Type":"ContainerStarted","Data":"37642ef40fb3c2ad234f9c86ffb6bed2cc7a58af5e52e661db6d6b6b05332964"}
Apr 16 15:16:11.399547 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:11.398966 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm" event={"ID":"610244a3-15e9-45c2-b533-2a700959d598","Type":"ContainerStarted","Data":"d4f60e244400a864e099af98b7c5c747276f787057575764955a48c7623b32e6"}
Apr 16 15:16:12.132857 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:12.132826 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn"]
Apr 16 15:16:12.133200 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:12.133172 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn" podUID="36a06bc0-66b9-4415-ba14-d64fa8c49c42" containerName="main" containerID="cri-o://1c447788d74b7da0802396dfe5f3f2f4a8236049b77d36e1416aeb78a1738713" gracePeriod=30
Apr 16 15:16:12.133314 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:12.133204 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn" podUID="36a06bc0-66b9-4415-ba14-d64fa8c49c42" containerName="tokenizer" containerID="cri-o://ebecaf080dd919f6e0a3d8a409f6e86a2f5a63dcfee500167f9487dae1dc50b6" gracePeriod=30
Apr 16 15:16:12.406505 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:12.406117 2568 generic.go:358] "Generic (PLEG): container finished" podID="610244a3-15e9-45c2-b533-2a700959d598" containerID="37642ef40fb3c2ad234f9c86ffb6bed2cc7a58af5e52e661db6d6b6b05332964" exitCode=0
Apr 16 15:16:12.406505 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:12.406274 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm" event={"ID":"610244a3-15e9-45c2-b533-2a700959d598","Type":"ContainerDied","Data":"37642ef40fb3c2ad234f9c86ffb6bed2cc7a58af5e52e661db6d6b6b05332964"}
Apr 16 15:16:12.410572 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:12.410539 2568 generic.go:358] "Generic (PLEG): container finished" podID="36a06bc0-66b9-4415-ba14-d64fa8c49c42" containerID="1c447788d74b7da0802396dfe5f3f2f4a8236049b77d36e1416aeb78a1738713" exitCode=0
Apr 16 15:16:12.410678 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:12.410585 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn" event={"ID":"36a06bc0-66b9-4415-ba14-d64fa8c49c42","Type":"ContainerDied","Data":"1c447788d74b7da0802396dfe5f3f2f4a8236049b77d36e1416aeb78a1738713"}
Apr 16 15:16:12.888789 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:16:12.888753 2568 logging.go:55] [core] [Channel #346 SubChannel #347]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.59:9003", ServerName: "10.133.0.59:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.59:9003: connect: connection refused"
Apr 16 15:16:13.417107 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:13.417071 2568 generic.go:358] "Generic (PLEG): container finished" podID="36a06bc0-66b9-4415-ba14-d64fa8c49c42" containerID="ebecaf080dd919f6e0a3d8a409f6e86a2f5a63dcfee500167f9487dae1dc50b6" exitCode=0
Apr 16 15:16:13.417476 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:13.417136 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn" event={"ID":"36a06bc0-66b9-4415-ba14-d64fa8c49c42","Type":"ContainerDied","Data":"ebecaf080dd919f6e0a3d8a409f6e86a2f5a63dcfee500167f9487dae1dc50b6"}
Apr 16 15:16:13.417476 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:13.417189 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn" event={"ID":"36a06bc0-66b9-4415-ba14-d64fa8c49c42","Type":"ContainerDied","Data":"63d10006b9e1ff295a1d567b6aecfec24c7978217d8dcc606eb330d19f243030"}
Apr 16 15:16:13.417476 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:13.417206 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63d10006b9e1ff295a1d567b6aecfec24c7978217d8dcc606eb330d19f243030"
Apr 16 15:16:13.419421 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:13.419398 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm" event={"ID":"610244a3-15e9-45c2-b533-2a700959d598","Type":"ContainerStarted","Data":"991e5e1309af8e5d8e41998bc49a65f99b8ad283be542df243df3c2cbb47e339"}
Apr 16 15:16:13.419581 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:13.419562 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm" event={"ID":"610244a3-15e9-45c2-b533-2a700959d598","Type":"ContainerStarted","Data":"ef7e43bca915efb7bdcb7f621cb13411e586d492d2ca35ddc886cc7844452cc2"}
Apr 16 15:16:13.419706 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:13.419673 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm"
Apr 16 15:16:13.421048 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:13.421034 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn"
Apr 16 15:16:13.441057 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:13.441019 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm" podStartSLOduration=3.441007046 podStartE2EDuration="3.441007046s" podCreationTimestamp="2026-04-16 15:16:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:16:13.438162578 +0000 UTC m=+1412.202114688" watchObservedRunningTime="2026-04-16 15:16:13.441007046 +0000 UTC m=+1412.204959157"
Apr 16 15:16:13.446168 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:13.446149 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/36a06bc0-66b9-4415-ba14-d64fa8c49c42-tls-certs\") pod \"36a06bc0-66b9-4415-ba14-d64fa8c49c42\" (UID: \"36a06bc0-66b9-4415-ba14-d64fa8c49c42\") "
Apr 16 15:16:13.446263 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:13.446193 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/36a06bc0-66b9-4415-ba14-d64fa8c49c42-tokenizer-cache\") pod \"36a06bc0-66b9-4415-ba14-d64fa8c49c42\" (UID: \"36a06bc0-66b9-4415-ba14-d64fa8c49c42\") "
Apr 16 15:16:13.446327 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:13.446301 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rtdp\" (UniqueName: \"kubernetes.io/projected/36a06bc0-66b9-4415-ba14-d64fa8c49c42-kube-api-access-2rtdp\") pod \"36a06bc0-66b9-4415-ba14-d64fa8c49c42\" (UID: \"36a06bc0-66b9-4415-ba14-d64fa8c49c42\") "
Apr 16 15:16:13.446427 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:13.446410 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/36a06bc0-66b9-4415-ba14-d64fa8c49c42-tokenizer-uds\") pod \"36a06bc0-66b9-4415-ba14-d64fa8c49c42\" (UID: \"36a06bc0-66b9-4415-ba14-d64fa8c49c42\") "
Apr 16 15:16:13.446503 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:13.446433 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/36a06bc0-66b9-4415-ba14-d64fa8c49c42-tokenizer-tmp\") pod \"36a06bc0-66b9-4415-ba14-d64fa8c49c42\" (UID: \"36a06bc0-66b9-4415-ba14-d64fa8c49c42\") "
Apr 16 15:16:13.446503 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:13.446437 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36a06bc0-66b9-4415-ba14-d64fa8c49c42-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "36a06bc0-66b9-4415-ba14-d64fa8c49c42" (UID: "36a06bc0-66b9-4415-ba14-d64fa8c49c42"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:16:13.446503 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:13.446453 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/36a06bc0-66b9-4415-ba14-d64fa8c49c42-kserve-provision-location\") pod \"36a06bc0-66b9-4415-ba14-d64fa8c49c42\" (UID: \"36a06bc0-66b9-4415-ba14-d64fa8c49c42\") "
Apr 16 15:16:13.446672 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:13.446608 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36a06bc0-66b9-4415-ba14-d64fa8c49c42-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "36a06bc0-66b9-4415-ba14-d64fa8c49c42" (UID: "36a06bc0-66b9-4415-ba14-d64fa8c49c42"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:16:13.446783 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:13.446759 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36a06bc0-66b9-4415-ba14-d64fa8c49c42-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "36a06bc0-66b9-4415-ba14-d64fa8c49c42" (UID: "36a06bc0-66b9-4415-ba14-d64fa8c49c42"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:16:13.446931 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:13.446771 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/36a06bc0-66b9-4415-ba14-d64fa8c49c42-tokenizer-cache\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\""
Apr 16 15:16:13.446931 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:13.446799 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/36a06bc0-66b9-4415-ba14-d64fa8c49c42-tokenizer-uds\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\""
Apr 16 15:16:13.447259 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:13.447236 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36a06bc0-66b9-4415-ba14-d64fa8c49c42-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "36a06bc0-66b9-4415-ba14-d64fa8c49c42" (UID: "36a06bc0-66b9-4415-ba14-d64fa8c49c42"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:16:13.448365 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:13.448341 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36a06bc0-66b9-4415-ba14-d64fa8c49c42-kube-api-access-2rtdp" (OuterVolumeSpecName: "kube-api-access-2rtdp") pod "36a06bc0-66b9-4415-ba14-d64fa8c49c42" (UID: "36a06bc0-66b9-4415-ba14-d64fa8c49c42"). InnerVolumeSpecName "kube-api-access-2rtdp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 15:16:13.448708 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:13.448690 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36a06bc0-66b9-4415-ba14-d64fa8c49c42-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "36a06bc0-66b9-4415-ba14-d64fa8c49c42" (UID: "36a06bc0-66b9-4415-ba14-d64fa8c49c42"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 15:16:13.547355 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:13.547296 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/36a06bc0-66b9-4415-ba14-d64fa8c49c42-tls-certs\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\""
Apr 16 15:16:13.547355 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:13.547321 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2rtdp\" (UniqueName: \"kubernetes.io/projected/36a06bc0-66b9-4415-ba14-d64fa8c49c42-kube-api-access-2rtdp\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\""
Apr 16 15:16:13.547355 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:13.547333 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/36a06bc0-66b9-4415-ba14-d64fa8c49c42-tokenizer-tmp\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\""
Apr 16 15:16:13.547355 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:13.547342 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/36a06bc0-66b9-4415-ba14-d64fa8c49c42-kserve-provision-location\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\""
Apr 16 15:16:13.888479 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:13.888448 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn" podUID="36a06bc0-66b9-4415-ba14-d64fa8c49c42" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.59:9003\" within 1s: context deadline exceeded"
Apr 16 15:16:14.424828 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:14.424789 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn"
Apr 16 15:16:14.442538 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:14.442503 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn"]
Apr 16 15:16:14.445423 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:14.445399 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7bf94d4665-cq6mn"]
Apr 16 15:16:15.865122 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:15.865085 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36a06bc0-66b9-4415-ba14-d64fa8c49c42" path="/var/lib/kubelet/pods/36a06bc0-66b9-4415-ba14-d64fa8c49c42/volumes"
Apr 16 15:16:20.686652 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:20.686620 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm"
Apr 16 15:16:20.686652 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:20.686657 2568 kubelet.go:2658] "SyncLoop (probe)"
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm" Apr 16 15:16:20.689431 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:20.689405 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm" Apr 16 15:16:21.454437 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:21.454413 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm" Apr 16 15:16:42.458486 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:16:42.458464 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm" Apr 16 15:17:41.883089 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:17:41.883059 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bff4l_4e495dc8-7e20-4024-b33a-c9b6c6c1291f/ovn-acl-logging/0.log" Apr 16 15:17:41.897573 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:17:41.897550 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bff4l_4e495dc8-7e20-4024-b33a-c9b6c6c1291f/ovn-acl-logging/0.log" Apr 16 15:19:25.083395 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:25.083364 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm"] Apr 16 15:19:25.083837 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:25.083656 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm" podUID="610244a3-15e9-45c2-b533-2a700959d598" containerName="main" containerID="cri-o://ef7e43bca915efb7bdcb7f621cb13411e586d492d2ca35ddc886cc7844452cc2" gracePeriod=30 Apr 
16 15:19:25.083837 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:25.083698 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm" podUID="610244a3-15e9-45c2-b533-2a700959d598" containerName="tokenizer" containerID="cri-o://991e5e1309af8e5d8e41998bc49a65f99b8ad283be542df243df3c2cbb47e339" gracePeriod=30 Apr 16 15:19:25.230266 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:25.230235 2568 generic.go:358] "Generic (PLEG): container finished" podID="610244a3-15e9-45c2-b533-2a700959d598" containerID="ef7e43bca915efb7bdcb7f621cb13411e586d492d2ca35ddc886cc7844452cc2" exitCode=0 Apr 16 15:19:25.230417 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:25.230315 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm" event={"ID":"610244a3-15e9-45c2-b533-2a700959d598","Type":"ContainerDied","Data":"ef7e43bca915efb7bdcb7f621cb13411e586d492d2ca35ddc886cc7844452cc2"} Apr 16 15:19:26.231444 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:26.231419 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm" Apr 16 15:19:26.235157 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:26.235131 2568 generic.go:358] "Generic (PLEG): container finished" podID="610244a3-15e9-45c2-b533-2a700959d598" containerID="991e5e1309af8e5d8e41998bc49a65f99b8ad283be542df243df3c2cbb47e339" exitCode=0 Apr 16 15:19:26.235251 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:26.235198 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm" Apr 16 15:19:26.235314 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:26.235255 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm" event={"ID":"610244a3-15e9-45c2-b533-2a700959d598","Type":"ContainerDied","Data":"991e5e1309af8e5d8e41998bc49a65f99b8ad283be542df243df3c2cbb47e339"} Apr 16 15:19:26.235314 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:26.235301 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm" event={"ID":"610244a3-15e9-45c2-b533-2a700959d598","Type":"ContainerDied","Data":"d4f60e244400a864e099af98b7c5c747276f787057575764955a48c7623b32e6"} Apr 16 15:19:26.235426 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:26.235323 2568 scope.go:117] "RemoveContainer" containerID="991e5e1309af8e5d8e41998bc49a65f99b8ad283be542df243df3c2cbb47e339" Apr 16 15:19:26.244881 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:26.244864 2568 scope.go:117] "RemoveContainer" containerID="ef7e43bca915efb7bdcb7f621cb13411e586d492d2ca35ddc886cc7844452cc2" Apr 16 15:19:26.253315 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:26.253220 2568 scope.go:117] "RemoveContainer" containerID="37642ef40fb3c2ad234f9c86ffb6bed2cc7a58af5e52e661db6d6b6b05332964" Apr 16 15:19:26.262060 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:26.262043 2568 scope.go:117] "RemoveContainer" containerID="991e5e1309af8e5d8e41998bc49a65f99b8ad283be542df243df3c2cbb47e339" Apr 16 15:19:26.262285 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:19:26.262267 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"991e5e1309af8e5d8e41998bc49a65f99b8ad283be542df243df3c2cbb47e339\": container with ID starting with 
991e5e1309af8e5d8e41998bc49a65f99b8ad283be542df243df3c2cbb47e339 not found: ID does not exist" containerID="991e5e1309af8e5d8e41998bc49a65f99b8ad283be542df243df3c2cbb47e339" Apr 16 15:19:26.262361 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:26.262294 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"991e5e1309af8e5d8e41998bc49a65f99b8ad283be542df243df3c2cbb47e339"} err="failed to get container status \"991e5e1309af8e5d8e41998bc49a65f99b8ad283be542df243df3c2cbb47e339\": rpc error: code = NotFound desc = could not find container \"991e5e1309af8e5d8e41998bc49a65f99b8ad283be542df243df3c2cbb47e339\": container with ID starting with 991e5e1309af8e5d8e41998bc49a65f99b8ad283be542df243df3c2cbb47e339 not found: ID does not exist" Apr 16 15:19:26.262361 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:26.262317 2568 scope.go:117] "RemoveContainer" containerID="ef7e43bca915efb7bdcb7f621cb13411e586d492d2ca35ddc886cc7844452cc2" Apr 16 15:19:26.262554 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:19:26.262537 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef7e43bca915efb7bdcb7f621cb13411e586d492d2ca35ddc886cc7844452cc2\": container with ID starting with ef7e43bca915efb7bdcb7f621cb13411e586d492d2ca35ddc886cc7844452cc2 not found: ID does not exist" containerID="ef7e43bca915efb7bdcb7f621cb13411e586d492d2ca35ddc886cc7844452cc2" Apr 16 15:19:26.262621 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:26.262558 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef7e43bca915efb7bdcb7f621cb13411e586d492d2ca35ddc886cc7844452cc2"} err="failed to get container status \"ef7e43bca915efb7bdcb7f621cb13411e586d492d2ca35ddc886cc7844452cc2\": rpc error: code = NotFound desc = could not find container \"ef7e43bca915efb7bdcb7f621cb13411e586d492d2ca35ddc886cc7844452cc2\": container with ID starting with 
ef7e43bca915efb7bdcb7f621cb13411e586d492d2ca35ddc886cc7844452cc2 not found: ID does not exist" Apr 16 15:19:26.262621 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:26.262583 2568 scope.go:117] "RemoveContainer" containerID="37642ef40fb3c2ad234f9c86ffb6bed2cc7a58af5e52e661db6d6b6b05332964" Apr 16 15:19:26.262842 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:19:26.262809 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37642ef40fb3c2ad234f9c86ffb6bed2cc7a58af5e52e661db6d6b6b05332964\": container with ID starting with 37642ef40fb3c2ad234f9c86ffb6bed2cc7a58af5e52e661db6d6b6b05332964 not found: ID does not exist" containerID="37642ef40fb3c2ad234f9c86ffb6bed2cc7a58af5e52e661db6d6b6b05332964" Apr 16 15:19:26.262942 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:26.262852 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37642ef40fb3c2ad234f9c86ffb6bed2cc7a58af5e52e661db6d6b6b05332964"} err="failed to get container status \"37642ef40fb3c2ad234f9c86ffb6bed2cc7a58af5e52e661db6d6b6b05332964\": rpc error: code = NotFound desc = could not find container \"37642ef40fb3c2ad234f9c86ffb6bed2cc7a58af5e52e661db6d6b6b05332964\": container with ID starting with 37642ef40fb3c2ad234f9c86ffb6bed2cc7a58af5e52e661db6d6b6b05332964 not found: ID does not exist" Apr 16 15:19:26.328763 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:26.328743 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/610244a3-15e9-45c2-b533-2a700959d598-tokenizer-uds\") pod \"610244a3-15e9-45c2-b533-2a700959d598\" (UID: \"610244a3-15e9-45c2-b533-2a700959d598\") " Apr 16 15:19:26.328847 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:26.328786 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrds4\" (UniqueName: 
\"kubernetes.io/projected/610244a3-15e9-45c2-b533-2a700959d598-kube-api-access-nrds4\") pod \"610244a3-15e9-45c2-b533-2a700959d598\" (UID: \"610244a3-15e9-45c2-b533-2a700959d598\") " Apr 16 15:19:26.328847 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:26.328820 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/610244a3-15e9-45c2-b533-2a700959d598-tls-certs\") pod \"610244a3-15e9-45c2-b533-2a700959d598\" (UID: \"610244a3-15e9-45c2-b533-2a700959d598\") " Apr 16 15:19:26.328957 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:26.328865 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/610244a3-15e9-45c2-b533-2a700959d598-kserve-provision-location\") pod \"610244a3-15e9-45c2-b533-2a700959d598\" (UID: \"610244a3-15e9-45c2-b533-2a700959d598\") " Apr 16 15:19:26.328957 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:26.328908 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/610244a3-15e9-45c2-b533-2a700959d598-tokenizer-tmp\") pod \"610244a3-15e9-45c2-b533-2a700959d598\" (UID: \"610244a3-15e9-45c2-b533-2a700959d598\") " Apr 16 15:19:26.328957 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:26.328949 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/610244a3-15e9-45c2-b533-2a700959d598-tokenizer-cache\") pod \"610244a3-15e9-45c2-b533-2a700959d598\" (UID: \"610244a3-15e9-45c2-b533-2a700959d598\") " Apr 16 15:19:26.329087 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:26.329021 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/610244a3-15e9-45c2-b533-2a700959d598-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "610244a3-15e9-45c2-b533-2a700959d598" 
(UID: "610244a3-15e9-45c2-b533-2a700959d598"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:19:26.329241 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:26.329208 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/610244a3-15e9-45c2-b533-2a700959d598-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "610244a3-15e9-45c2-b533-2a700959d598" (UID: "610244a3-15e9-45c2-b533-2a700959d598"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:19:26.329241 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:26.329235 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/610244a3-15e9-45c2-b533-2a700959d598-tokenizer-uds\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:19:26.329348 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:26.329301 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/610244a3-15e9-45c2-b533-2a700959d598-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "610244a3-15e9-45c2-b533-2a700959d598" (UID: "610244a3-15e9-45c2-b533-2a700959d598"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:19:26.329695 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:26.329668 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/610244a3-15e9-45c2-b533-2a700959d598-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "610244a3-15e9-45c2-b533-2a700959d598" (UID: "610244a3-15e9-45c2-b533-2a700959d598"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:19:26.330917 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:26.330877 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/610244a3-15e9-45c2-b533-2a700959d598-kube-api-access-nrds4" (OuterVolumeSpecName: "kube-api-access-nrds4") pod "610244a3-15e9-45c2-b533-2a700959d598" (UID: "610244a3-15e9-45c2-b533-2a700959d598"). InnerVolumeSpecName "kube-api-access-nrds4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:19:26.330992 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:26.330950 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/610244a3-15e9-45c2-b533-2a700959d598-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "610244a3-15e9-45c2-b533-2a700959d598" (UID: "610244a3-15e9-45c2-b533-2a700959d598"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:19:26.430106 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:26.430082 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/610244a3-15e9-45c2-b533-2a700959d598-kserve-provision-location\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:19:26.430106 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:26.430104 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/610244a3-15e9-45c2-b533-2a700959d598-tokenizer-tmp\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:19:26.430242 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:26.430118 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/610244a3-15e9-45c2-b533-2a700959d598-tokenizer-cache\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:19:26.430242 ip-10-0-140-83 kubenswrapper[2568]: I0416 
15:19:26.430130 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nrds4\" (UniqueName: \"kubernetes.io/projected/610244a3-15e9-45c2-b533-2a700959d598-kube-api-access-nrds4\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:19:26.430242 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:26.430146 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/610244a3-15e9-45c2-b533-2a700959d598-tls-certs\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:19:26.557790 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:26.557760 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm"] Apr 16 15:19:26.561701 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:26.561673 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schedb5vm"] Apr 16 15:19:27.864145 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:27.864118 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="610244a3-15e9-45c2-b533-2a700959d598" path="/var/lib/kubelet/pods/610244a3-15e9-45c2-b533-2a700959d598/volumes" Apr 16 15:19:36.919466 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:36.919390 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c"] Apr 16 15:19:36.919906 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:36.919750 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="36a06bc0-66b9-4415-ba14-d64fa8c49c42" containerName="main" Apr 16 15:19:36.919906 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:36.919760 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="36a06bc0-66b9-4415-ba14-d64fa8c49c42" containerName="main" Apr 16 15:19:36.919906 ip-10-0-140-83 kubenswrapper[2568]: I0416 
15:19:36.919776 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="610244a3-15e9-45c2-b533-2a700959d598" containerName="storage-initializer" Apr 16 15:19:36.919906 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:36.919783 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="610244a3-15e9-45c2-b533-2a700959d598" containerName="storage-initializer" Apr 16 15:19:36.919906 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:36.919791 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="36a06bc0-66b9-4415-ba14-d64fa8c49c42" containerName="tokenizer" Apr 16 15:19:36.919906 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:36.919797 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="36a06bc0-66b9-4415-ba14-d64fa8c49c42" containerName="tokenizer" Apr 16 15:19:36.919906 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:36.919806 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="36a06bc0-66b9-4415-ba14-d64fa8c49c42" containerName="storage-initializer" Apr 16 15:19:36.919906 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:36.919812 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="36a06bc0-66b9-4415-ba14-d64fa8c49c42" containerName="storage-initializer" Apr 16 15:19:36.919906 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:36.919824 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="610244a3-15e9-45c2-b533-2a700959d598" containerName="tokenizer" Apr 16 15:19:36.919906 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:36.919829 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="610244a3-15e9-45c2-b533-2a700959d598" containerName="tokenizer" Apr 16 15:19:36.919906 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:36.919838 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="610244a3-15e9-45c2-b533-2a700959d598" containerName="main" Apr 16 15:19:36.919906 ip-10-0-140-83 kubenswrapper[2568]: 
I0416 15:19:36.919843 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="610244a3-15e9-45c2-b533-2a700959d598" containerName="main" Apr 16 15:19:36.919906 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:36.919912 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="610244a3-15e9-45c2-b533-2a700959d598" containerName="tokenizer" Apr 16 15:19:36.920310 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:36.919922 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="610244a3-15e9-45c2-b533-2a700959d598" containerName="main" Apr 16 15:19:36.920310 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:36.919929 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="36a06bc0-66b9-4415-ba14-d64fa8c49c42" containerName="main" Apr 16 15:19:36.920310 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:36.919937 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="36a06bc0-66b9-4415-ba14-d64fa8c49c42" containerName="tokenizer" Apr 16 15:19:36.924815 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:36.924794 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c" Apr 16 15:19:36.927050 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:36.927022 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 15:19:36.927164 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:36.927068 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 15:19:36.927164 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:36.927134 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-dmzth\"" Apr 16 15:19:36.928200 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:36.928178 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-epp-sa-dockercfg-49482\"" Apr 16 15:19:36.928306 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:36.928259 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 16 15:19:36.934940 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:36.934919 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c"] Apr 16 15:19:37.013363 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:37.013336 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e9e4b2dd-1e61-48dc-aa73-cb6994f4220c-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c\" (UID: \"e9e4b2dd-1e61-48dc-aa73-cb6994f4220c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c" Apr 16 15:19:37.013483 ip-10-0-140-83 
kubenswrapper[2568]: I0416 15:19:37.013373 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e9e4b2dd-1e61-48dc-aa73-cb6994f4220c-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c\" (UID: \"e9e4b2dd-1e61-48dc-aa73-cb6994f4220c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c" Apr 16 15:19:37.013483 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:37.013396 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e9e4b2dd-1e61-48dc-aa73-cb6994f4220c-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c\" (UID: \"e9e4b2dd-1e61-48dc-aa73-cb6994f4220c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c" Apr 16 15:19:37.013483 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:37.013469 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e9e4b2dd-1e61-48dc-aa73-cb6994f4220c-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c\" (UID: \"e9e4b2dd-1e61-48dc-aa73-cb6994f4220c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c" Apr 16 15:19:37.013590 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:37.013524 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e9e4b2dd-1e61-48dc-aa73-cb6994f4220c-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c\" (UID: \"e9e4b2dd-1e61-48dc-aa73-cb6994f4220c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c" Apr 16 
15:19:37.013590 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:37.013560 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbpbr\" (UniqueName: \"kubernetes.io/projected/e9e4b2dd-1e61-48dc-aa73-cb6994f4220c-kube-api-access-lbpbr\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c\" (UID: \"e9e4b2dd-1e61-48dc-aa73-cb6994f4220c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c" Apr 16 15:19:37.114422 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:37.114395 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e9e4b2dd-1e61-48dc-aa73-cb6994f4220c-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c\" (UID: \"e9e4b2dd-1e61-48dc-aa73-cb6994f4220c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c" Apr 16 15:19:37.114585 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:37.114435 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lbpbr\" (UniqueName: \"kubernetes.io/projected/e9e4b2dd-1e61-48dc-aa73-cb6994f4220c-kube-api-access-lbpbr\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c\" (UID: \"e9e4b2dd-1e61-48dc-aa73-cb6994f4220c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c" Apr 16 15:19:37.114585 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:37.114468 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e9e4b2dd-1e61-48dc-aa73-cb6994f4220c-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c\" (UID: \"e9e4b2dd-1e61-48dc-aa73-cb6994f4220c\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c" Apr 16 15:19:37.114585 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:37.114499 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e9e4b2dd-1e61-48dc-aa73-cb6994f4220c-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c\" (UID: \"e9e4b2dd-1e61-48dc-aa73-cb6994f4220c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c" Apr 16 15:19:37.114796 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:37.114719 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e9e4b2dd-1e61-48dc-aa73-cb6994f4220c-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c\" (UID: \"e9e4b2dd-1e61-48dc-aa73-cb6994f4220c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c" Apr 16 15:19:37.114861 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:37.114808 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e9e4b2dd-1e61-48dc-aa73-cb6994f4220c-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c\" (UID: \"e9e4b2dd-1e61-48dc-aa73-cb6994f4220c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c" Apr 16 15:19:37.114861 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:37.114844 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e9e4b2dd-1e61-48dc-aa73-cb6994f4220c-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c\" (UID: \"e9e4b2dd-1e61-48dc-aa73-cb6994f4220c\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c" Apr 16 15:19:37.115042 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:37.115017 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e9e4b2dd-1e61-48dc-aa73-cb6994f4220c-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c\" (UID: \"e9e4b2dd-1e61-48dc-aa73-cb6994f4220c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c" Apr 16 15:19:37.115133 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:37.115113 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e9e4b2dd-1e61-48dc-aa73-cb6994f4220c-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c\" (UID: \"e9e4b2dd-1e61-48dc-aa73-cb6994f4220c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c" Apr 16 15:19:37.115198 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:37.115178 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e9e4b2dd-1e61-48dc-aa73-cb6994f4220c-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c\" (UID: \"e9e4b2dd-1e61-48dc-aa73-cb6994f4220c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c" Apr 16 15:19:37.117031 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:37.117012 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e9e4b2dd-1e61-48dc-aa73-cb6994f4220c-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c\" (UID: \"e9e4b2dd-1e61-48dc-aa73-cb6994f4220c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c" Apr 16 
15:19:37.124597 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:37.124570 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbpbr\" (UniqueName: \"kubernetes.io/projected/e9e4b2dd-1e61-48dc-aa73-cb6994f4220c-kube-api-access-lbpbr\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c\" (UID: \"e9e4b2dd-1e61-48dc-aa73-cb6994f4220c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c" Apr 16 15:19:37.236065 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:37.236009 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c" Apr 16 15:19:37.368587 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:37.368562 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c"] Apr 16 15:19:37.370613 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:19:37.370586 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9e4b2dd_1e61_48dc_aa73_cb6994f4220c.slice/crio-0d868198c1b7a6bd380e247e63f60675c10f831b6f94ca677ab5f3793a1fb1e1 WatchSource:0}: Error finding container 0d868198c1b7a6bd380e247e63f60675c10f831b6f94ca677ab5f3793a1fb1e1: Status 404 returned error can't find the container with id 0d868198c1b7a6bd380e247e63f60675c10f831b6f94ca677ab5f3793a1fb1e1 Apr 16 15:19:37.372754 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:37.372737 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:19:38.292491 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:38.292461 2568 generic.go:358] "Generic (PLEG): container finished" podID="e9e4b2dd-1e61-48dc-aa73-cb6994f4220c" containerID="17ddc4c837c5fbc063e2835aea7c58e9e953cb6b24f91905e63d98145f46df56" exitCode=0 Apr 16 15:19:38.292843 
ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:38.292547 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c" event={"ID":"e9e4b2dd-1e61-48dc-aa73-cb6994f4220c","Type":"ContainerDied","Data":"17ddc4c837c5fbc063e2835aea7c58e9e953cb6b24f91905e63d98145f46df56"} Apr 16 15:19:38.292843 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:38.292578 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c" event={"ID":"e9e4b2dd-1e61-48dc-aa73-cb6994f4220c","Type":"ContainerStarted","Data":"0d868198c1b7a6bd380e247e63f60675c10f831b6f94ca677ab5f3793a1fb1e1"} Apr 16 15:19:39.299301 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:39.299268 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c" event={"ID":"e9e4b2dd-1e61-48dc-aa73-cb6994f4220c","Type":"ContainerStarted","Data":"8ca383b04563434244e594490e8058f1c7e66395ca7d31278a9e7b12b49e815a"} Apr 16 15:19:39.299301 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:39.299301 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c" event={"ID":"e9e4b2dd-1e61-48dc-aa73-cb6994f4220c","Type":"ContainerStarted","Data":"181e0477dab760f75c548d9a42cb1a9c4cb18009414f46f19684c0717f6873f6"} Apr 16 15:19:39.299727 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:39.299414 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c" Apr 16 15:19:39.318537 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:39.318498 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c" 
podStartSLOduration=3.318487667 podStartE2EDuration="3.318487667s" podCreationTimestamp="2026-04-16 15:19:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:19:39.317176991 +0000 UTC m=+1618.081129103" watchObservedRunningTime="2026-04-16 15:19:39.318487667 +0000 UTC m=+1618.082439756" Apr 16 15:19:47.236506 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:47.236466 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c" Apr 16 15:19:47.237002 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:47.236520 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c" Apr 16 15:19:47.239522 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:47.239495 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c" Apr 16 15:19:47.332228 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:19:47.332203 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c" Apr 16 15:20:08.338236 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:20:08.338201 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c" Apr 16 15:20:27.126034 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:20:27.126005 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp"] Apr 16 15:20:27.129923 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:20:27.129903 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp" Apr 16 15:20:27.132568 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:20:27.132543 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5ec-epp-sa-dockercfg-2n446\"" Apr 16 15:20:27.132679 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:20:27.132633 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 16 15:20:27.142258 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:20:27.142237 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp"] Apr 16 15:20:27.222369 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:20:27.222344 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07d29e91-6707-40cc-a1ee-199b459a7a6e-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp\" (UID: \"07d29e91-6707-40cc-a1ee-199b459a7a6e\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp" Apr 16 15:20:27.222490 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:20:27.222382 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/07d29e91-6707-40cc-a1ee-199b459a7a6e-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp\" (UID: \"07d29e91-6707-40cc-a1ee-199b459a7a6e\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp" Apr 16 15:20:27.222490 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:20:27.222415 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh6wj\" (UniqueName: \"kubernetes.io/projected/07d29e91-6707-40cc-a1ee-199b459a7a6e-kube-api-access-hh6wj\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp\" (UID: \"07d29e91-6707-40cc-a1ee-199b459a7a6e\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp" Apr 16 15:20:27.222588 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:20:27.222488 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/07d29e91-6707-40cc-a1ee-199b459a7a6e-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp\" (UID: \"07d29e91-6707-40cc-a1ee-199b459a7a6e\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp" Apr 16 15:20:27.222588 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:20:27.222543 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/07d29e91-6707-40cc-a1ee-199b459a7a6e-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp\" (UID: \"07d29e91-6707-40cc-a1ee-199b459a7a6e\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp" Apr 16 15:20:27.222665 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:20:27.222595 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/07d29e91-6707-40cc-a1ee-199b459a7a6e-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp\" (UID: \"07d29e91-6707-40cc-a1ee-199b459a7a6e\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp" Apr 16 15:20:27.323521 ip-10-0-140-83 kubenswrapper[2568]: I0416 
15:20:27.323487 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07d29e91-6707-40cc-a1ee-199b459a7a6e-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp\" (UID: \"07d29e91-6707-40cc-a1ee-199b459a7a6e\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp" Apr 16 15:20:27.323656 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:20:27.323534 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/07d29e91-6707-40cc-a1ee-199b459a7a6e-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp\" (UID: \"07d29e91-6707-40cc-a1ee-199b459a7a6e\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp" Apr 16 15:20:27.323656 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:20:27.323569 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hh6wj\" (UniqueName: \"kubernetes.io/projected/07d29e91-6707-40cc-a1ee-199b459a7a6e-kube-api-access-hh6wj\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp\" (UID: \"07d29e91-6707-40cc-a1ee-199b459a7a6e\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp" Apr 16 15:20:27.323656 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:20:27.323599 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/07d29e91-6707-40cc-a1ee-199b459a7a6e-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp\" (UID: \"07d29e91-6707-40cc-a1ee-199b459a7a6e\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp" Apr 16 15:20:27.323656 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:20:27.323623 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/07d29e91-6707-40cc-a1ee-199b459a7a6e-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp\" (UID: \"07d29e91-6707-40cc-a1ee-199b459a7a6e\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp" Apr 16 15:20:27.323877 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:20:27.323681 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/07d29e91-6707-40cc-a1ee-199b459a7a6e-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp\" (UID: \"07d29e91-6707-40cc-a1ee-199b459a7a6e\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp" Apr 16 15:20:27.324018 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:20:27.323992 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07d29e91-6707-40cc-a1ee-199b459a7a6e-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp\" (UID: \"07d29e91-6707-40cc-a1ee-199b459a7a6e\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp" Apr 16 15:20:27.324076 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:20:27.324036 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/07d29e91-6707-40cc-a1ee-199b459a7a6e-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp\" (UID: \"07d29e91-6707-40cc-a1ee-199b459a7a6e\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp" Apr 16 15:20:27.324124 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:20:27.324073 2568 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/07d29e91-6707-40cc-a1ee-199b459a7a6e-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp\" (UID: \"07d29e91-6707-40cc-a1ee-199b459a7a6e\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp" Apr 16 15:20:27.324165 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:20:27.324136 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/07d29e91-6707-40cc-a1ee-199b459a7a6e-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp\" (UID: \"07d29e91-6707-40cc-a1ee-199b459a7a6e\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp" Apr 16 15:20:27.326103 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:20:27.326085 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/07d29e91-6707-40cc-a1ee-199b459a7a6e-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp\" (UID: \"07d29e91-6707-40cc-a1ee-199b459a7a6e\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp" Apr 16 15:20:27.331129 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:20:27.331107 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh6wj\" (UniqueName: \"kubernetes.io/projected/07d29e91-6707-40cc-a1ee-199b459a7a6e-kube-api-access-hh6wj\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp\" (UID: \"07d29e91-6707-40cc-a1ee-199b459a7a6e\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp" Apr 16 15:20:27.442546 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:20:27.442485 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp" Apr 16 15:20:27.577976 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:20:27.577947 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp"] Apr 16 15:20:27.580657 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:20:27.580625 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07d29e91_6707_40cc_a1ee_199b459a7a6e.slice/crio-7b1b69c5086be35164814406ee7b9e1ccb4641ec13b8394d3db481ba5d6f47c1 WatchSource:0}: Error finding container 7b1b69c5086be35164814406ee7b9e1ccb4641ec13b8394d3db481ba5d6f47c1: Status 404 returned error can't find the container with id 7b1b69c5086be35164814406ee7b9e1ccb4641ec13b8394d3db481ba5d6f47c1 Apr 16 15:20:28.514825 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:20:28.514794 2568 generic.go:358] "Generic (PLEG): container finished" podID="07d29e91-6707-40cc-a1ee-199b459a7a6e" containerID="2703cacaf3535a20f2c6e9c2746e558b7d7bfcad5d89adfd5611a21c033f95b3" exitCode=0 Apr 16 15:20:28.515196 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:20:28.514879 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp" event={"ID":"07d29e91-6707-40cc-a1ee-199b459a7a6e","Type":"ContainerDied","Data":"2703cacaf3535a20f2c6e9c2746e558b7d7bfcad5d89adfd5611a21c033f95b3"} Apr 16 15:20:28.515196 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:20:28.514935 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp" event={"ID":"07d29e91-6707-40cc-a1ee-199b459a7a6e","Type":"ContainerStarted","Data":"7b1b69c5086be35164814406ee7b9e1ccb4641ec13b8394d3db481ba5d6f47c1"} Apr 16 15:20:29.521422 ip-10-0-140-83 kubenswrapper[2568]: I0416 
15:20:29.521386 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp" event={"ID":"07d29e91-6707-40cc-a1ee-199b459a7a6e","Type":"ContainerStarted","Data":"3d8ec2b5a053e2f6f4f7d186ed11258633acc9ba322f90ea5eb0796ae278ad64"} Apr 16 15:20:29.521422 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:20:29.521429 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp" event={"ID":"07d29e91-6707-40cc-a1ee-199b459a7a6e","Type":"ContainerStarted","Data":"d341bced0dd24a96107f88f9147c0765e5dad298ec533f762076b9f092958de8"} Apr 16 15:20:29.521994 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:20:29.521499 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp" Apr 16 15:20:29.540882 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:20:29.540831 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp" podStartSLOduration=2.540815652 podStartE2EDuration="2.540815652s" podCreationTimestamp="2026-04-16 15:20:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:20:29.539461057 +0000 UTC m=+1668.303413159" watchObservedRunningTime="2026-04-16 15:20:29.540815652 +0000 UTC m=+1668.304767765" Apr 16 15:20:37.443667 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:20:37.443636 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp" Apr 16 15:20:37.443667 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:20:37.443671 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp" Apr 16 15:20:37.446243 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:20:37.446217 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp" Apr 16 15:20:37.555030 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:20:37.555008 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp" Apr 16 15:20:42.334035 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:20:42.334003 2568 scope.go:117] "RemoveContainer" containerID="1c447788d74b7da0802396dfe5f3f2f4a8236049b77d36e1416aeb78a1738713" Apr 16 15:20:42.343116 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:20:42.343088 2568 scope.go:117] "RemoveContainer" containerID="da9365dacb3034d136f79bc511483e11cb3b5d0bac917c8c29d4e80c479eba0c" Apr 16 15:20:42.353125 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:20:42.353104 2568 scope.go:117] "RemoveContainer" containerID="ebecaf080dd919f6e0a3d8a409f6e86a2f5a63dcfee500167f9487dae1dc50b6" Apr 16 15:20:58.559694 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:20:58.559659 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp" Apr 16 15:22:41.916185 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:22:41.916157 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bff4l_4e495dc8-7e20-4024-b33a-c9b6c6c1291f/ovn-acl-logging/0.log" Apr 16 15:22:41.932327 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:22:41.932301 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bff4l_4e495dc8-7e20-4024-b33a-c9b6c6c1291f/ovn-acl-logging/0.log" Apr 16 15:22:58.976753 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:22:58.976717 2568 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp"] Apr 16 15:22:58.977154 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:22:58.977128 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp" podUID="07d29e91-6707-40cc-a1ee-199b459a7a6e" containerName="main" containerID="cri-o://d341bced0dd24a96107f88f9147c0765e5dad298ec533f762076b9f092958de8" gracePeriod=30 Apr 16 15:22:58.977231 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:22:58.977150 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp" podUID="07d29e91-6707-40cc-a1ee-199b459a7a6e" containerName="tokenizer" containerID="cri-o://3d8ec2b5a053e2f6f4f7d186ed11258633acc9ba322f90ea5eb0796ae278ad64" gracePeriod=30 Apr 16 15:22:59.142809 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:22:59.142777 2568 generic.go:358] "Generic (PLEG): container finished" podID="07d29e91-6707-40cc-a1ee-199b459a7a6e" containerID="d341bced0dd24a96107f88f9147c0765e5dad298ec533f762076b9f092958de8" exitCode=0 Apr 16 15:22:59.142989 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:22:59.142846 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp" event={"ID":"07d29e91-6707-40cc-a1ee-199b459a7a6e","Type":"ContainerDied","Data":"d341bced0dd24a96107f88f9147c0765e5dad298ec533f762076b9f092958de8"} Apr 16 15:23:00.134758 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:00.134737 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp" Apr 16 15:23:00.149319 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:00.149288 2568 generic.go:358] "Generic (PLEG): container finished" podID="07d29e91-6707-40cc-a1ee-199b459a7a6e" containerID="3d8ec2b5a053e2f6f4f7d186ed11258633acc9ba322f90ea5eb0796ae278ad64" exitCode=0 Apr 16 15:23:00.149437 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:00.149329 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp" event={"ID":"07d29e91-6707-40cc-a1ee-199b459a7a6e","Type":"ContainerDied","Data":"3d8ec2b5a053e2f6f4f7d186ed11258633acc9ba322f90ea5eb0796ae278ad64"} Apr 16 15:23:00.149437 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:00.149356 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp" Apr 16 15:23:00.149437 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:00.149373 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp" event={"ID":"07d29e91-6707-40cc-a1ee-199b459a7a6e","Type":"ContainerDied","Data":"7b1b69c5086be35164814406ee7b9e1ccb4641ec13b8394d3db481ba5d6f47c1"} Apr 16 15:23:00.149437 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:00.149398 2568 scope.go:117] "RemoveContainer" containerID="3d8ec2b5a053e2f6f4f7d186ed11258633acc9ba322f90ea5eb0796ae278ad64" Apr 16 15:23:00.158743 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:00.158721 2568 scope.go:117] "RemoveContainer" containerID="d341bced0dd24a96107f88f9147c0765e5dad298ec533f762076b9f092958de8" Apr 16 15:23:00.167958 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:00.167941 2568 scope.go:117] "RemoveContainer" containerID="2703cacaf3535a20f2c6e9c2746e558b7d7bfcad5d89adfd5611a21c033f95b3" Apr 16 
15:23:00.176021 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:00.175995 2568 scope.go:117] "RemoveContainer" containerID="3d8ec2b5a053e2f6f4f7d186ed11258633acc9ba322f90ea5eb0796ae278ad64" Apr 16 15:23:00.176319 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:23:00.176289 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d8ec2b5a053e2f6f4f7d186ed11258633acc9ba322f90ea5eb0796ae278ad64\": container with ID starting with 3d8ec2b5a053e2f6f4f7d186ed11258633acc9ba322f90ea5eb0796ae278ad64 not found: ID does not exist" containerID="3d8ec2b5a053e2f6f4f7d186ed11258633acc9ba322f90ea5eb0796ae278ad64" Apr 16 15:23:00.176412 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:00.176325 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d8ec2b5a053e2f6f4f7d186ed11258633acc9ba322f90ea5eb0796ae278ad64"} err="failed to get container status \"3d8ec2b5a053e2f6f4f7d186ed11258633acc9ba322f90ea5eb0796ae278ad64\": rpc error: code = NotFound desc = could not find container \"3d8ec2b5a053e2f6f4f7d186ed11258633acc9ba322f90ea5eb0796ae278ad64\": container with ID starting with 3d8ec2b5a053e2f6f4f7d186ed11258633acc9ba322f90ea5eb0796ae278ad64 not found: ID does not exist" Apr 16 15:23:00.176412 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:00.176343 2568 scope.go:117] "RemoveContainer" containerID="d341bced0dd24a96107f88f9147c0765e5dad298ec533f762076b9f092958de8" Apr 16 15:23:00.176606 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:23:00.176584 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d341bced0dd24a96107f88f9147c0765e5dad298ec533f762076b9f092958de8\": container with ID starting with d341bced0dd24a96107f88f9147c0765e5dad298ec533f762076b9f092958de8 not found: ID does not exist" containerID="d341bced0dd24a96107f88f9147c0765e5dad298ec533f762076b9f092958de8" Apr 16 15:23:00.176649 
ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:00.176617 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d341bced0dd24a96107f88f9147c0765e5dad298ec533f762076b9f092958de8"} err="failed to get container status \"d341bced0dd24a96107f88f9147c0765e5dad298ec533f762076b9f092958de8\": rpc error: code = NotFound desc = could not find container \"d341bced0dd24a96107f88f9147c0765e5dad298ec533f762076b9f092958de8\": container with ID starting with d341bced0dd24a96107f88f9147c0765e5dad298ec533f762076b9f092958de8 not found: ID does not exist" Apr 16 15:23:00.176649 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:00.176640 2568 scope.go:117] "RemoveContainer" containerID="2703cacaf3535a20f2c6e9c2746e558b7d7bfcad5d89adfd5611a21c033f95b3" Apr 16 15:23:00.176865 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:23:00.176846 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2703cacaf3535a20f2c6e9c2746e558b7d7bfcad5d89adfd5611a21c033f95b3\": container with ID starting with 2703cacaf3535a20f2c6e9c2746e558b7d7bfcad5d89adfd5611a21c033f95b3 not found: ID does not exist" containerID="2703cacaf3535a20f2c6e9c2746e558b7d7bfcad5d89adfd5611a21c033f95b3" Apr 16 15:23:00.176997 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:00.176871 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2703cacaf3535a20f2c6e9c2746e558b7d7bfcad5d89adfd5611a21c033f95b3"} err="failed to get container status \"2703cacaf3535a20f2c6e9c2746e558b7d7bfcad5d89adfd5611a21c033f95b3\": rpc error: code = NotFound desc = could not find container \"2703cacaf3535a20f2c6e9c2746e558b7d7bfcad5d89adfd5611a21c033f95b3\": container with ID starting with 2703cacaf3535a20f2c6e9c2746e558b7d7bfcad5d89adfd5611a21c033f95b3 not found: ID does not exist" Apr 16 15:23:00.229369 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:00.229350 2568 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07d29e91-6707-40cc-a1ee-199b459a7a6e-kserve-provision-location\") pod \"07d29e91-6707-40cc-a1ee-199b459a7a6e\" (UID: \"07d29e91-6707-40cc-a1ee-199b459a7a6e\") " Apr 16 15:23:00.229460 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:00.229436 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hh6wj\" (UniqueName: \"kubernetes.io/projected/07d29e91-6707-40cc-a1ee-199b459a7a6e-kube-api-access-hh6wj\") pod \"07d29e91-6707-40cc-a1ee-199b459a7a6e\" (UID: \"07d29e91-6707-40cc-a1ee-199b459a7a6e\") " Apr 16 15:23:00.229460 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:00.229455 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/07d29e91-6707-40cc-a1ee-199b459a7a6e-tokenizer-uds\") pod \"07d29e91-6707-40cc-a1ee-199b459a7a6e\" (UID: \"07d29e91-6707-40cc-a1ee-199b459a7a6e\") " Apr 16 15:23:00.229563 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:00.229479 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/07d29e91-6707-40cc-a1ee-199b459a7a6e-tls-certs\") pod \"07d29e91-6707-40cc-a1ee-199b459a7a6e\" (UID: \"07d29e91-6707-40cc-a1ee-199b459a7a6e\") " Apr 16 15:23:00.229563 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:00.229512 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/07d29e91-6707-40cc-a1ee-199b459a7a6e-tokenizer-tmp\") pod \"07d29e91-6707-40cc-a1ee-199b459a7a6e\" (UID: \"07d29e91-6707-40cc-a1ee-199b459a7a6e\") " Apr 16 15:23:00.229563 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:00.229540 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/07d29e91-6707-40cc-a1ee-199b459a7a6e-tokenizer-cache\") pod \"07d29e91-6707-40cc-a1ee-199b459a7a6e\" (UID: \"07d29e91-6707-40cc-a1ee-199b459a7a6e\") " Apr 16 15:23:00.229868 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:00.229736 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07d29e91-6707-40cc-a1ee-199b459a7a6e-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "07d29e91-6707-40cc-a1ee-199b459a7a6e" (UID: "07d29e91-6707-40cc-a1ee-199b459a7a6e"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:23:00.229868 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:00.229854 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/07d29e91-6707-40cc-a1ee-199b459a7a6e-tokenizer-uds\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:23:00.230012 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:00.229878 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07d29e91-6707-40cc-a1ee-199b459a7a6e-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "07d29e91-6707-40cc-a1ee-199b459a7a6e" (UID: "07d29e91-6707-40cc-a1ee-199b459a7a6e"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:23:00.230012 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:00.229918 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07d29e91-6707-40cc-a1ee-199b459a7a6e-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "07d29e91-6707-40cc-a1ee-199b459a7a6e" (UID: "07d29e91-6707-40cc-a1ee-199b459a7a6e"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:23:00.230197 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:00.230179 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07d29e91-6707-40cc-a1ee-199b459a7a6e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "07d29e91-6707-40cc-a1ee-199b459a7a6e" (UID: "07d29e91-6707-40cc-a1ee-199b459a7a6e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:23:00.231488 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:00.231469 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07d29e91-6707-40cc-a1ee-199b459a7a6e-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "07d29e91-6707-40cc-a1ee-199b459a7a6e" (UID: "07d29e91-6707-40cc-a1ee-199b459a7a6e"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:23:00.231603 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:00.231588 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07d29e91-6707-40cc-a1ee-199b459a7a6e-kube-api-access-hh6wj" (OuterVolumeSpecName: "kube-api-access-hh6wj") pod "07d29e91-6707-40cc-a1ee-199b459a7a6e" (UID: "07d29e91-6707-40cc-a1ee-199b459a7a6e"). InnerVolumeSpecName "kube-api-access-hh6wj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:23:00.330464 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:00.330440 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/07d29e91-6707-40cc-a1ee-199b459a7a6e-tokenizer-tmp\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:23:00.330464 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:00.330461 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/07d29e91-6707-40cc-a1ee-199b459a7a6e-tokenizer-cache\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:23:00.330588 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:00.330471 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07d29e91-6707-40cc-a1ee-199b459a7a6e-kserve-provision-location\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:23:00.330588 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:00.330481 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hh6wj\" (UniqueName: \"kubernetes.io/projected/07d29e91-6707-40cc-a1ee-199b459a7a6e-kube-api-access-hh6wj\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:23:00.330588 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:00.330491 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/07d29e91-6707-40cc-a1ee-199b459a7a6e-tls-certs\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:23:00.471661 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:00.471641 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp"] Apr 16 15:23:00.477434 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:00.477411 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevdqcp"] Apr 16 15:23:01.866278 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:01.866251 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07d29e91-6707-40cc-a1ee-199b459a7a6e" path="/var/lib/kubelet/pods/07d29e91-6707-40cc-a1ee-199b459a7a6e/volumes" Apr 16 15:23:13.634369 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:13.634339 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx"] Apr 16 15:23:13.634774 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:13.634730 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="07d29e91-6707-40cc-a1ee-199b459a7a6e" containerName="storage-initializer" Apr 16 15:23:13.634774 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:13.634741 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="07d29e91-6707-40cc-a1ee-199b459a7a6e" containerName="storage-initializer" Apr 16 15:23:13.634774 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:13.634751 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="07d29e91-6707-40cc-a1ee-199b459a7a6e" containerName="tokenizer" Apr 16 15:23:13.634774 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:13.634757 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="07d29e91-6707-40cc-a1ee-199b459a7a6e" containerName="tokenizer" Apr 16 15:23:13.634774 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:13.634772 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="07d29e91-6707-40cc-a1ee-199b459a7a6e" containerName="main" Apr 16 15:23:13.634969 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:13.634778 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="07d29e91-6707-40cc-a1ee-199b459a7a6e" containerName="main" Apr 16 15:23:13.634969 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:13.634850 2568 
memory_manager.go:356] "RemoveStaleState removing state" podUID="07d29e91-6707-40cc-a1ee-199b459a7a6e" containerName="tokenizer" Apr 16 15:23:13.634969 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:13.634859 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="07d29e91-6707-40cc-a1ee-199b459a7a6e" containerName="main" Apr 16 15:23:13.638032 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:13.638003 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx" Apr 16 15:23:13.640975 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:13.640960 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\"" Apr 16 15:23:13.641056 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:13.640990 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-epp-sa-dockercfg-wqgkt\"" Apr 16 15:23:13.648933 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:13.648913 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx"] Apr 16 15:23:13.738099 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:13.738073 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b4491908-7717-4bb0-8aff-2ff0d5ad771a-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx\" (UID: \"b4491908-7717-4bb0-8aff-2ff0d5ad771a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx" Apr 16 15:23:13.738198 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:13.738103 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/b4491908-7717-4bb0-8aff-2ff0d5ad771a-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx\" (UID: \"b4491908-7717-4bb0-8aff-2ff0d5ad771a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx" Apr 16 15:23:13.738198 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:13.738126 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b4491908-7717-4bb0-8aff-2ff0d5ad771a-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx\" (UID: \"b4491908-7717-4bb0-8aff-2ff0d5ad771a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx" Apr 16 15:23:13.738276 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:13.738210 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-892sc\" (UniqueName: \"kubernetes.io/projected/b4491908-7717-4bb0-8aff-2ff0d5ad771a-kube-api-access-892sc\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx\" (UID: \"b4491908-7717-4bb0-8aff-2ff0d5ad771a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx" Apr 16 15:23:13.738312 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:13.738288 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b4491908-7717-4bb0-8aff-2ff0d5ad771a-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx\" (UID: \"b4491908-7717-4bb0-8aff-2ff0d5ad771a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx" Apr 16 15:23:13.738347 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:13.738317 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b4491908-7717-4bb0-8aff-2ff0d5ad771a-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx\" (UID: \"b4491908-7717-4bb0-8aff-2ff0d5ad771a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx" Apr 16 15:23:13.839057 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:13.839033 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b4491908-7717-4bb0-8aff-2ff0d5ad771a-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx\" (UID: \"b4491908-7717-4bb0-8aff-2ff0d5ad771a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx" Apr 16 15:23:13.839168 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:13.839062 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b4491908-7717-4bb0-8aff-2ff0d5ad771a-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx\" (UID: \"b4491908-7717-4bb0-8aff-2ff0d5ad771a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx" Apr 16 15:23:13.839168 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:13.839087 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b4491908-7717-4bb0-8aff-2ff0d5ad771a-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx\" (UID: \"b4491908-7717-4bb0-8aff-2ff0d5ad771a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx" Apr 16 15:23:13.839168 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:13.839105 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b4491908-7717-4bb0-8aff-2ff0d5ad771a-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx\" (UID: \"b4491908-7717-4bb0-8aff-2ff0d5ad771a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx" Apr 16 15:23:13.839168 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:13.839135 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b4491908-7717-4bb0-8aff-2ff0d5ad771a-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx\" (UID: \"b4491908-7717-4bb0-8aff-2ff0d5ad771a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx" Apr 16 15:23:13.839363 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:13.839197 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-892sc\" (UniqueName: \"kubernetes.io/projected/b4491908-7717-4bb0-8aff-2ff0d5ad771a-kube-api-access-892sc\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx\" (UID: \"b4491908-7717-4bb0-8aff-2ff0d5ad771a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx" Apr 16 15:23:13.839427 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:13.839371 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b4491908-7717-4bb0-8aff-2ff0d5ad771a-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx\" (UID: \"b4491908-7717-4bb0-8aff-2ff0d5ad771a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx" Apr 16 15:23:13.839485 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:13.839426 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/b4491908-7717-4bb0-8aff-2ff0d5ad771a-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx\" (UID: \"b4491908-7717-4bb0-8aff-2ff0d5ad771a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx" Apr 16 15:23:13.839485 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:13.839458 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b4491908-7717-4bb0-8aff-2ff0d5ad771a-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx\" (UID: \"b4491908-7717-4bb0-8aff-2ff0d5ad771a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx" Apr 16 15:23:13.839607 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:13.839588 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b4491908-7717-4bb0-8aff-2ff0d5ad771a-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx\" (UID: \"b4491908-7717-4bb0-8aff-2ff0d5ad771a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx" Apr 16 15:23:13.841485 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:13.841466 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b4491908-7717-4bb0-8aff-2ff0d5ad771a-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx\" (UID: \"b4491908-7717-4bb0-8aff-2ff0d5ad771a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx" Apr 16 15:23:13.846872 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:13.846849 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-892sc\" (UniqueName: 
\"kubernetes.io/projected/b4491908-7717-4bb0-8aff-2ff0d5ad771a-kube-api-access-892sc\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx\" (UID: \"b4491908-7717-4bb0-8aff-2ff0d5ad771a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx" Apr 16 15:23:13.948635 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:13.948576 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx" Apr 16 15:23:14.074934 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:14.074905 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx"] Apr 16 15:23:14.076390 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:23:14.076364 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4491908_7717_4bb0_8aff_2ff0d5ad771a.slice/crio-3ef59464ba97faa3328c4b4171278f738de288eb926636d591ec68b9f13733e9 WatchSource:0}: Error finding container 3ef59464ba97faa3328c4b4171278f738de288eb926636d591ec68b9f13733e9: Status 404 returned error can't find the container with id 3ef59464ba97faa3328c4b4171278f738de288eb926636d591ec68b9f13733e9 Apr 16 15:23:14.211295 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:14.211229 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx" event={"ID":"b4491908-7717-4bb0-8aff-2ff0d5ad771a","Type":"ContainerStarted","Data":"c33dd9770afb42b3d454bf694b2ef7f202fcaa139c0010e8b810e3ec1c003fbd"} Apr 16 15:23:14.211295 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:14.211262 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx" 
event={"ID":"b4491908-7717-4bb0-8aff-2ff0d5ad771a","Type":"ContainerStarted","Data":"3ef59464ba97faa3328c4b4171278f738de288eb926636d591ec68b9f13733e9"} Apr 16 15:23:15.217121 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:15.217042 2568 generic.go:358] "Generic (PLEG): container finished" podID="b4491908-7717-4bb0-8aff-2ff0d5ad771a" containerID="c33dd9770afb42b3d454bf694b2ef7f202fcaa139c0010e8b810e3ec1c003fbd" exitCode=0 Apr 16 15:23:15.217459 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:15.217131 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx" event={"ID":"b4491908-7717-4bb0-8aff-2ff0d5ad771a","Type":"ContainerDied","Data":"c33dd9770afb42b3d454bf694b2ef7f202fcaa139c0010e8b810e3ec1c003fbd"} Apr 16 15:23:16.222618 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:16.222577 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx" event={"ID":"b4491908-7717-4bb0-8aff-2ff0d5ad771a","Type":"ContainerStarted","Data":"a751d78c9f5bcd33d819a2fd0ee8e58124307b416c39b23f87e2349cf0c04626"} Apr 16 15:23:16.222618 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:16.222622 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx" event={"ID":"b4491908-7717-4bb0-8aff-2ff0d5ad771a","Type":"ContainerStarted","Data":"7a073a8af43a20065362cb88063d7091ff8d1491eaf70e888c6f3716e298324e"} Apr 16 15:23:16.223063 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:16.222698 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx" Apr 16 15:23:16.244983 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:16.244931 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx" podStartSLOduration=3.244915193 podStartE2EDuration="3.244915193s" podCreationTimestamp="2026-04-16 15:23:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:23:16.241492103 +0000 UTC m=+1835.005444227" watchObservedRunningTime="2026-04-16 15:23:16.244915193 +0000 UTC m=+1835.008867303" Apr 16 15:23:17.565357 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:17.565324 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c"] Apr 16 15:23:17.565821 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:17.565731 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c" podUID="e9e4b2dd-1e61-48dc-aa73-cb6994f4220c" containerName="main" containerID="cri-o://181e0477dab760f75c548d9a42cb1a9c4cb18009414f46f19684c0717f6873f6" gracePeriod=30 Apr 16 15:23:17.565886 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:17.565804 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c" podUID="e9e4b2dd-1e61-48dc-aa73-cb6994f4220c" containerName="tokenizer" containerID="cri-o://8ca383b04563434244e594490e8058f1c7e66395ca7d31278a9e7b12b49e815a" gracePeriod=30 Apr 16 15:23:18.240371 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:18.240333 2568 generic.go:358] "Generic (PLEG): container finished" podID="e9e4b2dd-1e61-48dc-aa73-cb6994f4220c" containerID="181e0477dab760f75c548d9a42cb1a9c4cb18009414f46f19684c0717f6873f6" exitCode=0 Apr 16 15:23:18.240585 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:18.240410 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c" event={"ID":"e9e4b2dd-1e61-48dc-aa73-cb6994f4220c","Type":"ContainerDied","Data":"181e0477dab760f75c548d9a42cb1a9c4cb18009414f46f19684c0717f6873f6"} Apr 16 15:23:18.337137 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:23:18.337111 2568 logging.go:55] [core] [Channel #663 SubChannel #664]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.61:9003", ServerName: "10.133.0.61:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.61:9003: connect: connection refused" Apr 16 15:23:18.737298 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:18.737272 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c" Apr 16 15:23:18.885497 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:18.885461 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e9e4b2dd-1e61-48dc-aa73-cb6994f4220c-kserve-provision-location\") pod \"e9e4b2dd-1e61-48dc-aa73-cb6994f4220c\" (UID: \"e9e4b2dd-1e61-48dc-aa73-cb6994f4220c\") " Apr 16 15:23:18.885497 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:18.885499 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e9e4b2dd-1e61-48dc-aa73-cb6994f4220c-tokenizer-cache\") pod \"e9e4b2dd-1e61-48dc-aa73-cb6994f4220c\" (UID: \"e9e4b2dd-1e61-48dc-aa73-cb6994f4220c\") " Apr 16 15:23:18.885718 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:18.885530 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e9e4b2dd-1e61-48dc-aa73-cb6994f4220c-tls-certs\") pod \"e9e4b2dd-1e61-48dc-aa73-cb6994f4220c\" (UID: \"e9e4b2dd-1e61-48dc-aa73-cb6994f4220c\") " Apr 16 
15:23:18.885718 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:18.885558 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbpbr\" (UniqueName: \"kubernetes.io/projected/e9e4b2dd-1e61-48dc-aa73-cb6994f4220c-kube-api-access-lbpbr\") pod \"e9e4b2dd-1e61-48dc-aa73-cb6994f4220c\" (UID: \"e9e4b2dd-1e61-48dc-aa73-cb6994f4220c\") " Apr 16 15:23:18.885718 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:18.885595 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e9e4b2dd-1e61-48dc-aa73-cb6994f4220c-tokenizer-uds\") pod \"e9e4b2dd-1e61-48dc-aa73-cb6994f4220c\" (UID: \"e9e4b2dd-1e61-48dc-aa73-cb6994f4220c\") " Apr 16 15:23:18.885718 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:18.885659 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e9e4b2dd-1e61-48dc-aa73-cb6994f4220c-tokenizer-tmp\") pod \"e9e4b2dd-1e61-48dc-aa73-cb6994f4220c\" (UID: \"e9e4b2dd-1e61-48dc-aa73-cb6994f4220c\") " Apr 16 15:23:18.885944 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:18.885719 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9e4b2dd-1e61-48dc-aa73-cb6994f4220c-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "e9e4b2dd-1e61-48dc-aa73-cb6994f4220c" (UID: "e9e4b2dd-1e61-48dc-aa73-cb6994f4220c"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:23:18.886031 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:18.886010 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e9e4b2dd-1e61-48dc-aa73-cb6994f4220c-tokenizer-cache\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:23:18.886084 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:18.886007 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9e4b2dd-1e61-48dc-aa73-cb6994f4220c-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "e9e4b2dd-1e61-48dc-aa73-cb6994f4220c" (UID: "e9e4b2dd-1e61-48dc-aa73-cb6994f4220c"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:23:18.886299 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:18.886273 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9e4b2dd-1e61-48dc-aa73-cb6994f4220c-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "e9e4b2dd-1e61-48dc-aa73-cb6994f4220c" (UID: "e9e4b2dd-1e61-48dc-aa73-cb6994f4220c"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:23:18.886544 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:18.886521 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9e4b2dd-1e61-48dc-aa73-cb6994f4220c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e9e4b2dd-1e61-48dc-aa73-cb6994f4220c" (UID: "e9e4b2dd-1e61-48dc-aa73-cb6994f4220c"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:23:18.888209 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:18.888180 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9e4b2dd-1e61-48dc-aa73-cb6994f4220c-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e9e4b2dd-1e61-48dc-aa73-cb6994f4220c" (UID: "e9e4b2dd-1e61-48dc-aa73-cb6994f4220c"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:23:18.888477 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:18.888458 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9e4b2dd-1e61-48dc-aa73-cb6994f4220c-kube-api-access-lbpbr" (OuterVolumeSpecName: "kube-api-access-lbpbr") pod "e9e4b2dd-1e61-48dc-aa73-cb6994f4220c" (UID: "e9e4b2dd-1e61-48dc-aa73-cb6994f4220c"). InnerVolumeSpecName "kube-api-access-lbpbr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:23:18.987029 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:18.987004 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e9e4b2dd-1e61-48dc-aa73-cb6994f4220c-kserve-provision-location\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:23:18.987029 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:18.987028 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e9e4b2dd-1e61-48dc-aa73-cb6994f4220c-tls-certs\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:23:18.987164 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:18.987046 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lbpbr\" (UniqueName: \"kubernetes.io/projected/e9e4b2dd-1e61-48dc-aa73-cb6994f4220c-kube-api-access-lbpbr\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:23:18.987164 ip-10-0-140-83 kubenswrapper[2568]: I0416 
15:23:18.987059 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e9e4b2dd-1e61-48dc-aa73-cb6994f4220c-tokenizer-uds\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:23:18.987164 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:18.987072 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e9e4b2dd-1e61-48dc-aa73-cb6994f4220c-tokenizer-tmp\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:23:19.247486 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:19.247411 2568 generic.go:358] "Generic (PLEG): container finished" podID="e9e4b2dd-1e61-48dc-aa73-cb6994f4220c" containerID="8ca383b04563434244e594490e8058f1c7e66395ca7d31278a9e7b12b49e815a" exitCode=0 Apr 16 15:23:19.247614 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:19.247493 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c" Apr 16 15:23:19.247614 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:19.247492 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c" event={"ID":"e9e4b2dd-1e61-48dc-aa73-cb6994f4220c","Type":"ContainerDied","Data":"8ca383b04563434244e594490e8058f1c7e66395ca7d31278a9e7b12b49e815a"} Apr 16 15:23:19.247614 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:19.247591 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c" event={"ID":"e9e4b2dd-1e61-48dc-aa73-cb6994f4220c","Type":"ContainerDied","Data":"0d868198c1b7a6bd380e247e63f60675c10f831b6f94ca677ab5f3793a1fb1e1"} Apr 16 15:23:19.247614 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:19.247608 2568 scope.go:117] "RemoveContainer" 
containerID="8ca383b04563434244e594490e8058f1c7e66395ca7d31278a9e7b12b49e815a" Apr 16 15:23:19.256913 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:19.256884 2568 scope.go:117] "RemoveContainer" containerID="181e0477dab760f75c548d9a42cb1a9c4cb18009414f46f19684c0717f6873f6" Apr 16 15:23:19.264742 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:19.264724 2568 scope.go:117] "RemoveContainer" containerID="17ddc4c837c5fbc063e2835aea7c58e9e953cb6b24f91905e63d98145f46df56" Apr 16 15:23:19.270502 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:19.270480 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c"] Apr 16 15:23:19.273430 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:19.273415 2568 scope.go:117] "RemoveContainer" containerID="8ca383b04563434244e594490e8058f1c7e66395ca7d31278a9e7b12b49e815a" Apr 16 15:23:19.273695 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:23:19.273663 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ca383b04563434244e594490e8058f1c7e66395ca7d31278a9e7b12b49e815a\": container with ID starting with 8ca383b04563434244e594490e8058f1c7e66395ca7d31278a9e7b12b49e815a not found: ID does not exist" containerID="8ca383b04563434244e594490e8058f1c7e66395ca7d31278a9e7b12b49e815a" Apr 16 15:23:19.273788 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:19.273691 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ca383b04563434244e594490e8058f1c7e66395ca7d31278a9e7b12b49e815a"} err="failed to get container status \"8ca383b04563434244e594490e8058f1c7e66395ca7d31278a9e7b12b49e815a\": rpc error: code = NotFound desc = could not find container \"8ca383b04563434244e594490e8058f1c7e66395ca7d31278a9e7b12b49e815a\": container with ID starting with 8ca383b04563434244e594490e8058f1c7e66395ca7d31278a9e7b12b49e815a not found: ID does not exist" Apr 16 
15:23:19.273788 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:19.273706 2568 scope.go:117] "RemoveContainer" containerID="181e0477dab760f75c548d9a42cb1a9c4cb18009414f46f19684c0717f6873f6" Apr 16 15:23:19.273984 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:23:19.273963 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"181e0477dab760f75c548d9a42cb1a9c4cb18009414f46f19684c0717f6873f6\": container with ID starting with 181e0477dab760f75c548d9a42cb1a9c4cb18009414f46f19684c0717f6873f6 not found: ID does not exist" containerID="181e0477dab760f75c548d9a42cb1a9c4cb18009414f46f19684c0717f6873f6" Apr 16 15:23:19.274074 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:19.273989 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"181e0477dab760f75c548d9a42cb1a9c4cb18009414f46f19684c0717f6873f6"} err="failed to get container status \"181e0477dab760f75c548d9a42cb1a9c4cb18009414f46f19684c0717f6873f6\": rpc error: code = NotFound desc = could not find container \"181e0477dab760f75c548d9a42cb1a9c4cb18009414f46f19684c0717f6873f6\": container with ID starting with 181e0477dab760f75c548d9a42cb1a9c4cb18009414f46f19684c0717f6873f6 not found: ID does not exist" Apr 16 15:23:19.274074 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:19.274005 2568 scope.go:117] "RemoveContainer" containerID="17ddc4c837c5fbc063e2835aea7c58e9e953cb6b24f91905e63d98145f46df56" Apr 16 15:23:19.274261 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:23:19.274243 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17ddc4c837c5fbc063e2835aea7c58e9e953cb6b24f91905e63d98145f46df56\": container with ID starting with 17ddc4c837c5fbc063e2835aea7c58e9e953cb6b24f91905e63d98145f46df56 not found: ID does not exist" containerID="17ddc4c837c5fbc063e2835aea7c58e9e953cb6b24f91905e63d98145f46df56" Apr 16 15:23:19.274324 
ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:19.274266 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17ddc4c837c5fbc063e2835aea7c58e9e953cb6b24f91905e63d98145f46df56"} err="failed to get container status \"17ddc4c837c5fbc063e2835aea7c58e9e953cb6b24f91905e63d98145f46df56\": rpc error: code = NotFound desc = could not find container \"17ddc4c837c5fbc063e2835aea7c58e9e953cb6b24f91905e63d98145f46df56\": container with ID starting with 17ddc4c837c5fbc063e2835aea7c58e9e953cb6b24f91905e63d98145f46df56 not found: ID does not exist" Apr 16 15:23:19.275480 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:19.275457 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c"] Apr 16 15:23:19.336954 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:19.336925 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f67xcs7c" podUID="e9e4b2dd-1e61-48dc-aa73-cb6994f4220c" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.61:9003\" within 1s: context deadline exceeded" Apr 16 15:23:19.864712 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:19.864679 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9e4b2dd-1e61-48dc-aa73-cb6994f4220c" path="/var/lib/kubelet/pods/e9e4b2dd-1e61-48dc-aa73-cb6994f4220c/volumes" Apr 16 15:23:23.949506 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:23.949475 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx" Apr 16 15:23:23.949506 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:23.949513 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx" Apr 
16 15:23:23.952052 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:23.952028 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx" Apr 16 15:23:24.269726 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:24.269648 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx" Apr 16 15:23:39.030871 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:39.030837 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns"] Apr 16 15:23:39.031364 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:39.031239 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e9e4b2dd-1e61-48dc-aa73-cb6994f4220c" containerName="main" Apr 16 15:23:39.031364 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:39.031250 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9e4b2dd-1e61-48dc-aa73-cb6994f4220c" containerName="main" Apr 16 15:23:39.031364 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:39.031261 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e9e4b2dd-1e61-48dc-aa73-cb6994f4220c" containerName="tokenizer" Apr 16 15:23:39.031364 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:39.031266 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9e4b2dd-1e61-48dc-aa73-cb6994f4220c" containerName="tokenizer" Apr 16 15:23:39.031364 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:39.031282 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e9e4b2dd-1e61-48dc-aa73-cb6994f4220c" containerName="storage-initializer" Apr 16 15:23:39.031364 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:39.031288 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9e4b2dd-1e61-48dc-aa73-cb6994f4220c" 
containerName="storage-initializer" Apr 16 15:23:39.031364 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:39.031361 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="e9e4b2dd-1e61-48dc-aa73-cb6994f4220c" containerName="tokenizer" Apr 16 15:23:39.031600 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:39.031370 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="e9e4b2dd-1e61-48dc-aa73-cb6994f4220c" containerName="main" Apr 16 15:23:39.036174 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:39.036157 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns" Apr 16 15:23:39.038658 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:39.038631 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 16 15:23:39.038936 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:39.038916 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-epp-sa-dockercfg-ghgxr\"" Apr 16 15:23:39.043669 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:39.043645 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns"] Apr 16 15:23:39.146147 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:39.146122 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/39c7ef6a-d400-49e9-bf74-0a4d5116cc58-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns\" (UID: \"39c7ef6a-d400-49e9-bf74-0a4d5116cc58\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns" Apr 16 15:23:39.146249 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:39.146159 2568 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/39c7ef6a-d400-49e9-bf74-0a4d5116cc58-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns\" (UID: \"39c7ef6a-d400-49e9-bf74-0a4d5116cc58\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns" Apr 16 15:23:39.146249 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:39.146186 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/39c7ef6a-d400-49e9-bf74-0a4d5116cc58-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns\" (UID: \"39c7ef6a-d400-49e9-bf74-0a4d5116cc58\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns" Apr 16 15:23:39.146249 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:39.146232 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/39c7ef6a-d400-49e9-bf74-0a4d5116cc58-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns\" (UID: \"39c7ef6a-d400-49e9-bf74-0a4d5116cc58\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns" Apr 16 15:23:39.146357 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:39.146265 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmrn2\" (UniqueName: \"kubernetes.io/projected/39c7ef6a-d400-49e9-bf74-0a4d5116cc58-kube-api-access-cmrn2\") pod \"router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns\" (UID: \"39c7ef6a-d400-49e9-bf74-0a4d5116cc58\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns" Apr 16 15:23:39.146357 
ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:39.146301 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/39c7ef6a-d400-49e9-bf74-0a4d5116cc58-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns\" (UID: \"39c7ef6a-d400-49e9-bf74-0a4d5116cc58\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns" Apr 16 15:23:39.247076 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:39.247053 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/39c7ef6a-d400-49e9-bf74-0a4d5116cc58-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns\" (UID: \"39c7ef6a-d400-49e9-bf74-0a4d5116cc58\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns" Apr 16 15:23:39.247169 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:39.247092 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/39c7ef6a-d400-49e9-bf74-0a4d5116cc58-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns\" (UID: \"39c7ef6a-d400-49e9-bf74-0a4d5116cc58\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns" Apr 16 15:23:39.247169 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:39.247119 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/39c7ef6a-d400-49e9-bf74-0a4d5116cc58-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns\" (UID: \"39c7ef6a-d400-49e9-bf74-0a4d5116cc58\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns" Apr 16 15:23:39.247169 ip-10-0-140-83 
kubenswrapper[2568]: I0416 15:23:39.247152 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/39c7ef6a-d400-49e9-bf74-0a4d5116cc58-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns\" (UID: \"39c7ef6a-d400-49e9-bf74-0a4d5116cc58\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns" Apr 16 15:23:39.247308 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:39.247172 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cmrn2\" (UniqueName: \"kubernetes.io/projected/39c7ef6a-d400-49e9-bf74-0a4d5116cc58-kube-api-access-cmrn2\") pod \"router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns\" (UID: \"39c7ef6a-d400-49e9-bf74-0a4d5116cc58\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns" Apr 16 15:23:39.247308 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:39.247195 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/39c7ef6a-d400-49e9-bf74-0a4d5116cc58-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns\" (UID: \"39c7ef6a-d400-49e9-bf74-0a4d5116cc58\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns" Apr 16 15:23:39.247532 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:39.247511 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/39c7ef6a-d400-49e9-bf74-0a4d5116cc58-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns\" (UID: \"39c7ef6a-d400-49e9-bf74-0a4d5116cc58\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns" Apr 16 15:23:39.247532 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:39.247520 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/39c7ef6a-d400-49e9-bf74-0a4d5116cc58-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns\" (UID: \"39c7ef6a-d400-49e9-bf74-0a4d5116cc58\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns" Apr 16 15:23:39.247625 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:39.247561 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/39c7ef6a-d400-49e9-bf74-0a4d5116cc58-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns\" (UID: \"39c7ef6a-d400-49e9-bf74-0a4d5116cc58\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns" Apr 16 15:23:39.247625 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:39.247605 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/39c7ef6a-d400-49e9-bf74-0a4d5116cc58-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns\" (UID: \"39c7ef6a-d400-49e9-bf74-0a4d5116cc58\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns" Apr 16 15:23:39.249772 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:39.249750 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/39c7ef6a-d400-49e9-bf74-0a4d5116cc58-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns\" (UID: \"39c7ef6a-d400-49e9-bf74-0a4d5116cc58\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns" Apr 16 15:23:39.254731 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:39.254706 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-cmrn2\" (UniqueName: \"kubernetes.io/projected/39c7ef6a-d400-49e9-bf74-0a4d5116cc58-kube-api-access-cmrn2\") pod \"router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns\" (UID: \"39c7ef6a-d400-49e9-bf74-0a4d5116cc58\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns" Apr 16 15:23:39.348563 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:39.348536 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns" Apr 16 15:23:39.473557 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:39.473529 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns"] Apr 16 15:23:39.474842 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:23:39.474810 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39c7ef6a_d400_49e9_bf74_0a4d5116cc58.slice/crio-62dc5a03b705094bb4ba05353fe4da576841300211e1aefed460e5d284da1079 WatchSource:0}: Error finding container 62dc5a03b705094bb4ba05353fe4da576841300211e1aefed460e5d284da1079: Status 404 returned error can't find the container with id 62dc5a03b705094bb4ba05353fe4da576841300211e1aefed460e5d284da1079 Apr 16 15:23:40.342172 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:40.342135 2568 generic.go:358] "Generic (PLEG): container finished" podID="39c7ef6a-d400-49e9-bf74-0a4d5116cc58" containerID="0e0df542a00a1764934d3c4705fe6376dc281b3f1d3c6e7e2a072236e50ea5a8" exitCode=0 Apr 16 15:23:40.342482 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:40.342219 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns" 
event={"ID":"39c7ef6a-d400-49e9-bf74-0a4d5116cc58","Type":"ContainerDied","Data":"0e0df542a00a1764934d3c4705fe6376dc281b3f1d3c6e7e2a072236e50ea5a8"} Apr 16 15:23:40.342482 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:40.342254 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns" event={"ID":"39c7ef6a-d400-49e9-bf74-0a4d5116cc58","Type":"ContainerStarted","Data":"62dc5a03b705094bb4ba05353fe4da576841300211e1aefed460e5d284da1079"} Apr 16 15:23:41.351315 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:41.351278 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns" event={"ID":"39c7ef6a-d400-49e9-bf74-0a4d5116cc58","Type":"ContainerStarted","Data":"9fa9fcb7dcb64246252945472bc6c68900d83144381ed4dd1a97fd264837cdd8"} Apr 16 15:23:41.351315 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:41.351313 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns" event={"ID":"39c7ef6a-d400-49e9-bf74-0a4d5116cc58","Type":"ContainerStarted","Data":"a168168c2f5b12adb940f651c6095682ad0b9d788db569d7b922e19e366d1b58"} Apr 16 15:23:41.351851 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:41.351363 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns" Apr 16 15:23:41.373921 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:41.373846 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns" podStartSLOduration=2.373826069 podStartE2EDuration="2.373826069s" podCreationTimestamp="2026-04-16 15:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-16 15:23:41.36935256 +0000 UTC m=+1860.133304694" watchObservedRunningTime="2026-04-16 15:23:41.373826069 +0000 UTC m=+1860.137778181" Apr 16 15:23:45.274668 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:45.274640 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx" Apr 16 15:23:49.348858 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:49.348825 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns" Apr 16 15:23:49.349317 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:49.348870 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns" Apr 16 15:23:49.351609 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:49.351578 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns" Apr 16 15:23:49.383964 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:23:49.383941 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns" Apr 16 15:24:10.388946 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:24:10.388861 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns" Apr 16 15:24:12.730710 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:24:12.730674 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx"] Apr 16 15:24:12.731140 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:24:12.731002 2568 kuberuntime_container.go:864] "Killing container with 
a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx" podUID="b4491908-7717-4bb0-8aff-2ff0d5ad771a" containerName="main" containerID="cri-o://7a073a8af43a20065362cb88063d7091ff8d1491eaf70e888c6f3716e298324e" gracePeriod=30 Apr 16 15:24:12.731140 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:24:12.731057 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx" podUID="b4491908-7717-4bb0-8aff-2ff0d5ad771a" containerName="tokenizer" containerID="cri-o://a751d78c9f5bcd33d819a2fd0ee8e58124307b416c39b23f87e2349cf0c04626" gracePeriod=30 Apr 16 15:24:13.496397 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:24:13.496355 2568 generic.go:358] "Generic (PLEG): container finished" podID="b4491908-7717-4bb0-8aff-2ff0d5ad771a" containerID="7a073a8af43a20065362cb88063d7091ff8d1491eaf70e888c6f3716e298324e" exitCode=0 Apr 16 15:24:13.496578 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:24:13.496423 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx" event={"ID":"b4491908-7717-4bb0-8aff-2ff0d5ad771a","Type":"ContainerDied","Data":"7a073a8af43a20065362cb88063d7091ff8d1491eaf70e888c6f3716e298324e"} Apr 16 15:24:13.881825 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:24:13.881805 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx" Apr 16 15:24:13.923376 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:24:13.923346 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b4491908-7717-4bb0-8aff-2ff0d5ad771a-kserve-provision-location\") pod \"b4491908-7717-4bb0-8aff-2ff0d5ad771a\" (UID: \"b4491908-7717-4bb0-8aff-2ff0d5ad771a\") " Apr 16 15:24:13.923483 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:24:13.923406 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-892sc\" (UniqueName: \"kubernetes.io/projected/b4491908-7717-4bb0-8aff-2ff0d5ad771a-kube-api-access-892sc\") pod \"b4491908-7717-4bb0-8aff-2ff0d5ad771a\" (UID: \"b4491908-7717-4bb0-8aff-2ff0d5ad771a\") " Apr 16 15:24:13.923545 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:24:13.923527 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b4491908-7717-4bb0-8aff-2ff0d5ad771a-tokenizer-tmp\") pod \"b4491908-7717-4bb0-8aff-2ff0d5ad771a\" (UID: \"b4491908-7717-4bb0-8aff-2ff0d5ad771a\") " Apr 16 15:24:13.923601 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:24:13.923584 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b4491908-7717-4bb0-8aff-2ff0d5ad771a-tls-certs\") pod \"b4491908-7717-4bb0-8aff-2ff0d5ad771a\" (UID: \"b4491908-7717-4bb0-8aff-2ff0d5ad771a\") " Apr 16 15:24:13.923703 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:24:13.923683 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b4491908-7717-4bb0-8aff-2ff0d5ad771a-tokenizer-cache\") pod \"b4491908-7717-4bb0-8aff-2ff0d5ad771a\" (UID: \"b4491908-7717-4bb0-8aff-2ff0d5ad771a\") " 
Apr 16 15:24:13.923838 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:24:13.923719 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b4491908-7717-4bb0-8aff-2ff0d5ad771a-tokenizer-uds\") pod \"b4491908-7717-4bb0-8aff-2ff0d5ad771a\" (UID: \"b4491908-7717-4bb0-8aff-2ff0d5ad771a\") " Apr 16 15:24:13.923927 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:24:13.923853 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4491908-7717-4bb0-8aff-2ff0d5ad771a-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "b4491908-7717-4bb0-8aff-2ff0d5ad771a" (UID: "b4491908-7717-4bb0-8aff-2ff0d5ad771a"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:24:13.923927 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:24:13.923873 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4491908-7717-4bb0-8aff-2ff0d5ad771a-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "b4491908-7717-4bb0-8aff-2ff0d5ad771a" (UID: "b4491908-7717-4bb0-8aff-2ff0d5ad771a"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:24:13.924059 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:24:13.924047 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b4491908-7717-4bb0-8aff-2ff0d5ad771a-tokenizer-tmp\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\""
Apr 16 15:24:13.924116 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:24:13.924067 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b4491908-7717-4bb0-8aff-2ff0d5ad771a-tokenizer-cache\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\""
Apr 16 15:24:13.924116 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:24:13.924075 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4491908-7717-4bb0-8aff-2ff0d5ad771a-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "b4491908-7717-4bb0-8aff-2ff0d5ad771a" (UID: "b4491908-7717-4bb0-8aff-2ff0d5ad771a"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:24:13.924300 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:24:13.924281 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4491908-7717-4bb0-8aff-2ff0d5ad771a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b4491908-7717-4bb0-8aff-2ff0d5ad771a" (UID: "b4491908-7717-4bb0-8aff-2ff0d5ad771a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:24:13.925662 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:24:13.925641 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4491908-7717-4bb0-8aff-2ff0d5ad771a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "b4491908-7717-4bb0-8aff-2ff0d5ad771a" (UID: "b4491908-7717-4bb0-8aff-2ff0d5ad771a"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 15:24:13.926029 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:24:13.926005 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4491908-7717-4bb0-8aff-2ff0d5ad771a-kube-api-access-892sc" (OuterVolumeSpecName: "kube-api-access-892sc") pod "b4491908-7717-4bb0-8aff-2ff0d5ad771a" (UID: "b4491908-7717-4bb0-8aff-2ff0d5ad771a"). InnerVolumeSpecName "kube-api-access-892sc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 15:24:14.025179 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:24:14.025131 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b4491908-7717-4bb0-8aff-2ff0d5ad771a-tokenizer-uds\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\""
Apr 16 15:24:14.025179 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:24:14.025152 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b4491908-7717-4bb0-8aff-2ff0d5ad771a-kserve-provision-location\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\""
Apr 16 15:24:14.025179 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:24:14.025162 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-892sc\" (UniqueName: \"kubernetes.io/projected/b4491908-7717-4bb0-8aff-2ff0d5ad771a-kube-api-access-892sc\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\""
Apr 16 15:24:14.025179 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:24:14.025171 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b4491908-7717-4bb0-8aff-2ff0d5ad771a-tls-certs\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\""
Apr 16 15:24:14.502400 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:24:14.502366 2568 generic.go:358] "Generic (PLEG): container finished" podID="b4491908-7717-4bb0-8aff-2ff0d5ad771a" containerID="a751d78c9f5bcd33d819a2fd0ee8e58124307b416c39b23f87e2349cf0c04626" exitCode=0
Apr 16 15:24:14.502549 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:24:14.502420 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx" event={"ID":"b4491908-7717-4bb0-8aff-2ff0d5ad771a","Type":"ContainerDied","Data":"a751d78c9f5bcd33d819a2fd0ee8e58124307b416c39b23f87e2349cf0c04626"}
Apr 16 15:24:14.502549 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:24:14.502444 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx" event={"ID":"b4491908-7717-4bb0-8aff-2ff0d5ad771a","Type":"ContainerDied","Data":"3ef59464ba97faa3328c4b4171278f738de288eb926636d591ec68b9f13733e9"}
Apr 16 15:24:14.502549 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:24:14.502459 2568 scope.go:117] "RemoveContainer" containerID="a751d78c9f5bcd33d819a2fd0ee8e58124307b416c39b23f87e2349cf0c04626"
Apr 16 15:24:14.502549 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:24:14.502475 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx"
Apr 16 15:24:14.515108 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:24:14.515079 2568 scope.go:117] "RemoveContainer" containerID="7a073a8af43a20065362cb88063d7091ff8d1491eaf70e888c6f3716e298324e"
Apr 16 15:24:14.523670 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:24:14.523581 2568 scope.go:117] "RemoveContainer" containerID="c33dd9770afb42b3d454bf694b2ef7f202fcaa139c0010e8b810e3ec1c003fbd"
Apr 16 15:24:14.527134 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:24:14.527109 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx"]
Apr 16 15:24:14.530496 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:24:14.530472 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5b59ffw7fx"]
Apr 16 15:24:14.531755 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:24:14.531741 2568 scope.go:117] "RemoveContainer" containerID="a751d78c9f5bcd33d819a2fd0ee8e58124307b416c39b23f87e2349cf0c04626"
Apr 16 15:24:14.532089 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:24:14.532059 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a751d78c9f5bcd33d819a2fd0ee8e58124307b416c39b23f87e2349cf0c04626\": container with ID starting with a751d78c9f5bcd33d819a2fd0ee8e58124307b416c39b23f87e2349cf0c04626 not found: ID does not exist" containerID="a751d78c9f5bcd33d819a2fd0ee8e58124307b416c39b23f87e2349cf0c04626"
Apr 16 15:24:14.532160 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:24:14.532098 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a751d78c9f5bcd33d819a2fd0ee8e58124307b416c39b23f87e2349cf0c04626"} err="failed to get container status \"a751d78c9f5bcd33d819a2fd0ee8e58124307b416c39b23f87e2349cf0c04626\": rpc error: code = NotFound desc = could not find container \"a751d78c9f5bcd33d819a2fd0ee8e58124307b416c39b23f87e2349cf0c04626\": container with ID starting with a751d78c9f5bcd33d819a2fd0ee8e58124307b416c39b23f87e2349cf0c04626 not found: ID does not exist"
Apr 16 15:24:14.532160 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:24:14.532116 2568 scope.go:117] "RemoveContainer" containerID="7a073a8af43a20065362cb88063d7091ff8d1491eaf70e888c6f3716e298324e"
Apr 16 15:24:14.532354 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:24:14.532334 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a073a8af43a20065362cb88063d7091ff8d1491eaf70e888c6f3716e298324e\": container with ID starting with 7a073a8af43a20065362cb88063d7091ff8d1491eaf70e888c6f3716e298324e not found: ID does not exist" containerID="7a073a8af43a20065362cb88063d7091ff8d1491eaf70e888c6f3716e298324e"
Apr 16 15:24:14.532393 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:24:14.532360 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a073a8af43a20065362cb88063d7091ff8d1491eaf70e888c6f3716e298324e"} err="failed to get container status \"7a073a8af43a20065362cb88063d7091ff8d1491eaf70e888c6f3716e298324e\": rpc error: code = NotFound desc = could not find container \"7a073a8af43a20065362cb88063d7091ff8d1491eaf70e888c6f3716e298324e\": container with ID starting with 7a073a8af43a20065362cb88063d7091ff8d1491eaf70e888c6f3716e298324e not found: ID does not exist"
Apr 16 15:24:14.532393 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:24:14.532376 2568 scope.go:117] "RemoveContainer" containerID="c33dd9770afb42b3d454bf694b2ef7f202fcaa139c0010e8b810e3ec1c003fbd"
Apr 16 15:24:14.532600 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:24:14.532586 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c33dd9770afb42b3d454bf694b2ef7f202fcaa139c0010e8b810e3ec1c003fbd\": container with ID starting with c33dd9770afb42b3d454bf694b2ef7f202fcaa139c0010e8b810e3ec1c003fbd not found: ID does not exist" containerID="c33dd9770afb42b3d454bf694b2ef7f202fcaa139c0010e8b810e3ec1c003fbd"
Apr 16 15:24:14.532641 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:24:14.532604 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c33dd9770afb42b3d454bf694b2ef7f202fcaa139c0010e8b810e3ec1c003fbd"} err="failed to get container status \"c33dd9770afb42b3d454bf694b2ef7f202fcaa139c0010e8b810e3ec1c003fbd\": rpc error: code = NotFound desc = could not find container \"c33dd9770afb42b3d454bf694b2ef7f202fcaa139c0010e8b810e3ec1c003fbd\": container with ID starting with c33dd9770afb42b3d454bf694b2ef7f202fcaa139c0010e8b810e3ec1c003fbd not found: ID does not exist"
Apr 16 15:24:15.863478 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:24:15.863445 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4491908-7717-4bb0-8aff-2ff0d5ad771a" path="/var/lib/kubelet/pods/b4491908-7717-4bb0-8aff-2ff0d5ad771a/volumes"
Apr 16 15:25:30.218777 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:30.218744 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns"]
Apr 16 15:25:30.219315 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:30.219053 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns" podUID="39c7ef6a-d400-49e9-bf74-0a4d5116cc58" containerName="main" containerID="cri-o://a168168c2f5b12adb940f651c6095682ad0b9d788db569d7b922e19e366d1b58" gracePeriod=30
Apr 16 15:25:30.219315 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:30.219105 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns" podUID="39c7ef6a-d400-49e9-bf74-0a4d5116cc58" containerName="tokenizer" containerID="cri-o://9fa9fcb7dcb64246252945472bc6c68900d83144381ed4dd1a97fd264837cdd8" gracePeriod=30
Apr 16 15:25:30.387641 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:25:30.387574 2568 logging.go:55] [core] [Channel #746 SubChannel #747]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.64:9003", ServerName: "10.133.0.64:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.64:9003: connect: connection refused"
Apr 16 15:25:30.840196 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:30.840161 2568 generic.go:358] "Generic (PLEG): container finished" podID="39c7ef6a-d400-49e9-bf74-0a4d5116cc58" containerID="a168168c2f5b12adb940f651c6095682ad0b9d788db569d7b922e19e366d1b58" exitCode=0
Apr 16 15:25:30.840366 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:30.840235 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns" event={"ID":"39c7ef6a-d400-49e9-bf74-0a4d5116cc58","Type":"ContainerDied","Data":"a168168c2f5b12adb940f651c6095682ad0b9d788db569d7b922e19e366d1b58"}
Apr 16 15:25:31.364262 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.364237 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns"
Apr 16 15:25:31.387991 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.387956 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns" podUID="39c7ef6a-d400-49e9-bf74-0a4d5116cc58" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.64:9003\" within 1s: context deadline exceeded"
Apr 16 15:25:31.433384 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.433320 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/39c7ef6a-d400-49e9-bf74-0a4d5116cc58-tokenizer-tmp\") pod \"39c7ef6a-d400-49e9-bf74-0a4d5116cc58\" (UID: \"39c7ef6a-d400-49e9-bf74-0a4d5116cc58\") "
Apr 16 15:25:31.433384 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.433374 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/39c7ef6a-d400-49e9-bf74-0a4d5116cc58-tokenizer-cache\") pod \"39c7ef6a-d400-49e9-bf74-0a4d5116cc58\" (UID: \"39c7ef6a-d400-49e9-bf74-0a4d5116cc58\") "
Apr 16 15:25:31.433532 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.433393 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/39c7ef6a-d400-49e9-bf74-0a4d5116cc58-kserve-provision-location\") pod \"39c7ef6a-d400-49e9-bf74-0a4d5116cc58\" (UID: \"39c7ef6a-d400-49e9-bf74-0a4d5116cc58\") "
Apr 16 15:25:31.433532 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.433421 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/39c7ef6a-d400-49e9-bf74-0a4d5116cc58-tokenizer-uds\") pod \"39c7ef6a-d400-49e9-bf74-0a4d5116cc58\" (UID: \"39c7ef6a-d400-49e9-bf74-0a4d5116cc58\") "
Apr 16 15:25:31.433532 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.433442 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmrn2\" (UniqueName: \"kubernetes.io/projected/39c7ef6a-d400-49e9-bf74-0a4d5116cc58-kube-api-access-cmrn2\") pod \"39c7ef6a-d400-49e9-bf74-0a4d5116cc58\" (UID: \"39c7ef6a-d400-49e9-bf74-0a4d5116cc58\") "
Apr 16 15:25:31.433677 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.433554 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/39c7ef6a-d400-49e9-bf74-0a4d5116cc58-tls-certs\") pod \"39c7ef6a-d400-49e9-bf74-0a4d5116cc58\" (UID: \"39c7ef6a-d400-49e9-bf74-0a4d5116cc58\") "
Apr 16 15:25:31.433735 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.433714 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39c7ef6a-d400-49e9-bf74-0a4d5116cc58-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "39c7ef6a-d400-49e9-bf74-0a4d5116cc58" (UID: "39c7ef6a-d400-49e9-bf74-0a4d5116cc58"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:25:31.433735 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.433724 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39c7ef6a-d400-49e9-bf74-0a4d5116cc58-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "39c7ef6a-d400-49e9-bf74-0a4d5116cc58" (UID: "39c7ef6a-d400-49e9-bf74-0a4d5116cc58"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:25:31.433855 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.433751 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39c7ef6a-d400-49e9-bf74-0a4d5116cc58-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "39c7ef6a-d400-49e9-bf74-0a4d5116cc58" (UID: "39c7ef6a-d400-49e9-bf74-0a4d5116cc58"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:25:31.433970 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.433954 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/39c7ef6a-d400-49e9-bf74-0a4d5116cc58-tokenizer-cache\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\""
Apr 16 15:25:31.434016 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.433973 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/39c7ef6a-d400-49e9-bf74-0a4d5116cc58-tokenizer-uds\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\""
Apr 16 15:25:31.434016 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.433983 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/39c7ef6a-d400-49e9-bf74-0a4d5116cc58-tokenizer-tmp\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\""
Apr 16 15:25:31.434230 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.434211 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39c7ef6a-d400-49e9-bf74-0a4d5116cc58-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "39c7ef6a-d400-49e9-bf74-0a4d5116cc58" (UID: "39c7ef6a-d400-49e9-bf74-0a4d5116cc58"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:25:31.435473 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.435446 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39c7ef6a-d400-49e9-bf74-0a4d5116cc58-kube-api-access-cmrn2" (OuterVolumeSpecName: "kube-api-access-cmrn2") pod "39c7ef6a-d400-49e9-bf74-0a4d5116cc58" (UID: "39c7ef6a-d400-49e9-bf74-0a4d5116cc58"). InnerVolumeSpecName "kube-api-access-cmrn2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 15:25:31.435570 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.435538 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39c7ef6a-d400-49e9-bf74-0a4d5116cc58-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "39c7ef6a-d400-49e9-bf74-0a4d5116cc58" (UID: "39c7ef6a-d400-49e9-bf74-0a4d5116cc58"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 15:25:31.467641 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.467618 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wm547/must-gather-6sskp"]
Apr 16 15:25:31.468061 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.468046 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="39c7ef6a-d400-49e9-bf74-0a4d5116cc58" containerName="main"
Apr 16 15:25:31.468061 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.468061 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="39c7ef6a-d400-49e9-bf74-0a4d5116cc58" containerName="main"
Apr 16 15:25:31.468176 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.468074 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="39c7ef6a-d400-49e9-bf74-0a4d5116cc58" containerName="tokenizer"
Apr 16 15:25:31.468176 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.468079 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="39c7ef6a-d400-49e9-bf74-0a4d5116cc58" containerName="tokenizer"
Apr 16 15:25:31.468176 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.468099 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b4491908-7717-4bb0-8aff-2ff0d5ad771a" containerName="storage-initializer"
Apr 16 15:25:31.468176 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.468105 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4491908-7717-4bb0-8aff-2ff0d5ad771a" containerName="storage-initializer"
Apr 16 15:25:31.468176 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.468115 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b4491908-7717-4bb0-8aff-2ff0d5ad771a" containerName="tokenizer"
Apr 16 15:25:31.468176 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.468120 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4491908-7717-4bb0-8aff-2ff0d5ad771a" containerName="tokenizer"
Apr 16 15:25:31.468176 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.468127 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="39c7ef6a-d400-49e9-bf74-0a4d5116cc58" containerName="storage-initializer"
Apr 16 15:25:31.468176 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.468132 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="39c7ef6a-d400-49e9-bf74-0a4d5116cc58" containerName="storage-initializer"
Apr 16 15:25:31.468176 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.468138 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b4491908-7717-4bb0-8aff-2ff0d5ad771a" containerName="main"
Apr 16 15:25:31.468176 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.468143 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4491908-7717-4bb0-8aff-2ff0d5ad771a" containerName="main"
Apr 16 15:25:31.468473 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.468200 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="39c7ef6a-d400-49e9-bf74-0a4d5116cc58" containerName="main"
Apr 16 15:25:31.468473 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.468207 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="39c7ef6a-d400-49e9-bf74-0a4d5116cc58" containerName="tokenizer"
Apr 16 15:25:31.468473 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.468215 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="b4491908-7717-4bb0-8aff-2ff0d5ad771a" containerName="tokenizer"
Apr 16 15:25:31.468473 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.468223 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="b4491908-7717-4bb0-8aff-2ff0d5ad771a" containerName="main"
Apr 16 15:25:31.472487 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.472473 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wm547/must-gather-6sskp"
Apr 16 15:25:31.476162 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.476143 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-wm547\"/\"default-dockercfg-p7zzj\""
Apr 16 15:25:31.476608 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.476594 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wm547\"/\"kube-root-ca.crt\""
Apr 16 15:25:31.477145 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.477132 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wm547\"/\"openshift-service-ca.crt\""
Apr 16 15:25:31.483377 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.483335 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wm547/must-gather-6sskp"]
Apr 16 15:25:31.534379 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.534357 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2szw9\" (UniqueName: \"kubernetes.io/projected/db90d79b-1cd6-434e-852b-8932f3191f9f-kube-api-access-2szw9\") pod \"must-gather-6sskp\" (UID: \"db90d79b-1cd6-434e-852b-8932f3191f9f\") " pod="openshift-must-gather-wm547/must-gather-6sskp"
Apr 16 15:25:31.534466 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.534396 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/db90d79b-1cd6-434e-852b-8932f3191f9f-must-gather-output\") pod \"must-gather-6sskp\" (UID: \"db90d79b-1cd6-434e-852b-8932f3191f9f\") " pod="openshift-must-gather-wm547/must-gather-6sskp"
Apr 16 15:25:31.534548 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.534495 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/39c7ef6a-d400-49e9-bf74-0a4d5116cc58-kserve-provision-location\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\""
Apr 16 15:25:31.534548 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.534511 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cmrn2\" (UniqueName: \"kubernetes.io/projected/39c7ef6a-d400-49e9-bf74-0a4d5116cc58-kube-api-access-cmrn2\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\""
Apr 16 15:25:31.534548 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.534520 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/39c7ef6a-d400-49e9-bf74-0a4d5116cc58-tls-certs\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\""
Apr 16 15:25:31.635565 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.635545 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/db90d79b-1cd6-434e-852b-8932f3191f9f-must-gather-output\") pod \"must-gather-6sskp\" (UID: \"db90d79b-1cd6-434e-852b-8932f3191f9f\") " pod="openshift-must-gather-wm547/must-gather-6sskp"
Apr 16 15:25:31.635651 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.635608 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2szw9\" (UniqueName: \"kubernetes.io/projected/db90d79b-1cd6-434e-852b-8932f3191f9f-kube-api-access-2szw9\") pod \"must-gather-6sskp\" (UID: \"db90d79b-1cd6-434e-852b-8932f3191f9f\") " pod="openshift-must-gather-wm547/must-gather-6sskp"
Apr 16 15:25:31.635854 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.635832 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/db90d79b-1cd6-434e-852b-8932f3191f9f-must-gather-output\") pod \"must-gather-6sskp\" (UID: \"db90d79b-1cd6-434e-852b-8932f3191f9f\") " pod="openshift-must-gather-wm547/must-gather-6sskp"
Apr 16 15:25:31.644271 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.644244 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2szw9\" (UniqueName: \"kubernetes.io/projected/db90d79b-1cd6-434e-852b-8932f3191f9f-kube-api-access-2szw9\") pod \"must-gather-6sskp\" (UID: \"db90d79b-1cd6-434e-852b-8932f3191f9f\") " pod="openshift-must-gather-wm547/must-gather-6sskp"
Apr 16 15:25:31.781909 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.781835 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wm547/must-gather-6sskp"
Apr 16 15:25:31.848485 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.848447 2568 generic.go:358] "Generic (PLEG): container finished" podID="39c7ef6a-d400-49e9-bf74-0a4d5116cc58" containerID="9fa9fcb7dcb64246252945472bc6c68900d83144381ed4dd1a97fd264837cdd8" exitCode=0
Apr 16 15:25:31.848605 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.848499 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns" event={"ID":"39c7ef6a-d400-49e9-bf74-0a4d5116cc58","Type":"ContainerDied","Data":"9fa9fcb7dcb64246252945472bc6c68900d83144381ed4dd1a97fd264837cdd8"}
Apr 16 15:25:31.848605 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.848531 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns" event={"ID":"39c7ef6a-d400-49e9-bf74-0a4d5116cc58","Type":"ContainerDied","Data":"62dc5a03b705094bb4ba05353fe4da576841300211e1aefed460e5d284da1079"}
Apr 16 15:25:31.848605 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.848553 2568 scope.go:117] "RemoveContainer" containerID="9fa9fcb7dcb64246252945472bc6c68900d83144381ed4dd1a97fd264837cdd8"
Apr 16 15:25:31.848743 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.848729 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns"
Apr 16 15:25:31.858053 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.858029 2568 scope.go:117] "RemoveContainer" containerID="a168168c2f5b12adb940f651c6095682ad0b9d788db569d7b922e19e366d1b58"
Apr 16 15:25:31.866550 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.866529 2568 scope.go:117] "RemoveContainer" containerID="0e0df542a00a1764934d3c4705fe6376dc281b3f1d3c6e7e2a072236e50ea5a8"
Apr 16 15:25:31.875252 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.874921 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns"]
Apr 16 15:25:31.878356 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.878333 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-67665c4d5cd2ns"]
Apr 16 15:25:31.879020 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.878997 2568 scope.go:117] "RemoveContainer" containerID="9fa9fcb7dcb64246252945472bc6c68900d83144381ed4dd1a97fd264837cdd8"
Apr 16 15:25:31.879334 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:25:31.879303 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fa9fcb7dcb64246252945472bc6c68900d83144381ed4dd1a97fd264837cdd8\": container with ID starting with 9fa9fcb7dcb64246252945472bc6c68900d83144381ed4dd1a97fd264837cdd8 not found: ID does not exist" containerID="9fa9fcb7dcb64246252945472bc6c68900d83144381ed4dd1a97fd264837cdd8"
Apr 16 15:25:31.879438 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.879333 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fa9fcb7dcb64246252945472bc6c68900d83144381ed4dd1a97fd264837cdd8"} err="failed to get container status \"9fa9fcb7dcb64246252945472bc6c68900d83144381ed4dd1a97fd264837cdd8\": rpc error: code = NotFound desc = could not find container \"9fa9fcb7dcb64246252945472bc6c68900d83144381ed4dd1a97fd264837cdd8\": container with ID starting with 9fa9fcb7dcb64246252945472bc6c68900d83144381ed4dd1a97fd264837cdd8 not found: ID does not exist"
Apr 16 15:25:31.879438 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.879356 2568 scope.go:117] "RemoveContainer" containerID="a168168c2f5b12adb940f651c6095682ad0b9d788db569d7b922e19e366d1b58"
Apr 16 15:25:31.879615 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:25:31.879599 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a168168c2f5b12adb940f651c6095682ad0b9d788db569d7b922e19e366d1b58\": container with ID starting with a168168c2f5b12adb940f651c6095682ad0b9d788db569d7b922e19e366d1b58 not found: ID does not exist" containerID="a168168c2f5b12adb940f651c6095682ad0b9d788db569d7b922e19e366d1b58"
Apr 16 15:25:31.879659 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.879621 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a168168c2f5b12adb940f651c6095682ad0b9d788db569d7b922e19e366d1b58"} err="failed to get container status \"a168168c2f5b12adb940f651c6095682ad0b9d788db569d7b922e19e366d1b58\": rpc error: code = NotFound desc = could not find container \"a168168c2f5b12adb940f651c6095682ad0b9d788db569d7b922e19e366d1b58\": container with ID starting with a168168c2f5b12adb940f651c6095682ad0b9d788db569d7b922e19e366d1b58 not found: ID does not exist"
Apr 16 15:25:31.879659 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.879637 2568 scope.go:117] "RemoveContainer" containerID="0e0df542a00a1764934d3c4705fe6376dc281b3f1d3c6e7e2a072236e50ea5a8"
Apr 16 15:25:31.879884 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:25:31.879861 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e0df542a00a1764934d3c4705fe6376dc281b3f1d3c6e7e2a072236e50ea5a8\": container with ID starting with 0e0df542a00a1764934d3c4705fe6376dc281b3f1d3c6e7e2a072236e50ea5a8 not found: ID does not exist" containerID="0e0df542a00a1764934d3c4705fe6376dc281b3f1d3c6e7e2a072236e50ea5a8"
Apr 16 15:25:31.879967 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.879886 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e0df542a00a1764934d3c4705fe6376dc281b3f1d3c6e7e2a072236e50ea5a8"} err="failed to get container status \"0e0df542a00a1764934d3c4705fe6376dc281b3f1d3c6e7e2a072236e50ea5a8\": rpc error: code = NotFound desc = could not find container \"0e0df542a00a1764934d3c4705fe6376dc281b3f1d3c6e7e2a072236e50ea5a8\": container with ID starting with 0e0df542a00a1764934d3c4705fe6376dc281b3f1d3c6e7e2a072236e50ea5a8 not found: ID does not exist"
Apr 16 15:25:31.920585 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.920563 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wm547/must-gather-6sskp"]
Apr 16 15:25:31.922819 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:25:31.922793 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb90d79b_1cd6_434e_852b_8932f3191f9f.slice/crio-1432c048e3a836118ffbb839d6da95f1ac8febb1611e9fd76444bd07d5447a26 WatchSource:0}: Error finding container 1432c048e3a836118ffbb839d6da95f1ac8febb1611e9fd76444bd07d5447a26: Status 404 returned error can't find the container with id 1432c048e3a836118ffbb839d6da95f1ac8febb1611e9fd76444bd07d5447a26
Apr 16 15:25:31.924435 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:31.924420 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 15:25:32.859800 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:32.859752 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wm547/must-gather-6sskp" event={"ID":"db90d79b-1cd6-434e-852b-8932f3191f9f","Type":"ContainerStarted","Data":"1432c048e3a836118ffbb839d6da95f1ac8febb1611e9fd76444bd07d5447a26"}
Apr 16 15:25:33.864979 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:33.864943 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39c7ef6a-d400-49e9-bf74-0a4d5116cc58" path="/var/lib/kubelet/pods/39c7ef6a-d400-49e9-bf74-0a4d5116cc58/volumes"
Apr 16 15:25:36.879236 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:36.879200 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wm547/must-gather-6sskp" event={"ID":"db90d79b-1cd6-434e-852b-8932f3191f9f","Type":"ContainerStarted","Data":"11305b01d593b38d1c941fcdfd2d2c7ca3e39ba27edb293f4f7cb271ad83e94d"}
Apr 16 15:25:36.879236 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:36.879235 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wm547/must-gather-6sskp" event={"ID":"db90d79b-1cd6-434e-852b-8932f3191f9f","Type":"ContainerStarted","Data":"dca7c79917b7c18142016b34bf84c4250c7e6c260ffc8cb5bbd1946970e644ea"}
Apr 16 15:25:36.895273 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:36.895232 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wm547/must-gather-6sskp" podStartSLOduration=1.396769371 podStartE2EDuration="5.895218357s" podCreationTimestamp="2026-04-16 15:25:31 +0000 UTC" firstStartedPulling="2026-04-16 15:25:31.924545966 +0000 UTC m=+1970.688498055" lastFinishedPulling="2026-04-16 15:25:36.422994949 +0000 UTC m=+1975.186947041" observedRunningTime="2026-04-16 15:25:36.894273526 +0000 UTC m=+1975.658225638" watchObservedRunningTime="2026-04-16 15:25:36.895218357 +0000 UTC m=+1975.659170499"
Apr 16 15:25:59.435159 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:59.435130 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-42vf2_c9a9c4f3-e124-47b0-8eb1-04932d882266/discovery/0.log"
Apr 16 15:25:59.448774 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:59.448748 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg_9a0632c9-d4c8-48da-aaf3-57cd0552c7f1/istio-proxy/0.log"
Apr 16 15:25:59.463768 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:25:59.463749 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-76456bfb6-dhbg6_b0f86a21-4db7-4159-a529-cd2f872688be/router/0.log"
Apr 16 15:26:00.223822 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:00.223797 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-42vf2_c9a9c4f3-e124-47b0-8eb1-04932d882266/discovery/0.log"
Apr 16 15:26:00.238618 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:00.238594 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg_9a0632c9-d4c8-48da-aaf3-57cd0552c7f1/istio-proxy/0.log"
Apr 16 15:26:00.255993 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:00.255968 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-76456bfb6-dhbg6_b0f86a21-4db7-4159-a529-cd2f872688be/router/0.log"
Apr 16 15:26:00.987780 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:00.987752 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-c285d_52899b2c-7e83-4a5a-ac27-cda93f1e21e2/authorino/0.log"
Apr 16 15:26:01.015786 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:01.015765 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-sd5j6_3c9b6be4-2165-40f4-9a40-28f06b47739f/manager/0.log"
Apr 16 15:26:01.064233 ip-10-0-140-83 kubenswrapper[2568]:
I0416 15:26:01.064210 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-j7ztw_4279f8f7-70f0-46cf-9340-bfe3b343847f/manager/0.log" Apr 16 15:26:01.989581 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:01.989519 2568 generic.go:358] "Generic (PLEG): container finished" podID="db90d79b-1cd6-434e-852b-8932f3191f9f" containerID="dca7c79917b7c18142016b34bf84c4250c7e6c260ffc8cb5bbd1946970e644ea" exitCode=0 Apr 16 15:26:01.989956 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:01.989582 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wm547/must-gather-6sskp" event={"ID":"db90d79b-1cd6-434e-852b-8932f3191f9f","Type":"ContainerDied","Data":"dca7c79917b7c18142016b34bf84c4250c7e6c260ffc8cb5bbd1946970e644ea"} Apr 16 15:26:01.989956 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:01.989913 2568 scope.go:117] "RemoveContainer" containerID="dca7c79917b7c18142016b34bf84c4250c7e6c260ffc8cb5bbd1946970e644ea" Apr 16 15:26:02.726094 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:02.726062 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wm547_must-gather-6sskp_db90d79b-1cd6-434e-852b-8932f3191f9f/gather/0.log" Apr 16 15:26:06.304299 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:06.304269 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-2zgf8_38ffecc9-f9c3-4e05-972f-ee25cb7922e6/global-pull-secret-syncer/0.log" Apr 16 15:26:06.452344 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:06.452319 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-cvndk_f6d1b1db-1136-4814-bd98-1f4c0d57bd3a/konnectivity-agent/0.log" Apr 16 15:26:06.516034 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:06.516016 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-140-83.ec2.internal_66ca7287f809a0a0d2312e14abcec99e/haproxy/0.log" Apr 16 15:26:08.185399 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:08.185369 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-wm547/must-gather-6sskp"] Apr 16 15:26:08.185801 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:08.185576 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-wm547/must-gather-6sskp" podUID="db90d79b-1cd6-434e-852b-8932f3191f9f" containerName="copy" containerID="cri-o://11305b01d593b38d1c941fcdfd2d2c7ca3e39ba27edb293f4f7cb271ad83e94d" gracePeriod=2 Apr 16 15:26:08.187596 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:08.187560 2568 status_manager.go:895] "Failed to get status for pod" podUID="db90d79b-1cd6-434e-852b-8932f3191f9f" pod="openshift-must-gather-wm547/must-gather-6sskp" err="pods \"must-gather-6sskp\" is forbidden: User \"system:node:ip-10-0-140-83.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-wm547\": no relationship found between node 'ip-10-0-140-83.ec2.internal' and this object" Apr 16 15:26:08.187959 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:08.187937 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-wm547/must-gather-6sskp"] Apr 16 15:26:08.424374 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:08.424350 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wm547_must-gather-6sskp_db90d79b-1cd6-434e-852b-8932f3191f9f/copy/0.log" Apr 16 15:26:08.424682 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:08.424669 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wm547/must-gather-6sskp" Apr 16 15:26:08.426965 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:08.426940 2568 status_manager.go:895] "Failed to get status for pod" podUID="db90d79b-1cd6-434e-852b-8932f3191f9f" pod="openshift-must-gather-wm547/must-gather-6sskp" err="pods \"must-gather-6sskp\" is forbidden: User \"system:node:ip-10-0-140-83.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-wm547\": no relationship found between node 'ip-10-0-140-83.ec2.internal' and this object" Apr 16 15:26:08.467618 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:08.467567 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2szw9\" (UniqueName: \"kubernetes.io/projected/db90d79b-1cd6-434e-852b-8932f3191f9f-kube-api-access-2szw9\") pod \"db90d79b-1cd6-434e-852b-8932f3191f9f\" (UID: \"db90d79b-1cd6-434e-852b-8932f3191f9f\") " Apr 16 15:26:08.467618 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:08.467614 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/db90d79b-1cd6-434e-852b-8932f3191f9f-must-gather-output\") pod \"db90d79b-1cd6-434e-852b-8932f3191f9f\" (UID: \"db90d79b-1cd6-434e-852b-8932f3191f9f\") " Apr 16 15:26:08.469637 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:08.469611 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db90d79b-1cd6-434e-852b-8932f3191f9f-kube-api-access-2szw9" (OuterVolumeSpecName: "kube-api-access-2szw9") pod "db90d79b-1cd6-434e-852b-8932f3191f9f" (UID: "db90d79b-1cd6-434e-852b-8932f3191f9f"). InnerVolumeSpecName "kube-api-access-2szw9". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:26:08.473284 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:08.473262 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db90d79b-1cd6-434e-852b-8932f3191f9f-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "db90d79b-1cd6-434e-852b-8932f3191f9f" (UID: "db90d79b-1cd6-434e-852b-8932f3191f9f"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:26:08.568954 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:08.568932 2568 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/db90d79b-1cd6-434e-852b-8932f3191f9f-must-gather-output\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:26:08.569037 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:08.568956 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2szw9\" (UniqueName: \"kubernetes.io/projected/db90d79b-1cd6-434e-852b-8932f3191f9f-kube-api-access-2szw9\") on node \"ip-10-0-140-83.ec2.internal\" DevicePath \"\"" Apr 16 15:26:09.022883 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:09.022858 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wm547_must-gather-6sskp_db90d79b-1cd6-434e-852b-8932f3191f9f/copy/0.log" Apr 16 15:26:09.023334 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:09.023307 2568 generic.go:358] "Generic (PLEG): container finished" podID="db90d79b-1cd6-434e-852b-8932f3191f9f" containerID="11305b01d593b38d1c941fcdfd2d2c7ca3e39ba27edb293f4f7cb271ad83e94d" exitCode=143 Apr 16 15:26:09.023403 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:09.023374 2568 scope.go:117] "RemoveContainer" containerID="11305b01d593b38d1c941fcdfd2d2c7ca3e39ba27edb293f4f7cb271ad83e94d" Apr 16 15:26:09.023499 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:09.023425 2568 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-must-gather-wm547/must-gather-6sskp" Apr 16 15:26:09.025626 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:09.025592 2568 status_manager.go:895] "Failed to get status for pod" podUID="db90d79b-1cd6-434e-852b-8932f3191f9f" pod="openshift-must-gather-wm547/must-gather-6sskp" err="pods \"must-gather-6sskp\" is forbidden: User \"system:node:ip-10-0-140-83.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-wm547\": no relationship found between node 'ip-10-0-140-83.ec2.internal' and this object" Apr 16 15:26:09.032981 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:09.032959 2568 scope.go:117] "RemoveContainer" containerID="dca7c79917b7c18142016b34bf84c4250c7e6c260ffc8cb5bbd1946970e644ea" Apr 16 15:26:09.034274 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:09.034248 2568 status_manager.go:895] "Failed to get status for pod" podUID="db90d79b-1cd6-434e-852b-8932f3191f9f" pod="openshift-must-gather-wm547/must-gather-6sskp" err="pods \"must-gather-6sskp\" is forbidden: User \"system:node:ip-10-0-140-83.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-wm547\": no relationship found between node 'ip-10-0-140-83.ec2.internal' and this object" Apr 16 15:26:09.047294 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:09.047274 2568 scope.go:117] "RemoveContainer" containerID="11305b01d593b38d1c941fcdfd2d2c7ca3e39ba27edb293f4f7cb271ad83e94d" Apr 16 15:26:09.047608 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:26:09.047581 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11305b01d593b38d1c941fcdfd2d2c7ca3e39ba27edb293f4f7cb271ad83e94d\": container with ID starting with 11305b01d593b38d1c941fcdfd2d2c7ca3e39ba27edb293f4f7cb271ad83e94d not found: ID does not exist" containerID="11305b01d593b38d1c941fcdfd2d2c7ca3e39ba27edb293f4f7cb271ad83e94d" 
Apr 16 15:26:09.047713 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:09.047615 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11305b01d593b38d1c941fcdfd2d2c7ca3e39ba27edb293f4f7cb271ad83e94d"} err="failed to get container status \"11305b01d593b38d1c941fcdfd2d2c7ca3e39ba27edb293f4f7cb271ad83e94d\": rpc error: code = NotFound desc = could not find container \"11305b01d593b38d1c941fcdfd2d2c7ca3e39ba27edb293f4f7cb271ad83e94d\": container with ID starting with 11305b01d593b38d1c941fcdfd2d2c7ca3e39ba27edb293f4f7cb271ad83e94d not found: ID does not exist" Apr 16 15:26:09.047713 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:09.047636 2568 scope.go:117] "RemoveContainer" containerID="dca7c79917b7c18142016b34bf84c4250c7e6c260ffc8cb5bbd1946970e644ea" Apr 16 15:26:09.047959 ip-10-0-140-83 kubenswrapper[2568]: E0416 15:26:09.047936 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dca7c79917b7c18142016b34bf84c4250c7e6c260ffc8cb5bbd1946970e644ea\": container with ID starting with dca7c79917b7c18142016b34bf84c4250c7e6c260ffc8cb5bbd1946970e644ea not found: ID does not exist" containerID="dca7c79917b7c18142016b34bf84c4250c7e6c260ffc8cb5bbd1946970e644ea" Apr 16 15:26:09.048115 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:09.047971 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dca7c79917b7c18142016b34bf84c4250c7e6c260ffc8cb5bbd1946970e644ea"} err="failed to get container status \"dca7c79917b7c18142016b34bf84c4250c7e6c260ffc8cb5bbd1946970e644ea\": rpc error: code = NotFound desc = could not find container \"dca7c79917b7c18142016b34bf84c4250c7e6c260ffc8cb5bbd1946970e644ea\": container with ID starting with dca7c79917b7c18142016b34bf84c4250c7e6c260ffc8cb5bbd1946970e644ea not found: ID does not exist" Apr 16 15:26:09.864945 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:09.864880 2568 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db90d79b-1cd6-434e-852b-8932f3191f9f" path="/var/lib/kubelet/pods/db90d79b-1cd6-434e-852b-8932f3191f9f/volumes" Apr 16 15:26:10.653183 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:10.653156 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-c285d_52899b2c-7e83-4a5a-ac27-cda93f1e21e2/authorino/0.log" Apr 16 15:26:10.704390 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:10.704364 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-sd5j6_3c9b6be4-2165-40f4-9a40-28f06b47739f/manager/0.log" Apr 16 15:26:10.776138 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:10.776116 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-j7ztw_4279f8f7-70f0-46cf-9340-bfe3b343847f/manager/0.log" Apr 16 15:26:11.702542 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:11.702518 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_9c73095c-fd5b-46c8-9ad4-b0323de86a2b/alertmanager/0.log" Apr 16 15:26:11.730760 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:11.730736 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_9c73095c-fd5b-46c8-9ad4-b0323de86a2b/config-reloader/0.log" Apr 16 15:26:11.748660 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:11.748626 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_9c73095c-fd5b-46c8-9ad4-b0323de86a2b/kube-rbac-proxy-web/0.log" Apr 16 15:26:11.767318 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:11.767281 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_9c73095c-fd5b-46c8-9ad4-b0323de86a2b/kube-rbac-proxy/0.log" Apr 16 15:26:11.791361 ip-10-0-140-83 kubenswrapper[2568]: 
I0416 15:26:11.791318 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_9c73095c-fd5b-46c8-9ad4-b0323de86a2b/kube-rbac-proxy-metric/0.log" Apr 16 15:26:11.813108 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:11.813088 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_9c73095c-fd5b-46c8-9ad4-b0323de86a2b/prom-label-proxy/0.log" Apr 16 15:26:11.829275 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:11.829257 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_9c73095c-fd5b-46c8-9ad4-b0323de86a2b/init-config-reloader/0.log" Apr 16 15:26:11.945870 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:11.945848 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-56dd544798-bkd2p_aa3d4a56-7da8-4220-bf7c-17327a756284/metrics-server/0.log" Apr 16 15:26:11.967679 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:11.967624 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-vgksx_da5dbec6-5b51-4f5a-85cf-f08dd3b4ff45/monitoring-plugin/0.log" Apr 16 15:26:12.146465 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:12.146442 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-rt8hd_78fd3b73-6d88-4569-acd8-b197add6650c/node-exporter/0.log" Apr 16 15:26:12.163615 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:12.163598 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-rt8hd_78fd3b73-6d88-4569-acd8-b197add6650c/kube-rbac-proxy/0.log" Apr 16 15:26:12.181463 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:12.181448 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-rt8hd_78fd3b73-6d88-4569-acd8-b197add6650c/init-textfile/0.log" Apr 16 15:26:12.208141 ip-10-0-140-83 
kubenswrapper[2568]: I0416 15:26:12.208119 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-cnx9g_788fd48a-507a-4f66-8d27-4ea0596db9e1/kube-rbac-proxy-main/0.log" Apr 16 15:26:12.226482 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:12.226432 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-cnx9g_788fd48a-507a-4f66-8d27-4ea0596db9e1/kube-rbac-proxy-self/0.log" Apr 16 15:26:12.248412 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:12.248389 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-cnx9g_788fd48a-507a-4f66-8d27-4ea0596db9e1/openshift-state-metrics/0.log" Apr 16 15:26:12.542886 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:12.542865 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-9cb97cd87-cpkz7_bbfb3dcf-c75d-40a8-8f0c-d34ee3c47a06/prometheus-operator-admission-webhook/0.log" Apr 16 15:26:12.585689 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:12.585661 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5d77499596-rfjtq_834dd624-de28-453c-bec0-8f6bf779e2c4/telemeter-client/0.log" Apr 16 15:26:12.615762 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:12.615743 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5d77499596-rfjtq_834dd624-de28-453c-bec0-8f6bf779e2c4/reload/0.log" Apr 16 15:26:12.645219 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:12.645186 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5d77499596-rfjtq_834dd624-de28-453c-bec0-8f6bf779e2c4/kube-rbac-proxy/0.log" Apr 16 15:26:12.680256 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:12.680238 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-7f99fc98d-tlm2q_6dd7fb5d-2333-471c-b3d2-1e65d5c362aa/thanos-query/0.log" Apr 16 15:26:12.706681 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:12.706660 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7f99fc98d-tlm2q_6dd7fb5d-2333-471c-b3d2-1e65d5c362aa/kube-rbac-proxy-web/0.log" Apr 16 15:26:12.725187 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:12.725170 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7f99fc98d-tlm2q_6dd7fb5d-2333-471c-b3d2-1e65d5c362aa/kube-rbac-proxy/0.log" Apr 16 15:26:12.743601 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:12.743583 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7f99fc98d-tlm2q_6dd7fb5d-2333-471c-b3d2-1e65d5c362aa/prom-label-proxy/0.log" Apr 16 15:26:12.766065 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:12.766019 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7f99fc98d-tlm2q_6dd7fb5d-2333-471c-b3d2-1e65d5c362aa/kube-rbac-proxy-rules/0.log" Apr 16 15:26:12.786524 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:12.786509 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7f99fc98d-tlm2q_6dd7fb5d-2333-471c-b3d2-1e65d5c362aa/kube-rbac-proxy-metrics/0.log" Apr 16 15:26:15.090098 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:15.090071 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f5976dc55-sg285_a1bd51ae-9001-494f-85cb-1225af7bb1e9/console/0.log" Apr 16 15:26:15.511968 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:15.511884 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-f5p4n/perf-node-gather-daemonset-k274g"] Apr 16 15:26:15.512323 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:15.512311 2568 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="db90d79b-1cd6-434e-852b-8932f3191f9f" containerName="copy" Apr 16 15:26:15.512373 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:15.512325 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="db90d79b-1cd6-434e-852b-8932f3191f9f" containerName="copy" Apr 16 15:26:15.512373 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:15.512341 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="db90d79b-1cd6-434e-852b-8932f3191f9f" containerName="gather" Apr 16 15:26:15.512373 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:15.512347 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="db90d79b-1cd6-434e-852b-8932f3191f9f" containerName="gather" Apr 16 15:26:15.512466 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:15.512401 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="db90d79b-1cd6-434e-852b-8932f3191f9f" containerName="gather" Apr 16 15:26:15.512466 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:15.512409 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="db90d79b-1cd6-434e-852b-8932f3191f9f" containerName="copy" Apr 16 15:26:15.515457 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:15.515441 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-k274g" Apr 16 15:26:15.518077 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:15.518051 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-f5p4n\"/\"kube-root-ca.crt\"" Apr 16 15:26:15.518077 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:15.518068 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-f5p4n\"/\"openshift-service-ca.crt\"" Apr 16 15:26:15.518947 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:15.518931 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-f5p4n\"/\"default-dockercfg-fv9wn\"" Apr 16 15:26:15.524500 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:15.524474 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f5p4n/perf-node-gather-daemonset-k274g"] Apr 16 15:26:15.622140 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:15.622115 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eef478c7-6a9d-4a9c-97f3-7c745ad9bd2c-lib-modules\") pod \"perf-node-gather-daemonset-k274g\" (UID: \"eef478c7-6a9d-4a9c-97f3-7c745ad9bd2c\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-k274g" Apr 16 15:26:15.622239 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:15.622156 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/eef478c7-6a9d-4a9c-97f3-7c745ad9bd2c-podres\") pod \"perf-node-gather-daemonset-k274g\" (UID: \"eef478c7-6a9d-4a9c-97f3-7c745ad9bd2c\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-k274g" Apr 16 15:26:15.622239 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:15.622198 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-77g8n\" (UniqueName: \"kubernetes.io/projected/eef478c7-6a9d-4a9c-97f3-7c745ad9bd2c-kube-api-access-77g8n\") pod \"perf-node-gather-daemonset-k274g\" (UID: \"eef478c7-6a9d-4a9c-97f3-7c745ad9bd2c\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-k274g" Apr 16 15:26:15.622239 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:15.622226 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/eef478c7-6a9d-4a9c-97f3-7c745ad9bd2c-proc\") pod \"perf-node-gather-daemonset-k274g\" (UID: \"eef478c7-6a9d-4a9c-97f3-7c745ad9bd2c\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-k274g" Apr 16 15:26:15.622407 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:15.622318 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eef478c7-6a9d-4a9c-97f3-7c745ad9bd2c-sys\") pod \"perf-node-gather-daemonset-k274g\" (UID: \"eef478c7-6a9d-4a9c-97f3-7c745ad9bd2c\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-k274g" Apr 16 15:26:15.723654 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:15.723629 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/eef478c7-6a9d-4a9c-97f3-7c745ad9bd2c-podres\") pod \"perf-node-gather-daemonset-k274g\" (UID: \"eef478c7-6a9d-4a9c-97f3-7c745ad9bd2c\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-k274g" Apr 16 15:26:15.723755 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:15.723669 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-77g8n\" (UniqueName: \"kubernetes.io/projected/eef478c7-6a9d-4a9c-97f3-7c745ad9bd2c-kube-api-access-77g8n\") pod \"perf-node-gather-daemonset-k274g\" (UID: \"eef478c7-6a9d-4a9c-97f3-7c745ad9bd2c\") " 
pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-k274g" Apr 16 15:26:15.723755 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:15.723695 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/eef478c7-6a9d-4a9c-97f3-7c745ad9bd2c-proc\") pod \"perf-node-gather-daemonset-k274g\" (UID: \"eef478c7-6a9d-4a9c-97f3-7c745ad9bd2c\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-k274g" Apr 16 15:26:15.723865 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:15.723757 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eef478c7-6a9d-4a9c-97f3-7c745ad9bd2c-sys\") pod \"perf-node-gather-daemonset-k274g\" (UID: \"eef478c7-6a9d-4a9c-97f3-7c745ad9bd2c\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-k274g" Apr 16 15:26:15.723865 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:15.723792 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/eef478c7-6a9d-4a9c-97f3-7c745ad9bd2c-podres\") pod \"perf-node-gather-daemonset-k274g\" (UID: \"eef478c7-6a9d-4a9c-97f3-7c745ad9bd2c\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-k274g" Apr 16 15:26:15.723865 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:15.723797 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eef478c7-6a9d-4a9c-97f3-7c745ad9bd2c-lib-modules\") pod \"perf-node-gather-daemonset-k274g\" (UID: \"eef478c7-6a9d-4a9c-97f3-7c745ad9bd2c\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-k274g" Apr 16 15:26:15.723865 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:15.723828 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/eef478c7-6a9d-4a9c-97f3-7c745ad9bd2c-proc\") pod 
\"perf-node-gather-daemonset-k274g\" (UID: \"eef478c7-6a9d-4a9c-97f3-7c745ad9bd2c\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-k274g" Apr 16 15:26:15.724074 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:15.723875 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eef478c7-6a9d-4a9c-97f3-7c745ad9bd2c-sys\") pod \"perf-node-gather-daemonset-k274g\" (UID: \"eef478c7-6a9d-4a9c-97f3-7c745ad9bd2c\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-k274g" Apr 16 15:26:15.724074 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:15.723876 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eef478c7-6a9d-4a9c-97f3-7c745ad9bd2c-lib-modules\") pod \"perf-node-gather-daemonset-k274g\" (UID: \"eef478c7-6a9d-4a9c-97f3-7c745ad9bd2c\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-k274g" Apr 16 15:26:15.730994 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:15.730977 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-77g8n\" (UniqueName: \"kubernetes.io/projected/eef478c7-6a9d-4a9c-97f3-7c745ad9bd2c-kube-api-access-77g8n\") pod \"perf-node-gather-daemonset-k274g\" (UID: \"eef478c7-6a9d-4a9c-97f3-7c745ad9bd2c\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-k274g" Apr 16 15:26:15.826536 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:15.826518 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-k274g" Apr 16 15:26:15.945659 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:15.945630 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f5p4n/perf-node-gather-daemonset-k274g"] Apr 16 15:26:15.947565 ip-10-0-140-83 kubenswrapper[2568]: W0416 15:26:15.947535 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podeef478c7_6a9d_4a9c_97f3_7c745ad9bd2c.slice/crio-58232dad4daea49c4d9c5f280792f4df2649a1778f3ef32571109e92e59455b9 WatchSource:0}: Error finding container 58232dad4daea49c4d9c5f280792f4df2649a1778f3ef32571109e92e59455b9: Status 404 returned error can't find the container with id 58232dad4daea49c4d9c5f280792f4df2649a1778f3ef32571109e92e59455b9 Apr 16 15:26:16.053269 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:16.053246 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-k274g" event={"ID":"eef478c7-6a9d-4a9c-97f3-7c745ad9bd2c","Type":"ContainerStarted","Data":"8c9b52505842019b0b774224bc9b33386176ad85e2a412568d61fe4b82c340e7"} Apr 16 15:26:16.053380 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:16.053279 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-k274g" event={"ID":"eef478c7-6a9d-4a9c-97f3-7c745ad9bd2c","Type":"ContainerStarted","Data":"58232dad4daea49c4d9c5f280792f4df2649a1778f3ef32571109e92e59455b9"} Apr 16 15:26:16.053436 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:16.053409 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-k274g" Apr 16 15:26:16.070522 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:16.070481 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-k274g" 
podStartSLOduration=1.07046803 podStartE2EDuration="1.07046803s" podCreationTimestamp="2026-04-16 15:26:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:26:16.068741423 +0000 UTC m=+2014.832693537" watchObservedRunningTime="2026-04-16 15:26:16.07046803 +0000 UTC m=+2014.834420141" Apr 16 15:26:16.333064 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:16.333046 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-r8fjc_8e918a1e-6b74-48f2-a4ef-6164f643d8d7/dns/0.log" Apr 16 15:26:16.350238 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:16.350219 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-r8fjc_8e918a1e-6b74-48f2-a4ef-6164f643d8d7/kube-rbac-proxy/0.log" Apr 16 15:26:16.466512 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:16.466491 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-s956z_f1f8e071-c6dd-48e5-b5d3-e096d8a24e92/dns-node-resolver/0.log" Apr 16 15:26:16.943515 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:16.943493 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-tp4rc_927adbe7-9da4-4d4c-9bd5-3f36e9ef8978/node-ca/0.log" Apr 16 15:26:17.722468 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:17.722441 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-42vf2_c9a9c4f3-e124-47b0-8eb1-04932d882266/discovery/0.log" Apr 16 15:26:17.746729 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:17.746708 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7d888b9fbd-s4rlg_9a0632c9-d4c8-48da-aaf3-57cd0552c7f1/istio-proxy/0.log" Apr 16 15:26:17.770499 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:17.770476 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_router-default-76456bfb6-dhbg6_b0f86a21-4db7-4159-a529-cd2f872688be/router/0.log" Apr 16 15:26:18.199400 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:18.199375 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-5drdr_559b0245-445d-420e-9f45-367de89eecc7/serve-healthcheck-canary/0.log" Apr 16 15:26:18.705898 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:18.705873 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-grl9h_d8bcd964-3f59-4080-bc40-db541883241c/kube-rbac-proxy/0.log" Apr 16 15:26:18.723709 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:18.723685 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-grl9h_d8bcd964-3f59-4080-bc40-db541883241c/exporter/0.log" Apr 16 15:26:18.742875 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:18.742849 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-grl9h_d8bcd964-3f59-4080-bc40-db541883241c/extractor/0.log" Apr 16 15:26:21.332782 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:21.332753 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-x2x46_f34243be-0754-4e19-9c0d-be3cddfac28c/openshift-lws-operator/0.log" Apr 16 15:26:21.823290 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:21.823260 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-7669bdc57-q9lnq_ee07ca6c-8ecb-4821-ba05-70958800051a/manager/0.log" Apr 16 15:26:21.888095 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:21.888074 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-vm4rk_0bc23e19-9195-4eb3-bed2-e4d34efec7ed/server/0.log" Apr 16 15:26:22.070333 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:22.070309 2568 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-k274g" Apr 16 15:26:22.320588 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:22.320564 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-xprxk_74ac6b54-28a0-4446-a105-8ef0e061898c/seaweedfs/0.log" Apr 16 15:26:26.626041 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:26.625986 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-n77nq_83beaa50-9f28-4ab9-8f66-e8b9202b7e30/migrator/0.log" Apr 16 15:26:26.644765 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:26.644747 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-n77nq_83beaa50-9f28-4ab9-8f66-e8b9202b7e30/graceful-termination/0.log" Apr 16 15:26:28.022505 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:28.022437 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5ch67_4148d88d-fea6-4539-8c24-81aff5a953f7/kube-multus-additional-cni-plugins/0.log" Apr 16 15:26:28.041577 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:28.041558 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5ch67_4148d88d-fea6-4539-8c24-81aff5a953f7/egress-router-binary-copy/0.log" Apr 16 15:26:28.059675 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:28.059658 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5ch67_4148d88d-fea6-4539-8c24-81aff5a953f7/cni-plugins/0.log" Apr 16 15:26:28.077136 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:28.077120 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5ch67_4148d88d-fea6-4539-8c24-81aff5a953f7/bond-cni-plugin/0.log" Apr 16 15:26:28.096844 ip-10-0-140-83 
kubenswrapper[2568]: I0416 15:26:28.096822 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5ch67_4148d88d-fea6-4539-8c24-81aff5a953f7/routeoverride-cni/0.log" Apr 16 15:26:28.114829 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:28.114813 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5ch67_4148d88d-fea6-4539-8c24-81aff5a953f7/whereabouts-cni-bincopy/0.log" Apr 16 15:26:28.133887 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:28.133871 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5ch67_4148d88d-fea6-4539-8c24-81aff5a953f7/whereabouts-cni/0.log" Apr 16 15:26:28.494970 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:28.494949 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jscc7_dd8296f9-c225-4ca9-8f12-e3aa03b02f50/kube-multus/0.log" Apr 16 15:26:28.554009 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:28.553985 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-twkfq_b932e53d-9993-47b8-a2cb-940fc759370d/network-metrics-daemon/0.log" Apr 16 15:26:28.569336 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:28.569308 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-twkfq_b932e53d-9993-47b8-a2cb-940fc759370d/kube-rbac-proxy/0.log" Apr 16 15:26:29.359116 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:29.359082 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bff4l_4e495dc8-7e20-4024-b33a-c9b6c6c1291f/ovn-controller/0.log" Apr 16 15:26:29.373762 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:29.373739 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bff4l_4e495dc8-7e20-4024-b33a-c9b6c6c1291f/ovn-acl-logging/0.log" Apr 16 
15:26:29.388581 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:29.388558 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bff4l_4e495dc8-7e20-4024-b33a-c9b6c6c1291f/ovn-acl-logging/1.log" Apr 16 15:26:29.412422 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:29.412405 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bff4l_4e495dc8-7e20-4024-b33a-c9b6c6c1291f/kube-rbac-proxy-node/0.log" Apr 16 15:26:29.434470 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:29.434447 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bff4l_4e495dc8-7e20-4024-b33a-c9b6c6c1291f/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 15:26:29.451584 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:29.451566 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bff4l_4e495dc8-7e20-4024-b33a-c9b6c6c1291f/northd/0.log" Apr 16 15:26:29.469346 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:29.469321 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bff4l_4e495dc8-7e20-4024-b33a-c9b6c6c1291f/nbdb/0.log" Apr 16 15:26:29.488971 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:29.488948 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bff4l_4e495dc8-7e20-4024-b33a-c9b6c6c1291f/sbdb/0.log" Apr 16 15:26:29.590644 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:29.590617 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bff4l_4e495dc8-7e20-4024-b33a-c9b6c6c1291f/ovnkube-controller/0.log" Apr 16 15:26:31.265185 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:31.265155 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-7b678d77c7-vq42l_d2f3e8a7-9701-48e6-bdb2-7748ac1be0d1/check-endpoints/0.log" 
Apr 16 15:26:31.313876 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:31.313847 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-wr4kj_d293eadf-7dbb-4770-b547-d28be07dbdf1/network-check-target-container/0.log" Apr 16 15:26:32.385432 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:32.385405 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-cbb9d_4fc0480c-be78-4a13-8001-9c955eec95e1/iptables-alerter/0.log" Apr 16 15:26:33.166330 ip-10-0-140-83 kubenswrapper[2568]: I0416 15:26:33.166300 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-bp89w_04361303-4b68-4a7e-b73e-a4329bb6bb65/tuned/0.log"