Apr 16 20:11:48.443164 ip-10-0-138-62 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 20:11:48.871417 ip-10-0-138-62 kubenswrapper[2572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:11:48.871417 ip-10-0-138-62 kubenswrapper[2572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 20:11:48.871417 ip-10-0-138-62 kubenswrapper[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:11:48.871417 ip-10-0-138-62 kubenswrapper[2572]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 20:11:48.871417 ip-10-0-138-62 kubenswrapper[2572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:11:48.874086 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.874001 2572 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 20:11:48.882637 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882610 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:11:48.882637 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882629 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:11:48.882637 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882633 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:11:48.882637 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882637 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:11:48.882637 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882640 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:11:48.882637 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882643 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:11:48.882637 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882646 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:11:48.882906 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882649 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:11:48.882906 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882653 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:11:48.882906 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882656 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:11:48.882906 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882659 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:11:48.882906 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882661 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:11:48.882906 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882664 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:11:48.882906 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882667 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:11:48.882906 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882670 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:11:48.882906 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882672 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:11:48.882906 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882675 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:11:48.882906 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882677 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:11:48.882906 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882680 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:11:48.882906 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882683 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:11:48.882906 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882685 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:11:48.882906 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882688 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:11:48.882906 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882690 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:11:48.882906 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882693 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:11:48.882906 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882695 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:11:48.882906 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882701 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:11:48.882906 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882704 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:11:48.883385 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882706 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:11:48.883385 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882709 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:11:48.883385 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882711 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:11:48.883385 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882714 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:11:48.883385 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882716 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:11:48.883385 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882719 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:11:48.883385 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882721 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:11:48.883385 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882724 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:11:48.883385 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882727 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:11:48.883385 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882730 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:11:48.883385 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882732 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:11:48.883385 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882735 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:11:48.883385 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882737 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:11:48.883385 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882740 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:11:48.883385 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882743 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:11:48.883385 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882745 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:11:48.883385 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882748 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:11:48.883385 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882750 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:11:48.883385 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882753 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:11:48.883385 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882755 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:11:48.883892 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882758 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:11:48.883892 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882761 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:11:48.883892 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882763 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:11:48.883892 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882766 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:11:48.883892 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882769 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:11:48.883892 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882772 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:11:48.883892 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882774 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:11:48.883892 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882776 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:11:48.883892 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882779 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:11:48.883892 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882781 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:11:48.883892 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882784 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:11:48.883892 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882787 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:11:48.883892 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882789 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:11:48.883892 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882792 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:11:48.883892 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882794 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:11:48.883892 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882797 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:11:48.883892 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882799 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:11:48.883892 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882802 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:11:48.883892 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882804 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:11:48.883892 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882806 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:11:48.884368 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882809 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:11:48.884368 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882811 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:11:48.884368 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882816 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:11:48.884368 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882819 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:11:48.884368 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882822 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:11:48.884368 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882825 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:11:48.884368 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882828 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:11:48.884368 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882831 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:11:48.884368 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882833 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:11:48.884368 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882836 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:11:48.884368 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882838 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:11:48.884368 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882842 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:11:48.884368 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882844 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:11:48.884368 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882846 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:11:48.884368 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882849 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:11:48.884368 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882853 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:11:48.884368 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882857 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:11:48.884368 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882860 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:11:48.884368 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.882863 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:11:48.884840 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884446 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:11:48.884840 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884454 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:11:48.884840 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884458 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:11:48.884840 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884461 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:11:48.884840 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884464 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:11:48.884840 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884467 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:11:48.884840 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884470 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:11:48.884840 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884473 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:11:48.884840 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884476 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:11:48.884840 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884479 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:11:48.884840 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884481 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:11:48.884840 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884484 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:11:48.884840 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884486 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:11:48.884840 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884489 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:11:48.884840 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884491 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:11:48.884840 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884494 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:11:48.884840 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884496 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:11:48.884840 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884499 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:11:48.884840 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884501 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:11:48.885320 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884504 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:11:48.885320 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884506 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:11:48.885320 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884509 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:11:48.885320 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884511 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:11:48.885320 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884513 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:11:48.885320 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884517 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:11:48.885320 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884520 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:11:48.885320 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884522 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:11:48.885320 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884524 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:11:48.885320 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884527 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:11:48.885320 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884529 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:11:48.885320 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884532 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:11:48.885320 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884534 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:11:48.885320 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884537 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:11:48.885320 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884540 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:11:48.885320 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884544 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:11:48.885320 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884546 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:11:48.885320 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884549 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:11:48.885320 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884551 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:11:48.885823 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884554 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:11:48.885823 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884557 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:11:48.885823 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884559 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:11:48.885823 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884562 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:11:48.885823 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884564 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:11:48.885823 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884567 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:11:48.885823 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884570 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:11:48.885823 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884572 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:11:48.885823 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884574 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:11:48.885823 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884577 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:11:48.885823 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884580 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:11:48.885823 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884582 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:11:48.885823 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884585 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:11:48.885823 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884587 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:11:48.885823 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884590 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:11:48.885823 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884593 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:11:48.885823 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884596 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:11:48.885823 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884599 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:11:48.885823 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884601 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:11:48.885823 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884606 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:11:48.886334 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884609 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:11:48.886334 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884613 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:11:48.886334 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884615 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:11:48.886334 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884618 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:11:48.886334 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884620 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:11:48.886334 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884623 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:11:48.886334 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884625 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:11:48.886334 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884629 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:11:48.886334 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884631 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:11:48.886334 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884634 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:11:48.886334 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884636 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:11:48.886334 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884639 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:11:48.886334 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884643 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:11:48.886334 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884646 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:11:48.886334 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884650 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:11:48.886334 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884653 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:11:48.886334 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884656 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:11:48.886334 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884658 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:11:48.886334 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884661 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:11:48.886334 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884663 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:11:48.886831 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884666 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:11:48.886831 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884669 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:11:48.886831 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884671 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:11:48.886831 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884674 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:11:48.886831 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884676 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:11:48.886831 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884679 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:11:48.886831 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884681 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:11:48.886831 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.884684 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:11:48.886831 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884764 2572 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 20:11:48.886831 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884775 2572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 20:11:48.886831 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884784 2572 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 20:11:48.886831 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884791 2572 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 20:11:48.886831 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884802 2572 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 20:11:48.886831 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884806 2572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 20:11:48.886831 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884810 2572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 20:11:48.886831 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884815 2572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 20:11:48.886831 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884818 2572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 20:11:48.886831 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884821 2572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 20:11:48.886831 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884825 2572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 20:11:48.886831 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884828 2572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 20:11:48.886831 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884831 2572 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 20:11:48.886831 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884834 2572 flags.go:64] FLAG: --cgroup-root=""
Apr 16 20:11:48.887402 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884837 2572 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 20:11:48.887402 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884841 2572 flags.go:64] FLAG: --client-ca-file=""
Apr 16 20:11:48.887402 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884844 2572 flags.go:64] FLAG: --cloud-config=""
Apr 16 20:11:48.887402 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884847 2572 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 20:11:48.887402 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884850 2572 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 20:11:48.887402 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884854 2572 flags.go:64] FLAG: --cluster-domain=""
Apr 16 20:11:48.887402 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884857 2572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 20:11:48.887402 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884861 2572 flags.go:64] FLAG: --config-dir=""
Apr 16 20:11:48.887402 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884864 2572 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 20:11:48.887402 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884867 2572 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 20:11:48.887402 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884892 2572 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 20:11:48.887402 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884895 2572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 20:11:48.887402 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884898 2572 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 20:11:48.887402 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884902 2572 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 20:11:48.887402 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884905 2572 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 20:11:48.887402 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884908 2572 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 20:11:48.887402 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884911 2572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 20:11:48.887402 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884914 2572 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 20:11:48.887402 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884917 2572 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 20:11:48.887402 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884921 2572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 20:11:48.887402 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884924 2572 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 20:11:48.887402 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884927 2572 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 20:11:48.887402 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884931 2572 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 20:11:48.887402 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884935 2572 flags.go:64] FLAG: --enable-server="true"
Apr 16 20:11:48.887402 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884938 2572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 20:11:48.888017 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884943 2572 flags.go:64] FLAG: --event-burst="100"
Apr 16 20:11:48.888017 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884946 2572 flags.go:64] FLAG: --event-qps="50"
Apr 16 20:11:48.888017 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884949 2572 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 20:11:48.888017 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884952 2572 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 20:11:48.888017 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884955 2572 flags.go:64] FLAG: --eviction-hard=""
Apr 16 20:11:48.888017 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884960 2572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 20:11:48.888017 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884963 2572 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 20:11:48.888017 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884966 2572 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 20:11:48.888017 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884969 2572 flags.go:64] FLAG: --eviction-soft=""
Apr 16 20:11:48.888017 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884972 2572 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 20:11:48.888017 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884975 2572 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 20:11:48.888017 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884977 2572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 20:11:48.888017 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884980 2572 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 20:11:48.888017 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884983 2572 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 20:11:48.888017 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884986 2572 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 20:11:48.888017 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884989 2572 flags.go:64] FLAG: --feature-gates=""
Apr 16 20:11:48.888017 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884993 2572 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 20:11:48.888017 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884996 2572 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 20:11:48.888017 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.884999 2572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 20:11:48.888017 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885002 2572 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 20:11:48.888017 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885005 2572 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 20:11:48.888017 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885009 2572 flags.go:64] FLAG: --help="false"
Apr 16 20:11:48.888017 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885011 2572 flags.go:64] FLAG: --hostname-override="ip-10-0-138-62.ec2.internal"
Apr 16 20:11:48.888017 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885014 2572 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 16 20:11:48.888017 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885018 2572 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 16 20:11:48.888610 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885021 2572 flags.go:64] FLAG:
--image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 20:11:48.888610 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885024 2572 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 20:11:48.888610 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885027 2572 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 20:11:48.888610 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885030 2572 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 20:11:48.888610 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885033 2572 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 20:11:48.888610 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885036 2572 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 20:11:48.888610 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885041 2572 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 20:11:48.888610 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885044 2572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 20:11:48.888610 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885047 2572 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 20:11:48.888610 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885049 2572 flags.go:64] FLAG: --kube-reserved="" Apr 16 20:11:48.888610 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885052 2572 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 20:11:48.888610 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885055 2572 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 20:11:48.888610 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885059 2572 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 20:11:48.888610 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885062 2572 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 20:11:48.888610 ip-10-0-138-62 
kubenswrapper[2572]: I0416 20:11:48.885064 2572 flags.go:64] FLAG: --lock-file="" Apr 16 20:11:48.888610 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885067 2572 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 20:11:48.888610 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885070 2572 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 20:11:48.888610 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885073 2572 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 20:11:48.888610 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885079 2572 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 20:11:48.888610 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885081 2572 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 20:11:48.888610 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885084 2572 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 20:11:48.888610 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885087 2572 flags.go:64] FLAG: --logging-format="text" Apr 16 20:11:48.888610 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885090 2572 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 20:11:48.889194 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885093 2572 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 20:11:48.889194 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885096 2572 flags.go:64] FLAG: --manifest-url="" Apr 16 20:11:48.889194 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885099 2572 flags.go:64] FLAG: --manifest-url-header="" Apr 16 20:11:48.889194 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885103 2572 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 20:11:48.889194 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885106 2572 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 20:11:48.889194 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885110 2572 flags.go:64] FLAG: --max-pods="110" Apr 16 20:11:48.889194 
ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885113 2572 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 20:11:48.889194 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885116 2572 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 20:11:48.889194 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885119 2572 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 20:11:48.889194 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885122 2572 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 20:11:48.889194 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885125 2572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 20:11:48.889194 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885128 2572 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 20:11:48.889194 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885131 2572 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 20:11:48.889194 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885140 2572 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 20:11:48.889194 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885143 2572 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 20:11:48.889194 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885146 2572 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 20:11:48.889194 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885150 2572 flags.go:64] FLAG: --pod-cidr="" Apr 16 20:11:48.889194 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885153 2572 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 20:11:48.889194 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885158 2572 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 20:11:48.889194 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885161 2572 flags.go:64] 
FLAG: --pod-max-pids="-1" Apr 16 20:11:48.889194 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885164 2572 flags.go:64] FLAG: --pods-per-core="0" Apr 16 20:11:48.889194 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885167 2572 flags.go:64] FLAG: --port="10250" Apr 16 20:11:48.889194 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885170 2572 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 20:11:48.889194 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885173 2572 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-03dd3d1e77e366eac" Apr 16 20:11:48.889762 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885177 2572 flags.go:64] FLAG: --qos-reserved="" Apr 16 20:11:48.889762 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885180 2572 flags.go:64] FLAG: --read-only-port="10255" Apr 16 20:11:48.889762 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885183 2572 flags.go:64] FLAG: --register-node="true" Apr 16 20:11:48.889762 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885186 2572 flags.go:64] FLAG: --register-schedulable="true" Apr 16 20:11:48.889762 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885189 2572 flags.go:64] FLAG: --register-with-taints="" Apr 16 20:11:48.889762 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885192 2572 flags.go:64] FLAG: --registry-burst="10" Apr 16 20:11:48.889762 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885195 2572 flags.go:64] FLAG: --registry-qps="5" Apr 16 20:11:48.889762 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885198 2572 flags.go:64] FLAG: --reserved-cpus="" Apr 16 20:11:48.889762 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885201 2572 flags.go:64] FLAG: --reserved-memory="" Apr 16 20:11:48.889762 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885204 2572 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 20:11:48.889762 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885207 2572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 
20:11:48.889762 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885210 2572 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 20:11:48.889762 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885213 2572 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 20:11:48.889762 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885216 2572 flags.go:64] FLAG: --runonce="false" Apr 16 20:11:48.889762 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885219 2572 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 20:11:48.889762 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885222 2572 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 20:11:48.889762 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885224 2572 flags.go:64] FLAG: --seccomp-default="false" Apr 16 20:11:48.889762 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885227 2572 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 20:11:48.889762 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885230 2572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 20:11:48.889762 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885237 2572 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 20:11:48.889762 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885240 2572 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 20:11:48.889762 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885243 2572 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 20:11:48.889762 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885246 2572 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 20:11:48.889762 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885249 2572 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 20:11:48.889762 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885252 2572 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 20:11:48.889762 ip-10-0-138-62 kubenswrapper[2572]: I0416 
20:11:48.885256 2572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 20:11:48.890432 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885259 2572 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 20:11:48.890432 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885262 2572 flags.go:64] FLAG: --system-cgroups="" Apr 16 20:11:48.890432 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885265 2572 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 20:11:48.890432 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885271 2572 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 20:11:48.890432 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885274 2572 flags.go:64] FLAG: --tls-cert-file="" Apr 16 20:11:48.890432 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885277 2572 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 20:11:48.890432 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885281 2572 flags.go:64] FLAG: --tls-min-version="" Apr 16 20:11:48.890432 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885283 2572 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 20:11:48.890432 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885286 2572 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 20:11:48.890432 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885289 2572 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 20:11:48.890432 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885292 2572 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 20:11:48.890432 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885295 2572 flags.go:64] FLAG: --v="2" Apr 16 20:11:48.890432 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885300 2572 flags.go:64] FLAG: --version="false" Apr 16 20:11:48.890432 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885304 2572 flags.go:64] FLAG: --vmodule="" Apr 16 20:11:48.890432 ip-10-0-138-62 kubenswrapper[2572]: I0416 
20:11:48.885308 2572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 20:11:48.890432 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.885312 2572 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 20:11:48.890432 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885404 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 20:11:48.890432 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885408 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 20:11:48.890432 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885411 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 20:11:48.890432 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885414 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 20:11:48.890432 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885419 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 20:11:48.890432 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885422 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 20:11:48.890432 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885425 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 20:11:48.891105 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885428 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 20:11:48.891105 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885431 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 20:11:48.891105 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885435 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 20:11:48.891105 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885437 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 20:11:48.891105 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885440 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 20:11:48.891105 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885443 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 20:11:48.891105 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885445 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 20:11:48.891105 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885448 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 20:11:48.891105 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885451 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 20:11:48.891105 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885453 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 20:11:48.891105 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885456 2572 feature_gate.go:328] unrecognized feature gate: 
NoRegistryClusterOperations Apr 16 20:11:48.891105 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885459 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 20:11:48.891105 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885464 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 20:11:48.891105 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885466 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 20:11:48.891105 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885469 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 20:11:48.891105 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885472 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 20:11:48.891105 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885474 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 20:11:48.891105 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885477 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 20:11:48.891105 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885479 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 20:11:48.891105 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885481 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 20:11:48.891630 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885484 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 20:11:48.891630 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885488 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 20:11:48.891630 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885491 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 20:11:48.891630 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885494 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 20:11:48.891630 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885497 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 20:11:48.891630 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885499 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 20:11:48.891630 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885502 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 16 20:11:48.891630 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885505 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 20:11:48.891630 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885507 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 20:11:48.891630 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885510 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 20:11:48.891630 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885512 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 20:11:48.891630 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885515 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 20:11:48.891630 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885517 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 20:11:48.891630 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885520 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 20:11:48.891630 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885523 2572 feature_gate.go:328] unrecognized feature gate: 
ImageStreamImportMode Apr 16 20:11:48.891630 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885526 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 20:11:48.891630 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885528 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 20:11:48.891630 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885531 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 20:11:48.891630 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885534 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 20:11:48.891630 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885536 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 20:11:48.892190 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885539 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 20:11:48.892190 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885541 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 20:11:48.892190 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885545 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 20:11:48.892190 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885547 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 20:11:48.892190 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885551 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 20:11:48.892190 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885553 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 20:11:48.892190 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885556 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 20:11:48.892190 ip-10-0-138-62 
kubenswrapper[2572]: W0416 20:11:48.885558 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 20:11:48.892190 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885561 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 20:11:48.892190 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885563 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 20:11:48.892190 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885566 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 20:11:48.892190 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885568 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 20:11:48.892190 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885570 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 20:11:48.892190 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885573 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 20:11:48.892190 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885575 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 20:11:48.892190 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885578 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 20:11:48.892190 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885580 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 20:11:48.892190 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885583 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 20:11:48.892190 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885586 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 20:11:48.892190 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885588 2572 feature_gate.go:328] unrecognized feature gate: 
OpenShiftPodSecurityAdmission Apr 16 20:11:48.892681 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885590 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 20:11:48.892681 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885593 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 20:11:48.892681 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885595 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 20:11:48.892681 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885598 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 20:11:48.892681 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885600 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 20:11:48.892681 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885603 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 20:11:48.892681 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885607 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 20:11:48.892681 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885609 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 20:11:48.892681 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885611 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 20:11:48.892681 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885614 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 20:11:48.892681 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885617 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 20:11:48.892681 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885620 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 20:11:48.892681 ip-10-0-138-62 
kubenswrapper[2572]: W0416 20:11:48.885622 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:11:48.892681 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885625 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:11:48.892681 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885627 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:11:48.892681 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885631 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:11:48.892681 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885634 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:11:48.892681 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885637 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:11:48.892681 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.885640 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:11:48.893163 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.886223 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 20:11:48.893163 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.892615 2572 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 20:11:48.893163 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.892631 2572 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 20:11:48.893163 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892678 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:11:48.893163 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892683 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:11:48.893163 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892687 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:11:48.893163 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892690 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:11:48.893163 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892694 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:11:48.893163 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892697 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:11:48.893163 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892700 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:11:48.893163 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892703 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:11:48.893163 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892707 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:11:48.893163 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892709 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:11:48.893163 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892712 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:11:48.893163 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892715 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:11:48.893163 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892717 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:11:48.893553 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892720 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:11:48.893553 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892722 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:11:48.893553 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892725 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:11:48.893553 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892727 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:11:48.893553 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892730 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:11:48.893553 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892733 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:11:48.893553 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892736 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:11:48.893553 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892738 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:11:48.893553 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892741 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:11:48.893553 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892743 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:11:48.893553 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892746 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:11:48.893553 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892749 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:11:48.893553 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892752 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:11:48.893553 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892755 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:11:48.893553 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892758 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:11:48.893553 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892760 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:11:48.893553 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892763 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:11:48.893553 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892765 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:11:48.893553 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892769 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:11:48.893553 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892772 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:11:48.894102 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892775 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:11:48.894102 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892777 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:11:48.894102 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892780 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:11:48.894102 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892783 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:11:48.894102 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892785 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:11:48.894102 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892788 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:11:48.894102 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892791 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:11:48.894102 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892795 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:11:48.894102 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892798 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:11:48.894102 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892801 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:11:48.894102 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892804 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:11:48.894102 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892806 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:11:48.894102 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892809 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:11:48.894102 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892811 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:11:48.894102 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892814 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:11:48.894102 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892817 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:11:48.894102 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892820 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:11:48.894102 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892822 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:11:48.894102 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892824 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:11:48.894565 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892827 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:11:48.894565 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892829 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:11:48.894565 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892832 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:11:48.894565 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892835 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:11:48.894565 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892837 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:11:48.894565 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892841 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:11:48.894565 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892846 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:11:48.894565 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892849 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:11:48.894565 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892852 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:11:48.894565 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892855 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:11:48.894565 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892857 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:11:48.894565 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892860 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:11:48.894565 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892863 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:11:48.894565 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892865 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:11:48.894565 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892882 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:11:48.894565 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892885 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:11:48.894565 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892888 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:11:48.894565 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892891 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:11:48.894565 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892893 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:11:48.894565 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892896 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:11:48.895078 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892898 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:11:48.895078 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892901 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:11:48.895078 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892904 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:11:48.895078 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892906 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:11:48.895078 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892909 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:11:48.895078 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892911 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:11:48.895078 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892914 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:11:48.895078 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892916 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:11:48.895078 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892919 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:11:48.895078 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892922 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:11:48.895078 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892924 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:11:48.895078 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892927 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:11:48.895078 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892929 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:11:48.895078 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.892932 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:11:48.895078 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.892937 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 20:11:48.895445 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893030 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:11:48.895445 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893033 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:11:48.895445 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893036 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:11:48.895445 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893039 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:11:48.895445 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893042 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:11:48.895445 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893045 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:11:48.895445 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893048 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:11:48.895445 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893051 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:11:48.895445 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893054 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:11:48.895445 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893057 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:11:48.895445 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893060 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:11:48.895445 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893063 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:11:48.895445 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893065 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:11:48.895445 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893068 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:11:48.895445 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893071 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:11:48.895445 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893074 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:11:48.895445 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893077 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:11:48.895445 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893079 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:11:48.895445 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893082 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:11:48.896008 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893084 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:11:48.896008 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893087 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:11:48.896008 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893089 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:11:48.896008 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893092 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:11:48.896008 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893094 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:11:48.896008 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893097 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:11:48.896008 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893100 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:11:48.896008 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893103 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:11:48.896008 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893105 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:11:48.896008 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893107 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:11:48.896008 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893110 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:11:48.896008 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893112 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:11:48.896008 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893115 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:11:48.896008 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893117 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:11:48.896008 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893120 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:11:48.896008 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893122 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:11:48.896008 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893125 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:11:48.896008 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893127 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:11:48.896008 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893130 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:11:48.896008 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893133 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:11:48.896497 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893135 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:11:48.896497 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893139 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:11:48.896497 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893141 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:11:48.896497 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893144 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:11:48.896497 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893146 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:11:48.896497 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893149 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:11:48.896497 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893153 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:11:48.896497 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893155 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:11:48.896497 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893158 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:11:48.896497 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893160 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:11:48.896497 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893164 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:11:48.896497 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893166 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:11:48.896497 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893169 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:11:48.896497 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893172 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:11:48.896497 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893174 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:11:48.896497 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893177 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:11:48.896497 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893181 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:11:48.896497 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893185 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 20:11:48.896497 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893187 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 20:11:48.897001 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893190 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 20:11:48.897001 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893193 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 20:11:48.897001 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893196 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 20:11:48.897001 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893199 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 20:11:48.897001 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893202 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 20:11:48.897001 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893205 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 20:11:48.897001 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893208 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 20:11:48.897001 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893210 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 20:11:48.897001 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893212 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 20:11:48.897001 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893215 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 20:11:48.897001 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893217 2572 feature_gate.go:328] unrecognized 
feature gate: AWSClusterHostedDNSInstall Apr 16 20:11:48.897001 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893220 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 20:11:48.897001 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893223 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 20:11:48.897001 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893225 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 20:11:48.897001 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893229 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 20:11:48.897001 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893231 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 20:11:48.897001 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893234 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 20:11:48.897001 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893237 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 20:11:48.897001 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893240 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 20:11:48.897001 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893242 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 20:11:48.897528 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893245 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 20:11:48.897528 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893247 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 20:11:48.897528 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893250 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 20:11:48.897528 ip-10-0-138-62 kubenswrapper[2572]: W0416 
20:11:48.893252 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 20:11:48.897528 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893255 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 20:11:48.897528 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893257 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 20:11:48.897528 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893260 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 20:11:48.897528 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:48.893262 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 20:11:48.897528 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.893267 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 20:11:48.897528 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.893925 2572 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 20:11:48.897528 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.896498 2572 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 20:11:48.897528 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.897327 2572 server.go:1019] "Starting client certificate rotation" Apr 16 20:11:48.897528 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.897416 2572 certificate_manager.go:422] "Certificate rotation is enabled" 
logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 20:11:48.897528 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.897453 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 20:11:48.921362 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.921344 2572 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 20:11:48.926139 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.926117 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 20:11:48.942964 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.942926 2572 log.go:25] "Validated CRI v1 runtime API" Apr 16 20:11:48.949621 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.949607 2572 log.go:25] "Validated CRI v1 image API" Apr 16 20:11:48.949714 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.949687 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 20:11:48.951233 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.951216 2572 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 20:11:48.955743 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.955720 2572 fs.go:135] Filesystem UUIDs: map[0670cc68-6af6-41fc-8d8c-032c4335a5e0:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 cec15407-3bc8-4d2a-8ff5-2f7778291b15:/dev/nvme0n1p4] Apr 16 20:11:48.955803 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.955743 2572 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 
minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 20:11:48.961271 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.961157 2572 manager.go:217] Machine: {Timestamp:2026-04-16 20:11:48.959386368 +0000 UTC m=+0.396374867 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100559 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec216ecc89427d4744e81ee3ebd8e5b5 SystemUUID:ec216ecc-8942-7d47-44e8-1ee3ebd8e5b5 BootID:c4fd33ec-3212-4c26-b1da-d72ecd75063d Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:76:7b:36:98:d3 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:76:7b:36:98:d3 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:2a:45:0d:57:d6:a5 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] 
UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 20:11:48.961271 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.961267 2572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 16 20:11:48.961402 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.961386 2572 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 20:11:48.962398 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.962371 2572 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 20:11:48.962529 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.962397 2572 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ip-10-0-138-62.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 20:11:48.962581 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.962538 2572 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 20:11:48.962581 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.962546 2572 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 20:11:48.962581 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.962560 2572
manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 20:11:48.963510 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.963500 2572 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 20:11:48.964604 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.964595 2572 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 20:11:48.964715 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.964706 2572 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 20:11:48.966697 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.966682 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-pb4fc"
Apr 16 20:11:48.966751 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.966743 2572 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 20:11:48.966785 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.966756 2572 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 20:11:48.966785 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.966770 2572 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 20:11:48.966785 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.966778 2572 kubelet.go:397] "Adding apiserver pod source"
Apr 16 20:11:48.966911 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.966786 2572 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 20:11:48.968136 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.968123 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 20:11:48.968183 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.968143 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 20:11:48.970852 ip-10-0-138-62
kubenswrapper[2572]: I0416 20:11:48.970836 2572 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 20:11:48.972019 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.972004 2572 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 20:11:48.972098 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.972079 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-pb4fc"
Apr 16 20:11:48.973552 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.973538 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 20:11:48.973552 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.973555 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 20:11:48.973649 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.973561 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 20:11:48.973649 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.973568 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 20:11:48.973649 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.973574 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 20:11:48.973649 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.973579 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 20:11:48.973649 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.973585 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 20:11:48.973649 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.973590 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 20:11:48.973649 ip-10-0-138-62 kubenswrapper[2572]: I0416
20:11:48.973609 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 20:11:48.973649 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.973616 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 20:11:48.973649 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.973632 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 20:11:48.973649 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.973641 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 20:11:48.975128 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.975116 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 20:11:48.975128 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.975127 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 20:11:48.978427 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.978410 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 20:11:48.978663 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.978649 2572 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 20:11:48.978729 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.978688 2572 server.go:1295] "Started kubelet"
Apr 16 20:11:48.978827 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.978801 2572 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 20:11:48.978895 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.978816 2572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 20:11:48.978895 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.978863 2572 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 20:11:48.979399 ip-10-0-138-62 systemd[1]: Started Kubernetes Kubelet.
Apr 16 20:11:48.980618 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.980600 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 20:11:48.983670 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.983647 2572 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 20:11:48.984013 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.983999 2572 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 20:11:48.986577 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.986548 2572 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-138-62.ec2.internal" not found
Apr 16 20:11:48.988595 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.988576 2572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 20:11:48.988595 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.988588 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 20:11:48.989895 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.989295 2572 factory.go:55] Registering systemd factory
Apr 16 20:11:48.989895 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.989373 2572 factory.go:223] Registration of the systemd container factory successfully
Apr 16 20:11:48.989895 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.989584 2572 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 20:11:48.989895 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.989603 2572 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 20:11:48.989895 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.989628 2572 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 20:11:48.989895 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.989713 2572 factory.go:153] Registering CRI-O factory
Apr 16 20:11:48.989895 ip-10-0-138-62 kubenswrapper[2572]:
I0416 20:11:48.989752 2572 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 20:11:48.989895 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.989762 2572 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 20:11:48.989895 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.989774 2572 factory.go:223] Registration of the crio container factory successfully
Apr 16 20:11:48.989895 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.989820 2572 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 20:11:48.989895 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.989844 2572 factory.go:103] Registering Raw factory
Apr 16 20:11:48.989895 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.989857 2572 manager.go:1196] Started watching for new ooms in manager
Apr 16 20:11:48.990458 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:48.989938 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-62.ec2.internal\" not found"
Apr 16 20:11:48.990458 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.990329 2572 manager.go:319] Starting recovery of all containers
Apr 16 20:11:48.991039 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.991021 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 20:11:48.991221 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:48.991205 2572 kubelet.go:1618] "Image garbage collection failed once.
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 20:11:48.996251 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:48.996081 2572 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-138-62.ec2.internal\" not found" node="ip-10-0-138-62.ec2.internal"
Apr 16 20:11:48.999807 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:48.999793 2572 manager.go:324] Recovery completed
Apr 16 20:11:49.001363 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.001346 2572 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-138-62.ec2.internal" not found
Apr 16 20:11:49.003671 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.003658 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 20:11:49.005751 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.005736 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-62.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 20:11:49.005821 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.005761 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-62.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 20:11:49.005821 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.005772 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-62.ec2.internal" event="NodeHasSufficientPID"
Apr 16 20:11:49.006248 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.006236 2572 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 20:11:49.006248 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.006246 2572 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 20:11:49.006341 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.006264 2572 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 20:11:49.008689 ip-10-0-138-62 kubenswrapper[2572]:
I0416 20:11:49.008677 2572 policy_none.go:49] "None policy: Start"
Apr 16 20:11:49.008732 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.008693 2572 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 20:11:49.008732 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.008703 2572 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 20:11:49.043466 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.043451 2572 manager.go:341] "Starting Device Plugin manager"
Apr 16 20:11:49.043550 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:49.043478 2572 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 20:11:49.043550 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.043487 2572 server.go:85] "Starting device plugin registration server"
Apr 16 20:11:49.043732 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.043718 2572 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 20:11:49.043807 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.043734 2572 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 20:11:49.043862 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.043807 2572 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 20:11:49.043935 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.043905 2572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 20:11:49.043935 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.043916 2572 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 20:11:49.044429 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:49.044409 2572 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring."
err="non-existent label \"crio-containers\""
Apr 16 20:11:49.044527 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:49.044450 2572 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-62.ec2.internal\" not found"
Apr 16 20:11:49.058006 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.057991 2572 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-138-62.ec2.internal" not found
Apr 16 20:11:49.142554 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.142505 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 20:11:49.143739 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.143719 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 20:11:49.143818 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.143743 2572 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 20:11:49.143818 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.143758 2572 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 20:11:49.143818 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.143765 2572 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 20:11:49.143818 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:49.143796 2572 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 20:11:49.144025 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.143826 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 20:11:49.144845 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.144820 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-62.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 20:11:49.144954 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.144855 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-62.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 20:11:49.144954 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.144866 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-62.ec2.internal" event="NodeHasSufficientPID"
Apr 16 20:11:49.144954 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.144910 2572 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-62.ec2.internal"
Apr 16 20:11:49.145979 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.145964 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 20:11:49.153615 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.153600 2572 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-62.ec2.internal"
Apr 16 20:11:49.153696 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:49.153618 2572 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-62.ec2.internal\": node \"ip-10-0-138-62.ec2.internal\" not found"
Apr 16 20:11:49.244380
ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.244317 2572 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-138-62.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-62.ec2.internal"]
Apr 16 20:11:49.246614 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.246598 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-62.ec2.internal"
Apr 16 20:11:49.246696 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.246601 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-62.ec2.internal"
Apr 16 20:11:49.273422 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.273400 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-62.ec2.internal"
Apr 16 20:11:49.276770 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.276756 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-62.ec2.internal"
Apr 16 20:11:49.286262 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.286249 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 20:11:49.286345 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.286248 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 20:11:49.291926 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.291911 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/025d9da832e6f15c9967affc15b6b9e5-config\") pod
\"kube-apiserver-proxy-ip-10-0-138-62.ec2.internal\" (UID: \"025d9da832e6f15c9967affc15b6b9e5\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-62.ec2.internal"
Apr 16 20:11:49.291971 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.291934 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/30cfa87f4eb69b9341194428842c4154-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-62.ec2.internal\" (UID: \"30cfa87f4eb69b9341194428842c4154\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-62.ec2.internal"
Apr 16 20:11:49.291971 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.291954 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/30cfa87f4eb69b9341194428842c4154-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-62.ec2.internal\" (UID: \"30cfa87f4eb69b9341194428842c4154\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-62.ec2.internal"
Apr 16 20:11:49.392369 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.392347 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/025d9da832e6f15c9967affc15b6b9e5-config\") pod \"kube-apiserver-proxy-ip-10-0-138-62.ec2.internal\" (UID: \"025d9da832e6f15c9967affc15b6b9e5\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-62.ec2.internal"
Apr 16 20:11:49.392369 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.392371 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/30cfa87f4eb69b9341194428842c4154-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-62.ec2.internal\" (UID: \"30cfa87f4eb69b9341194428842c4154\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-62.ec2.internal" Apr
16 20:11:49.392507 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.392388 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/30cfa87f4eb69b9341194428842c4154-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-62.ec2.internal\" (UID: \"30cfa87f4eb69b9341194428842c4154\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-62.ec2.internal"
Apr 16 20:11:49.392507 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.392442 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/025d9da832e6f15c9967affc15b6b9e5-config\") pod \"kube-apiserver-proxy-ip-10-0-138-62.ec2.internal\" (UID: \"025d9da832e6f15c9967affc15b6b9e5\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-62.ec2.internal"
Apr 16 20:11:49.392507 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.392460 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/30cfa87f4eb69b9341194428842c4154-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-62.ec2.internal\" (UID: \"30cfa87f4eb69b9341194428842c4154\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-62.ec2.internal"
Apr 16 20:11:49.392507 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.392442 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/30cfa87f4eb69b9341194428842c4154-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-62.ec2.internal\" (UID: \"30cfa87f4eb69b9341194428842c4154\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-62.ec2.internal"
Apr 16 20:11:49.588222 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.588198 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-62.ec2.internal"
Apr 16 20:11:49.589239 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.589212 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-62.ec2.internal"
Apr 16 20:11:49.897858 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.897790 2572 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 20:11:49.898423 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.897956 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 20:11:49.898423 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.897956 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 20:11:49.898423 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.897963 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 20:11:49.967568 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.967540 2572 apiserver.go:52] "Watching apiserver"
Apr 16 20:11:49.972788 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.972770 2572 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 20:11:49.973162 ip-10-0-138-62 kubenswrapper[2572]: I0416
20:11:49.973141 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-4fgxn","openshift-network-diagnostics/network-check-target-86kzx","openshift-network-operator/iptables-alerter-rqszv","kube-system/konnectivity-agent-fw6bf","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9z9lw","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-62.ec2.internal","openshift-multus/multus-additional-cni-plugins-l75p2","openshift-multus/multus-lc4v5","openshift-multus/network-metrics-daemon-5bd8l","openshift-ovn-kubernetes/ovnkube-node-9gf47","kube-system/kube-apiserver-proxy-ip-10-0-138-62.ec2.internal","openshift-cluster-node-tuning-operator/tuned-np7l6","openshift-dns/node-resolver-d277z"]
Apr 16 20:11:49.973752 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.973729 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 20:06:48 +0000 UTC" deadline="2027-11-16 23:46:14.143613802 +0000 UTC"
Apr 16 20:11:49.973787 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.973753 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13899h34m24.169862677s"
Apr 16 20:11:49.974573 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.974558 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-image-registry/node-ca-4fgxn"
Apr 16 20:11:49.976575 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.976556 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 20:11:49.976688 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.976586 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 20:11:49.976688 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.976629 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 20:11:49.976688 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.976663 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-sz2sq\""
Apr 16 20:11:49.977997 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.977981 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-86kzx"
Apr 16 20:11:49.978076 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.978062 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-rqszv"
Apr 16 20:11:49.978136 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:49.978069 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-86kzx" podUID="1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9"
Apr 16 20:11:49.978962 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.978946 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/konnectivity-agent-fw6bf"
Apr 16 20:11:49.979746 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.979726 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 20:11:49.979846 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.979788 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 20:11:49.979846 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.979822 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-vv7fc\""
Apr 16 20:11:49.979977 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.979889 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 20:11:49.980262 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.980242 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9z9lw"
Apr 16 20:11:49.980913 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.980896 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-h7ffq\""
Apr 16 20:11:49.981005 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.980925 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 20:11:49.981005 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.980937 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 20:11:49.981325 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.981309 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-l75p2"
Apr 16 20:11:49.982008 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.981990 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-cr48n\""
Apr 16 20:11:49.982146 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.982008 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 20:11:49.982146 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.982011 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 20:11:49.982146 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.982038 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 20:11:49.982397 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.982382 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/multus-lc4v5" Apr 16 20:11:49.982994 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.982975 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 20:11:49.983077 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.983051 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 20:11:49.983119 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.983081 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 20:11:49.983384 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.983371 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 20:11:49.983965 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.983744 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bd8l" Apr 16 20:11:49.983965 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.983928 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 20:11:49.983965 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:49.983957 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5bd8l" podUID="8e680955-60f7-4aaf-9aeb-b5efc9759ed4" Apr 16 20:11:49.985017 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.984658 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-tl9k7\"" Apr 16 20:11:49.985124 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.985019 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 20:11:49.985124 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.985083 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-phqrk\"" Apr 16 20:11:49.987258 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.987237 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" Apr 16 20:11:49.988389 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.988370 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-np7l6" Apr 16 20:11:49.988671 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.988659 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 20:11:49.989010 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.988991 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 20:11:49.989104 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.989086 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 20:11:49.989162 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.989145 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-bm2x5\"" Apr 16 20:11:49.989378 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.989361 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 20:11:49.989471 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.989398 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 20:11:49.989471 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.989435 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-d277z" Apr 16 20:11:49.989471 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.989440 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 20:11:49.989635 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.989519 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 20:11:49.990176 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.990158 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 20:11:49.990577 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.990529 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-wpxkn\"" Apr 16 20:11:49.990684 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.990665 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 20:11:49.990805 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.990726 2572 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 20:11:49.991409 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.991384 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-pvnl8\"" Apr 16 20:11:49.991409 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.991399 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 20:11:49.991520 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.991436 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 
16 20:11:49.993842 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.993821 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a880ce58-1475-4673-ae27-ce861b50e3bd-sys-fs\") pod \"aws-ebs-csi-driver-node-9z9lw\" (UID: \"a880ce58-1475-4673-ae27-ce861b50e3bd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9z9lw" Apr 16 20:11:49.993935 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.993855 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9a0139ad-ee3c-4847-b5d9-76270b598854-etc-sysctl-d\") pod \"tuned-np7l6\" (UID: \"9a0139ad-ee3c-4847-b5d9-76270b598854\") " pod="openshift-cluster-node-tuning-operator/tuned-np7l6" Apr 16 20:11:49.993935 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.993894 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a0139ad-ee3c-4847-b5d9-76270b598854-sys\") pod \"tuned-np7l6\" (UID: \"9a0139ad-ee3c-4847-b5d9-76270b598854\") " pod="openshift-cluster-node-tuning-operator/tuned-np7l6" Apr 16 20:11:49.993935 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.993913 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-host-run-k8s-cni-cncf-io\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5" Apr 16 20:11:49.993935 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.993927 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-host-cni-netd\") pod \"ovnkube-node-9gf47\" (UID: 
\"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" Apr 16 20:11:49.994112 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.993943 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/10232a70-9b5f-414a-8efd-b5cff05a4f12-env-overrides\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" Apr 16 20:11:49.994112 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.993958 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-system-cni-dir\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5" Apr 16 20:11:49.994112 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.993977 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-host-run-netns\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" Apr 16 20:11:49.994112 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.993999 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a880ce58-1475-4673-ae27-ce861b50e3bd-socket-dir\") pod \"aws-ebs-csi-driver-node-9z9lw\" (UID: \"a880ce58-1475-4673-ae27-ce861b50e3bd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9z9lw" Apr 16 20:11:49.994112 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994031 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-os-release\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5" Apr 16 20:11:49.994112 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994052 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-run-systemd\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" Apr 16 20:11:49.994112 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994076 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a880ce58-1475-4673-ae27-ce861b50e3bd-etc-selinux\") pod \"aws-ebs-csi-driver-node-9z9lw\" (UID: \"a880ce58-1475-4673-ae27-ce861b50e3bd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9z9lw" Apr 16 20:11:49.994370 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994114 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9a0139ad-ee3c-4847-b5d9-76270b598854-tmp\") pod \"tuned-np7l6\" (UID: \"9a0139ad-ee3c-4847-b5d9-76270b598854\") " pod="openshift-cluster-node-tuning-operator/tuned-np7l6" Apr 16 20:11:49.994370 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994140 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp5lv\" (UniqueName: \"kubernetes.io/projected/1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9-kube-api-access-sp5lv\") pod \"network-check-target-86kzx\" (UID: \"1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9\") " pod="openshift-network-diagnostics/network-check-target-86kzx" Apr 16 20:11:49.994370 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994164 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/39673bf6-83cf-45c6-9476-3700a9d91e35-system-cni-dir\") pod \"multus-additional-cni-plugins-l75p2\" (UID: \"39673bf6-83cf-45c6-9476-3700a9d91e35\") " pod="openshift-multus/multus-additional-cni-plugins-l75p2" Apr 16 20:11:49.994370 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994180 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-var-lib-openvswitch\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" Apr 16 20:11:49.994370 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994193 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-node-log\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" Apr 16 20:11:49.994370 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994207 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-host-cni-bin\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" Apr 16 20:11:49.994370 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994224 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a0139ad-ee3c-4847-b5d9-76270b598854-host\") pod \"tuned-np7l6\" (UID: \"9a0139ad-ee3c-4847-b5d9-76270b598854\") " 
pod="openshift-cluster-node-tuning-operator/tuned-np7l6" Apr 16 20:11:49.994370 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994262 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/39673bf6-83cf-45c6-9476-3700a9d91e35-cnibin\") pod \"multus-additional-cni-plugins-l75p2\" (UID: \"39673bf6-83cf-45c6-9476-3700a9d91e35\") " pod="openshift-multus/multus-additional-cni-plugins-l75p2" Apr 16 20:11:49.994370 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994285 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-host-run-multus-certs\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5" Apr 16 20:11:49.994370 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994301 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-host-kubelet\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" Apr 16 20:11:49.994370 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994319 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a880ce58-1475-4673-ae27-ce861b50e3bd-registration-dir\") pod \"aws-ebs-csi-driver-node-9z9lw\" (UID: \"a880ce58-1475-4673-ae27-ce861b50e3bd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9z9lw" Apr 16 20:11:49.994370 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994342 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/9a0139ad-ee3c-4847-b5d9-76270b598854-etc-kubernetes\") pod \"tuned-np7l6\" (UID: \"9a0139ad-ee3c-4847-b5d9-76270b598854\") " pod="openshift-cluster-node-tuning-operator/tuned-np7l6" Apr 16 20:11:49.994370 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994371 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/39673bf6-83cf-45c6-9476-3700a9d91e35-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-l75p2\" (UID: \"39673bf6-83cf-45c6-9476-3700a9d91e35\") " pod="openshift-multus/multus-additional-cni-plugins-l75p2" Apr 16 20:11:49.994754 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994387 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-host-var-lib-cni-multus\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5" Apr 16 20:11:49.994754 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994401 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-systemd-units\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" Apr 16 20:11:49.994754 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994414 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/10232a70-9b5f-414a-8efd-b5cff05a4f12-ovn-node-metrics-cert\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" Apr 16 20:11:49.994754 ip-10-0-138-62 
kubenswrapper[2572]: I0416 20:11:49.994428 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/39673bf6-83cf-45c6-9476-3700a9d91e35-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-l75p2\" (UID: \"39673bf6-83cf-45c6-9476-3700a9d91e35\") " pod="openshift-multus/multus-additional-cni-plugins-l75p2" Apr 16 20:11:49.994754 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994453 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-etc-openvswitch\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" Apr 16 20:11:49.994754 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994473 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" Apr 16 20:11:49.994754 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994489 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96p42\" (UniqueName: \"kubernetes.io/projected/8e680955-60f7-4aaf-9aeb-b5efc9759ed4-kube-api-access-96p42\") pod \"network-metrics-daemon-5bd8l\" (UID: \"8e680955-60f7-4aaf-9aeb-b5efc9759ed4\") " pod="openshift-multus/network-metrics-daemon-5bd8l" Apr 16 20:11:49.994754 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994508 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/9a0139ad-ee3c-4847-b5d9-76270b598854-run\") pod \"tuned-np7l6\" (UID: \"9a0139ad-ee3c-4847-b5d9-76270b598854\") " pod="openshift-cluster-node-tuning-operator/tuned-np7l6" Apr 16 20:11:49.994754 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994552 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9a0139ad-ee3c-4847-b5d9-76270b598854-etc-tuned\") pod \"tuned-np7l6\" (UID: \"9a0139ad-ee3c-4847-b5d9-76270b598854\") " pod="openshift-cluster-node-tuning-operator/tuned-np7l6" Apr 16 20:11:49.994754 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994567 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8072a527-0c85-4b4a-a30d-ee0ca50bec0a-tmp-dir\") pod \"node-resolver-d277z\" (UID: \"8072a527-0c85-4b4a-a30d-ee0ca50bec0a\") " pod="openshift-dns/node-resolver-d277z" Apr 16 20:11:49.994754 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994596 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bd52\" (UniqueName: \"kubernetes.io/projected/8072a527-0c85-4b4a-a30d-ee0ca50bec0a-kube-api-access-9bd52\") pod \"node-resolver-d277z\" (UID: \"8072a527-0c85-4b4a-a30d-ee0ca50bec0a\") " pod="openshift-dns/node-resolver-d277z" Apr 16 20:11:49.994754 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994612 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-cni-binary-copy\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5" Apr 16 20:11:49.994754 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994626 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-lfdjq\" (UniqueName: \"kubernetes.io/projected/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-kube-api-access-lfdjq\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5" Apr 16 20:11:49.994754 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994652 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9a0139ad-ee3c-4847-b5d9-76270b598854-etc-sysctl-conf\") pod \"tuned-np7l6\" (UID: \"9a0139ad-ee3c-4847-b5d9-76270b598854\") " pod="openshift-cluster-node-tuning-operator/tuned-np7l6" Apr 16 20:11:49.994754 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994671 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-host-run-netns\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5" Apr 16 20:11:49.994754 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994685 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b897f3d6-5177-401c-abab-c0301641c018-agent-certs\") pod \"konnectivity-agent-fw6bf\" (UID: \"b897f3d6-5177-401c-abab-c0301641c018\") " pod="kube-system/konnectivity-agent-fw6bf" Apr 16 20:11:49.995318 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994699 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b897f3d6-5177-401c-abab-c0301641c018-konnectivity-ca\") pod \"konnectivity-agent-fw6bf\" (UID: \"b897f3d6-5177-401c-abab-c0301641c018\") " pod="kube-system/konnectivity-agent-fw6bf" Apr 16 20:11:49.995318 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994715 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-etc-kubernetes\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5" Apr 16 20:11:49.995318 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994736 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-host-slash\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" Apr 16 20:11:49.995318 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994750 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zjhq\" (UniqueName: \"kubernetes.io/projected/10232a70-9b5f-414a-8efd-b5cff05a4f12-kube-api-access-7zjhq\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" Apr 16 20:11:49.995318 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994764 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-multus-conf-dir\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5" Apr 16 20:11:49.995318 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994791 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85b4z\" (UniqueName: \"kubernetes.io/projected/9a0139ad-ee3c-4847-b5d9-76270b598854-kube-api-access-85b4z\") pod \"tuned-np7l6\" (UID: \"9a0139ad-ee3c-4847-b5d9-76270b598854\") " 
pod="openshift-cluster-node-tuning-operator/tuned-np7l6" Apr 16 20:11:49.995318 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994810 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c470127f-e9ca-44ba-bcef-cc2cd68cdcdc-host\") pod \"node-ca-4fgxn\" (UID: \"c470127f-e9ca-44ba-bcef-cc2cd68cdcdc\") " pod="openshift-image-registry/node-ca-4fgxn" Apr 16 20:11:49.995318 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994824 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-host-var-lib-kubelet\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5" Apr 16 20:11:49.995318 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994838 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-hostroot\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5" Apr 16 20:11:49.995318 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994854 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-log-socket\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" Apr 16 20:11:49.995318 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994888 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frq7m\" (UniqueName: \"kubernetes.io/projected/a880ce58-1475-4673-ae27-ce861b50e3bd-kube-api-access-frq7m\") pod 
\"aws-ebs-csi-driver-node-9z9lw\" (UID: \"a880ce58-1475-4673-ae27-ce861b50e3bd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9z9lw" Apr 16 20:11:49.995318 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994904 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8e680955-60f7-4aaf-9aeb-b5efc9759ed4-metrics-certs\") pod \"network-metrics-daemon-5bd8l\" (UID: \"8e680955-60f7-4aaf-9aeb-b5efc9759ed4\") " pod="openshift-multus/network-metrics-daemon-5bd8l" Apr 16 20:11:49.995318 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994918 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9a0139ad-ee3c-4847-b5d9-76270b598854-etc-systemd\") pod \"tuned-np7l6\" (UID: \"9a0139ad-ee3c-4847-b5d9-76270b598854\") " pod="openshift-cluster-node-tuning-operator/tuned-np7l6" Apr 16 20:11:49.995318 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994932 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8072a527-0c85-4b4a-a30d-ee0ca50bec0a-hosts-file\") pod \"node-resolver-d277z\" (UID: \"8072a527-0c85-4b4a-a30d-ee0ca50bec0a\") " pod="openshift-dns/node-resolver-d277z" Apr 16 20:11:49.995318 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994945 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c470127f-e9ca-44ba-bcef-cc2cd68cdcdc-serviceca\") pod \"node-ca-4fgxn\" (UID: \"c470127f-e9ca-44ba-bcef-cc2cd68cdcdc\") " pod="openshift-image-registry/node-ca-4fgxn" Apr 16 20:11:49.995318 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.994988 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5f806ba8-a961-4ea6-8cf9-11c7a0ef33eb-iptables-alerter-script\") pod \"iptables-alerter-rqszv\" (UID: \"5f806ba8-a961-4ea6-8cf9-11c7a0ef33eb\") " pod="openshift-network-operator/iptables-alerter-rqszv"
Apr 16 20:11:49.995318 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.995016 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94kw2\" (UniqueName: \"kubernetes.io/projected/5f806ba8-a961-4ea6-8cf9-11c7a0ef33eb-kube-api-access-94kw2\") pod \"iptables-alerter-rqszv\" (UID: \"5f806ba8-a961-4ea6-8cf9-11c7a0ef33eb\") " pod="openshift-network-operator/iptables-alerter-rqszv"
Apr 16 20:11:49.995955 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.995040 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/39673bf6-83cf-45c6-9476-3700a9d91e35-cni-binary-copy\") pod \"multus-additional-cni-plugins-l75p2\" (UID: \"39673bf6-83cf-45c6-9476-3700a9d91e35\") " pod="openshift-multus/multus-additional-cni-plugins-l75p2"
Apr 16 20:11:49.995955 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.995067 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-host-run-ovn-kubernetes\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47"
Apr 16 20:11:49.995955 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.995087 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9a0139ad-ee3c-4847-b5d9-76270b598854-etc-modprobe-d\") pod \"tuned-np7l6\" (UID: \"9a0139ad-ee3c-4847-b5d9-76270b598854\") " pod="openshift-cluster-node-tuning-operator/tuned-np7l6"
Apr 16 20:11:49.995955 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.995113 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9a0139ad-ee3c-4847-b5d9-76270b598854-lib-modules\") pod \"tuned-np7l6\" (UID: \"9a0139ad-ee3c-4847-b5d9-76270b598854\") " pod="openshift-cluster-node-tuning-operator/tuned-np7l6"
Apr 16 20:11:49.995955 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.995136 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a0139ad-ee3c-4847-b5d9-76270b598854-var-lib-kubelet\") pod \"tuned-np7l6\" (UID: \"9a0139ad-ee3c-4847-b5d9-76270b598854\") " pod="openshift-cluster-node-tuning-operator/tuned-np7l6"
Apr 16 20:11:49.995955 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.995158 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/39673bf6-83cf-45c6-9476-3700a9d91e35-tuning-conf-dir\") pod \"multus-additional-cni-plugins-l75p2\" (UID: \"39673bf6-83cf-45c6-9476-3700a9d91e35\") " pod="openshift-multus/multus-additional-cni-plugins-l75p2"
Apr 16 20:11:49.995955 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.995175 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-multus-cni-dir\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5"
Apr 16 20:11:49.995955 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.995198 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-host-var-lib-cni-bin\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5"
Apr 16 20:11:49.995955 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.995218 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/10232a70-9b5f-414a-8efd-b5cff05a4f12-ovnkube-config\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47"
Apr 16 20:11:49.995955 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.995235 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/10232a70-9b5f-414a-8efd-b5cff05a4f12-ovnkube-script-lib\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47"
Apr 16 20:11:49.995955 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.995258 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5f806ba8-a961-4ea6-8cf9-11c7a0ef33eb-host-slash\") pod \"iptables-alerter-rqszv\" (UID: \"5f806ba8-a961-4ea6-8cf9-11c7a0ef33eb\") " pod="openshift-network-operator/iptables-alerter-rqszv"
Apr 16 20:11:49.995955 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.995280 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/39673bf6-83cf-45c6-9476-3700a9d91e35-os-release\") pod \"multus-additional-cni-plugins-l75p2\" (UID: \"39673bf6-83cf-45c6-9476-3700a9d91e35\") " pod="openshift-multus/multus-additional-cni-plugins-l75p2"
Apr 16 20:11:49.995955 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.995305 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-cnibin\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5"
Apr 16 20:11:49.995955 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.995326 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-run-ovn\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47"
Apr 16 20:11:49.995955 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.995348 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-multus-socket-dir-parent\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5"
Apr 16 20:11:49.995955 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.995364 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-multus-daemon-config\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5"
Apr 16 20:11:49.996534 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.995403 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-run-openvswitch\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47"
Apr 16 20:11:49.996534 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.995450 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a880ce58-1475-4673-ae27-ce861b50e3bd-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9z9lw\" (UID: \"a880ce58-1475-4673-ae27-ce861b50e3bd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9z9lw"
Apr 16 20:11:49.996534 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.995475 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a880ce58-1475-4673-ae27-ce861b50e3bd-device-dir\") pod \"aws-ebs-csi-driver-node-9z9lw\" (UID: \"a880ce58-1475-4673-ae27-ce861b50e3bd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9z9lw"
Apr 16 20:11:49.996534 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.995501 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9a0139ad-ee3c-4847-b5d9-76270b598854-etc-sysconfig\") pod \"tuned-np7l6\" (UID: \"9a0139ad-ee3c-4847-b5d9-76270b598854\") " pod="openshift-cluster-node-tuning-operator/tuned-np7l6"
Apr 16 20:11:49.996534 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.995529 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rwb4\" (UniqueName: \"kubernetes.io/projected/c470127f-e9ca-44ba-bcef-cc2cd68cdcdc-kube-api-access-9rwb4\") pod \"node-ca-4fgxn\" (UID: \"c470127f-e9ca-44ba-bcef-cc2cd68cdcdc\") " pod="openshift-image-registry/node-ca-4fgxn"
Apr 16 20:11:49.996534 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.995569 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85hdm\" (UniqueName: \"kubernetes.io/projected/39673bf6-83cf-45c6-9476-3700a9d91e35-kube-api-access-85hdm\") pod \"multus-additional-cni-plugins-l75p2\" (UID: \"39673bf6-83cf-45c6-9476-3700a9d91e35\") " pod="openshift-multus/multus-additional-cni-plugins-l75p2"
Apr 16 20:11:49.999820 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:49.999804 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 20:11:50.023721 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.023704 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-th9lt"
Apr 16 20:11:50.038523 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.038507 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-th9lt"
Apr 16 20:11:50.095767 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.095744 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a880ce58-1475-4673-ae27-ce861b50e3bd-device-dir\") pod \"aws-ebs-csi-driver-node-9z9lw\" (UID: \"a880ce58-1475-4673-ae27-ce861b50e3bd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9z9lw"
Apr 16 20:11:50.095902 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.095773 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9a0139ad-ee3c-4847-b5d9-76270b598854-etc-sysconfig\") pod \"tuned-np7l6\" (UID: \"9a0139ad-ee3c-4847-b5d9-76270b598854\") " pod="openshift-cluster-node-tuning-operator/tuned-np7l6"
Apr 16 20:11:50.095902 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.095789 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9rwb4\" (UniqueName: \"kubernetes.io/projected/c470127f-e9ca-44ba-bcef-cc2cd68cdcdc-kube-api-access-9rwb4\") pod \"node-ca-4fgxn\" (UID: \"c470127f-e9ca-44ba-bcef-cc2cd68cdcdc\") " pod="openshift-image-registry/node-ca-4fgxn"
Apr 16 20:11:50.095902 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.095810 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-85hdm\" (UniqueName: \"kubernetes.io/projected/39673bf6-83cf-45c6-9476-3700a9d91e35-kube-api-access-85hdm\") pod \"multus-additional-cni-plugins-l75p2\" (UID: \"39673bf6-83cf-45c6-9476-3700a9d91e35\") " pod="openshift-multus/multus-additional-cni-plugins-l75p2"
Apr 16 20:11:50.095902 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.095833 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a880ce58-1475-4673-ae27-ce861b50e3bd-sys-fs\") pod \"aws-ebs-csi-driver-node-9z9lw\" (UID: \"a880ce58-1475-4673-ae27-ce861b50e3bd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9z9lw"
Apr 16 20:11:50.095902 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.095884 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a880ce58-1475-4673-ae27-ce861b50e3bd-device-dir\") pod \"aws-ebs-csi-driver-node-9z9lw\" (UID: \"a880ce58-1475-4673-ae27-ce861b50e3bd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9z9lw"
Apr 16 20:11:50.095902 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.095901 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a880ce58-1475-4673-ae27-ce861b50e3bd-sys-fs\") pod \"aws-ebs-csi-driver-node-9z9lw\" (UID: \"a880ce58-1475-4673-ae27-ce861b50e3bd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9z9lw"
Apr 16 20:11:50.096197 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.095904 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9a0139ad-ee3c-4847-b5d9-76270b598854-etc-sysconfig\") pod \"tuned-np7l6\" (UID: \"9a0139ad-ee3c-4847-b5d9-76270b598854\") " pod="openshift-cluster-node-tuning-operator/tuned-np7l6"
Apr 16 20:11:50.096197 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.095937 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9a0139ad-ee3c-4847-b5d9-76270b598854-etc-sysctl-d\") pod \"tuned-np7l6\" (UID: \"9a0139ad-ee3c-4847-b5d9-76270b598854\") " pod="openshift-cluster-node-tuning-operator/tuned-np7l6"
Apr 16 20:11:50.096197 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.095986 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a0139ad-ee3c-4847-b5d9-76270b598854-sys\") pod \"tuned-np7l6\" (UID: \"9a0139ad-ee3c-4847-b5d9-76270b598854\") " pod="openshift-cluster-node-tuning-operator/tuned-np7l6"
Apr 16 20:11:50.096197 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096012 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-host-run-k8s-cni-cncf-io\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5"
Apr 16 20:11:50.096197 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096037 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-host-cni-netd\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47"
Apr 16 20:11:50.096197 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096058 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a0139ad-ee3c-4847-b5d9-76270b598854-sys\") pod \"tuned-np7l6\" (UID: \"9a0139ad-ee3c-4847-b5d9-76270b598854\") " pod="openshift-cluster-node-tuning-operator/tuned-np7l6"
Apr 16 20:11:50.096197 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096061 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/10232a70-9b5f-414a-8efd-b5cff05a4f12-env-overrides\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47"
Apr 16 20:11:50.096197 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096096 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-host-run-k8s-cni-cncf-io\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5"
Apr 16 20:11:50.096197 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096104 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9a0139ad-ee3c-4847-b5d9-76270b598854-etc-sysctl-d\") pod \"tuned-np7l6\" (UID: \"9a0139ad-ee3c-4847-b5d9-76270b598854\") " pod="openshift-cluster-node-tuning-operator/tuned-np7l6"
Apr 16 20:11:50.096197 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096099 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-system-cni-dir\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5"
Apr 16 20:11:50.096197 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096140 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-system-cni-dir\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5"
Apr 16 20:11:50.096197 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096159 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-host-run-netns\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47"
Apr 16 20:11:50.096197 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096170 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-host-cni-netd\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47"
Apr 16 20:11:50.096197 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096186 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a880ce58-1475-4673-ae27-ce861b50e3bd-socket-dir\") pod \"aws-ebs-csi-driver-node-9z9lw\" (UID: \"a880ce58-1475-4673-ae27-ce861b50e3bd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9z9lw"
Apr 16 20:11:50.097050 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096210 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-os-release\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5"
Apr 16 20:11:50.097050 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096228 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-host-run-netns\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47"
Apr 16 20:11:50.097050 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096261 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-run-systemd\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47"
Apr 16 20:11:50.097050 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096273 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-os-release\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5"
Apr 16 20:11:50.097050 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096289 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a880ce58-1475-4673-ae27-ce861b50e3bd-etc-selinux\") pod \"aws-ebs-csi-driver-node-9z9lw\" (UID: \"a880ce58-1475-4673-ae27-ce861b50e3bd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9z9lw"
Apr 16 20:11:50.097050 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096313 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a880ce58-1475-4673-ae27-ce861b50e3bd-socket-dir\") pod \"aws-ebs-csi-driver-node-9z9lw\" (UID: \"a880ce58-1475-4673-ae27-ce861b50e3bd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9z9lw"
Apr 16 20:11:50.097050 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096314 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-run-systemd\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47"
Apr 16 20:11:50.097050 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096326 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9a0139ad-ee3c-4847-b5d9-76270b598854-tmp\") pod \"tuned-np7l6\" (UID: \"9a0139ad-ee3c-4847-b5d9-76270b598854\") " pod="openshift-cluster-node-tuning-operator/tuned-np7l6"
Apr 16 20:11:50.097050 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096361 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a880ce58-1475-4673-ae27-ce861b50e3bd-etc-selinux\") pod \"aws-ebs-csi-driver-node-9z9lw\" (UID: \"a880ce58-1475-4673-ae27-ce861b50e3bd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9z9lw"
Apr 16 20:11:50.097050 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096365 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sp5lv\" (UniqueName: \"kubernetes.io/projected/1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9-kube-api-access-sp5lv\") pod \"network-check-target-86kzx\" (UID: \"1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9\") " pod="openshift-network-diagnostics/network-check-target-86kzx"
Apr 16 20:11:50.097050 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096399 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/39673bf6-83cf-45c6-9476-3700a9d91e35-system-cni-dir\") pod \"multus-additional-cni-plugins-l75p2\" (UID: \"39673bf6-83cf-45c6-9476-3700a9d91e35\") " pod="openshift-multus/multus-additional-cni-plugins-l75p2"
Apr 16 20:11:50.097050 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096426 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-var-lib-openvswitch\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47"
Apr 16 20:11:50.097050 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096449 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-node-log\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47"
Apr 16 20:11:50.097050 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096482 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-var-lib-openvswitch\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47"
Apr 16 20:11:50.097050 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096479 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-host-cni-bin\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47"
Apr 16 20:11:50.097050 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096512 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-host-cni-bin\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47"
Apr 16 20:11:50.097050 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096448 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/39673bf6-83cf-45c6-9476-3700a9d91e35-system-cni-dir\") pod \"multus-additional-cni-plugins-l75p2\" (UID: \"39673bf6-83cf-45c6-9476-3700a9d91e35\") " pod="openshift-multus/multus-additional-cni-plugins-l75p2"
Apr 16 20:11:50.097935 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096538 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a0139ad-ee3c-4847-b5d9-76270b598854-host\") pod \"tuned-np7l6\" (UID: \"9a0139ad-ee3c-4847-b5d9-76270b598854\") " pod="openshift-cluster-node-tuning-operator/tuned-np7l6"
Apr 16 20:11:50.097935 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096556 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/10232a70-9b5f-414a-8efd-b5cff05a4f12-env-overrides\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47"
Apr 16 20:11:50.097935 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096564 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/39673bf6-83cf-45c6-9476-3700a9d91e35-cnibin\") pod \"multus-additional-cni-plugins-l75p2\" (UID: \"39673bf6-83cf-45c6-9476-3700a9d91e35\") " pod="openshift-multus/multus-additional-cni-plugins-l75p2"
Apr 16 20:11:50.097935 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096564 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a0139ad-ee3c-4847-b5d9-76270b598854-host\") pod \"tuned-np7l6\" (UID: \"9a0139ad-ee3c-4847-b5d9-76270b598854\") " pod="openshift-cluster-node-tuning-operator/tuned-np7l6"
Apr 16 20:11:50.097935 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096603 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-host-run-multus-certs\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5"
Apr 16 20:11:50.097935 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096636 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/39673bf6-83cf-45c6-9476-3700a9d91e35-cnibin\") pod \"multus-additional-cni-plugins-l75p2\" (UID: \"39673bf6-83cf-45c6-9476-3700a9d91e35\") " pod="openshift-multus/multus-additional-cni-plugins-l75p2"
Apr 16 20:11:50.097935 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096650 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-host-kubelet\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47"
Apr 16 20:11:50.097935 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096631 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-node-log\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47"
Apr 16 20:11:50.097935 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096653 2572 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 20:11:50.097935 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096687 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a880ce58-1475-4673-ae27-ce861b50e3bd-registration-dir\") pod \"aws-ebs-csi-driver-node-9z9lw\" (UID: \"a880ce58-1475-4673-ae27-ce861b50e3bd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9z9lw"
Apr 16 20:11:50.097935 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096698 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-host-kubelet\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47"
Apr 16 20:11:50.097935 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096712 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a0139ad-ee3c-4847-b5d9-76270b598854-etc-kubernetes\") pod \"tuned-np7l6\" (UID: \"9a0139ad-ee3c-4847-b5d9-76270b598854\") " pod="openshift-cluster-node-tuning-operator/tuned-np7l6"
Apr 16 20:11:50.097935 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096737 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/39673bf6-83cf-45c6-9476-3700a9d91e35-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-l75p2\" (UID: \"39673bf6-83cf-45c6-9476-3700a9d91e35\") " pod="openshift-multus/multus-additional-cni-plugins-l75p2"
Apr 16 20:11:50.097935 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096637 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-host-run-multus-certs\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5"
Apr 16 20:11:50.097935 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096760 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-host-var-lib-cni-multus\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5"
Apr 16 20:11:50.097935 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096772 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a0139ad-ee3c-4847-b5d9-76270b598854-etc-kubernetes\") pod \"tuned-np7l6\" (UID: \"9a0139ad-ee3c-4847-b5d9-76270b598854\") " pod="openshift-cluster-node-tuning-operator/tuned-np7l6"
Apr 16 20:11:50.097935 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096783 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-systemd-units\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47"
Apr 16 20:11:50.097935 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096747 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a880ce58-1475-4673-ae27-ce861b50e3bd-registration-dir\") pod \"aws-ebs-csi-driver-node-9z9lw\" (UID: \"a880ce58-1475-4673-ae27-ce861b50e3bd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9z9lw"
Apr 16 20:11:50.098761 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096799 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-host-var-lib-cni-multus\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5"
Apr 16 20:11:50.098761 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096810 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-systemd-units\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47"
Apr 16 20:11:50.098761 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096831 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/10232a70-9b5f-414a-8efd-b5cff05a4f12-ovn-node-metrics-cert\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47"
Apr 16 20:11:50.098761 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096854 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/39673bf6-83cf-45c6-9476-3700a9d91e35-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-l75p2\" (UID: \"39673bf6-83cf-45c6-9476-3700a9d91e35\") " pod="openshift-multus/multus-additional-cni-plugins-l75p2"
Apr 16 20:11:50.098761 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096890 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-etc-openvswitch\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47"
Apr 16 20:11:50.098761 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096909 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47"
Apr 16 20:11:50.098761 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096931 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-96p42\" (UniqueName: \"kubernetes.io/projected/8e680955-60f7-4aaf-9aeb-b5efc9759ed4-kube-api-access-96p42\") pod \"network-metrics-daemon-5bd8l\" (UID: \"8e680955-60f7-4aaf-9aeb-b5efc9759ed4\") " pod="openshift-multus/network-metrics-daemon-5bd8l"
Apr 16 20:11:50.098761 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096954 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9a0139ad-ee3c-4847-b5d9-76270b598854-run\") pod \"tuned-np7l6\" (UID: \"9a0139ad-ee3c-4847-b5d9-76270b598854\") " pod="openshift-cluster-node-tuning-operator/tuned-np7l6"
Apr 16 20:11:50.098761 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096978 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9a0139ad-ee3c-4847-b5d9-76270b598854-etc-tuned\") pod \"tuned-np7l6\" (UID: \"9a0139ad-ee3c-4847-b5d9-76270b598854\") " pod="openshift-cluster-node-tuning-operator/tuned-np7l6"
Apr 16 20:11:50.098761 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.096978 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-etc-openvswitch\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47"
Apr 16 20:11:50.098761 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.097010 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8072a527-0c85-4b4a-a30d-ee0ca50bec0a-tmp-dir\") pod \"node-resolver-d277z\" (UID: \"8072a527-0c85-4b4a-a30d-ee0ca50bec0a\") " pod="openshift-dns/node-resolver-d277z"
Apr 16 20:11:50.098761 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.097036 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9bd52\" (UniqueName: \"kubernetes.io/projected/8072a527-0c85-4b4a-a30d-ee0ca50bec0a-kube-api-access-9bd52\") pod \"node-resolver-d277z\" (UID: \"8072a527-0c85-4b4a-a30d-ee0ca50bec0a\") " pod="openshift-dns/node-resolver-d277z"
Apr 16 20:11:50.098761 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.097064 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-cni-binary-copy\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5"
Apr 16 20:11:50.098761 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.097090 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lfdjq\" (UniqueName: \"kubernetes.io/projected/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-kube-api-access-lfdjq\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5"
Apr 16 20:11:50.098761 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.097114 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9a0139ad-ee3c-4847-b5d9-76270b598854-etc-sysctl-conf\") pod \"tuned-np7l6\" (UID: \"9a0139ad-ee3c-4847-b5d9-76270b598854\") " pod="openshift-cluster-node-tuning-operator/tuned-np7l6"
Apr 16 20:11:50.098761 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.097137 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-host-run-netns\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5"
Apr 16 20:11:50.098761 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.097162 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b897f3d6-5177-401c-abab-c0301641c018-agent-certs\") pod \"konnectivity-agent-fw6bf\" (UID: \"b897f3d6-5177-401c-abab-c0301641c018\") " pod="kube-system/konnectivity-agent-fw6bf"
Apr 16 20:11:50.099561 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.097187 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b897f3d6-5177-401c-abab-c0301641c018-konnectivity-ca\") pod \"konnectivity-agent-fw6bf\" (UID: \"b897f3d6-5177-401c-abab-c0301641c018\") " pod="kube-system/konnectivity-agent-fw6bf"
Apr 16 20:11:50.099561 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.097212 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-etc-kubernetes\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5"
Apr 16 20:11:50.099561 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.097235 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-host-slash\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47"
Apr 16 20:11:50.099561 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.097251 2572 operation_generator.go:615] "MountVolume.SetUp
succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/39673bf6-83cf-45c6-9476-3700a9d91e35-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-l75p2\" (UID: \"39673bf6-83cf-45c6-9476-3700a9d91e35\") " pod="openshift-multus/multus-additional-cni-plugins-l75p2" Apr 16 20:11:50.099561 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.097269 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7zjhq\" (UniqueName: \"kubernetes.io/projected/10232a70-9b5f-414a-8efd-b5cff05a4f12-kube-api-access-7zjhq\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" Apr 16 20:11:50.099561 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.097295 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-multus-conf-dir\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5" Apr 16 20:11:50.099561 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.097321 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-85b4z\" (UniqueName: \"kubernetes.io/projected/9a0139ad-ee3c-4847-b5d9-76270b598854-kube-api-access-85b4z\") pod \"tuned-np7l6\" (UID: \"9a0139ad-ee3c-4847-b5d9-76270b598854\") " pod="openshift-cluster-node-tuning-operator/tuned-np7l6" Apr 16 20:11:50.099561 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.097347 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c470127f-e9ca-44ba-bcef-cc2cd68cdcdc-host\") pod \"node-ca-4fgxn\" (UID: \"c470127f-e9ca-44ba-bcef-cc2cd68cdcdc\") " pod="openshift-image-registry/node-ca-4fgxn" Apr 16 20:11:50.099561 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.097370 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-host-var-lib-kubelet\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5" Apr 16 20:11:50.099561 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.097381 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-etc-kubernetes\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5" Apr 16 20:11:50.099561 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.097393 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-hostroot\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5" Apr 16 20:11:50.099561 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.097414 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-log-socket\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" Apr 16 20:11:50.099561 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.097434 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-host-run-netns\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5" Apr 16 20:11:50.099561 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.097435 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-frq7m\" (UniqueName: \"kubernetes.io/projected/a880ce58-1475-4673-ae27-ce861b50e3bd-kube-api-access-frq7m\") pod \"aws-ebs-csi-driver-node-9z9lw\" (UID: \"a880ce58-1475-4673-ae27-ce861b50e3bd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9z9lw" Apr 16 20:11:50.099561 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.097475 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8e680955-60f7-4aaf-9aeb-b5efc9759ed4-metrics-certs\") pod \"network-metrics-daemon-5bd8l\" (UID: \"8e680955-60f7-4aaf-9aeb-b5efc9759ed4\") " pod="openshift-multus/network-metrics-daemon-5bd8l" Apr 16 20:11:50.099561 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.097502 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9a0139ad-ee3c-4847-b5d9-76270b598854-etc-systemd\") pod \"tuned-np7l6\" (UID: \"9a0139ad-ee3c-4847-b5d9-76270b598854\") " pod="openshift-cluster-node-tuning-operator/tuned-np7l6" Apr 16 20:11:50.099561 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.097535 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8072a527-0c85-4b4a-a30d-ee0ca50bec0a-hosts-file\") pod \"node-resolver-d277z\" (UID: \"8072a527-0c85-4b4a-a30d-ee0ca50bec0a\") " pod="openshift-dns/node-resolver-d277z" Apr 16 20:11:50.099561 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.097561 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c470127f-e9ca-44ba-bcef-cc2cd68cdcdc-serviceca\") pod \"node-ca-4fgxn\" (UID: \"c470127f-e9ca-44ba-bcef-cc2cd68cdcdc\") " pod="openshift-image-registry/node-ca-4fgxn" Apr 16 20:11:50.100368 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.097593 2572 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5f806ba8-a961-4ea6-8cf9-11c7a0ef33eb-iptables-alerter-script\") pod \"iptables-alerter-rqszv\" (UID: \"5f806ba8-a961-4ea6-8cf9-11c7a0ef33eb\") " pod="openshift-network-operator/iptables-alerter-rqszv" Apr 16 20:11:50.100368 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.097619 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-94kw2\" (UniqueName: \"kubernetes.io/projected/5f806ba8-a961-4ea6-8cf9-11c7a0ef33eb-kube-api-access-94kw2\") pod \"iptables-alerter-rqszv\" (UID: \"5f806ba8-a961-4ea6-8cf9-11c7a0ef33eb\") " pod="openshift-network-operator/iptables-alerter-rqszv" Apr 16 20:11:50.100368 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.097647 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-host-slash\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" Apr 16 20:11:50.100368 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.097648 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/39673bf6-83cf-45c6-9476-3700a9d91e35-cni-binary-copy\") pod \"multus-additional-cni-plugins-l75p2\" (UID: \"39673bf6-83cf-45c6-9476-3700a9d91e35\") " pod="openshift-multus/multus-additional-cni-plugins-l75p2" Apr 16 20:11:50.100368 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.097674 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8072a527-0c85-4b4a-a30d-ee0ca50bec0a-tmp-dir\") pod \"node-resolver-d277z\" (UID: \"8072a527-0c85-4b4a-a30d-ee0ca50bec0a\") " pod="openshift-dns/node-resolver-d277z" Apr 16 20:11:50.100368 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.097691 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-host-run-ovn-kubernetes\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" Apr 16 20:11:50.100368 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.097733 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9a0139ad-ee3c-4847-b5d9-76270b598854-etc-modprobe-d\") pod \"tuned-np7l6\" (UID: \"9a0139ad-ee3c-4847-b5d9-76270b598854\") " pod="openshift-cluster-node-tuning-operator/tuned-np7l6" Apr 16 20:11:50.100368 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.097758 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9a0139ad-ee3c-4847-b5d9-76270b598854-lib-modules\") pod \"tuned-np7l6\" (UID: \"9a0139ad-ee3c-4847-b5d9-76270b598854\") " pod="openshift-cluster-node-tuning-operator/tuned-np7l6" Apr 16 20:11:50.100368 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.097785 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a0139ad-ee3c-4847-b5d9-76270b598854-var-lib-kubelet\") pod \"tuned-np7l6\" (UID: \"9a0139ad-ee3c-4847-b5d9-76270b598854\") " pod="openshift-cluster-node-tuning-operator/tuned-np7l6" Apr 16 20:11:50.100368 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.097809 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/39673bf6-83cf-45c6-9476-3700a9d91e35-tuning-conf-dir\") pod \"multus-additional-cni-plugins-l75p2\" (UID: \"39673bf6-83cf-45c6-9476-3700a9d91e35\") " pod="openshift-multus/multus-additional-cni-plugins-l75p2" Apr 16 20:11:50.100368 
ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.097837 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-multus-cni-dir\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5" Apr 16 20:11:50.100368 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.097864 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-host-var-lib-cni-bin\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5" Apr 16 20:11:50.100368 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.097894 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9a0139ad-ee3c-4847-b5d9-76270b598854-etc-sysctl-conf\") pod \"tuned-np7l6\" (UID: \"9a0139ad-ee3c-4847-b5d9-76270b598854\") " pod="openshift-cluster-node-tuning-operator/tuned-np7l6" Apr 16 20:11:50.100368 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.097908 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/10232a70-9b5f-414a-8efd-b5cff05a4f12-ovnkube-config\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" Apr 16 20:11:50.100368 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.098153 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-multus-conf-dir\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5" Apr 16 20:11:50.100368 ip-10-0-138-62 kubenswrapper[2572]: I0416 
20:11:50.098228 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b897f3d6-5177-401c-abab-c0301641c018-konnectivity-ca\") pod \"konnectivity-agent-fw6bf\" (UID: \"b897f3d6-5177-401c-abab-c0301641c018\") " pod="kube-system/konnectivity-agent-fw6bf" Apr 16 20:11:50.100368 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.097322 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/39673bf6-83cf-45c6-9476-3700a9d91e35-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-l75p2\" (UID: \"39673bf6-83cf-45c6-9476-3700a9d91e35\") " pod="openshift-multus/multus-additional-cni-plugins-l75p2" Apr 16 20:11:50.100368 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:50.098323 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:50.100926 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.098368 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9a0139ad-ee3c-4847-b5d9-76270b598854-etc-systemd\") pod \"tuned-np7l6\" (UID: \"9a0139ad-ee3c-4847-b5d9-76270b598854\") " pod="openshift-cluster-node-tuning-operator/tuned-np7l6" Apr 16 20:11:50.100926 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:50.098412 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e680955-60f7-4aaf-9aeb-b5efc9759ed4-metrics-certs podName:8e680955-60f7-4aaf-9aeb-b5efc9759ed4 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:50.598390789 +0000 UTC m=+2.035379300 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8e680955-60f7-4aaf-9aeb-b5efc9759ed4-metrics-certs") pod "network-metrics-daemon-5bd8l" (UID: "8e680955-60f7-4aaf-9aeb-b5efc9759ed4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:50.100926 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.098416 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8072a527-0c85-4b4a-a30d-ee0ca50bec0a-hosts-file\") pod \"node-resolver-d277z\" (UID: \"8072a527-0c85-4b4a-a30d-ee0ca50bec0a\") " pod="openshift-dns/node-resolver-d277z" Apr 16 20:11:50.100926 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.098483 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-cni-binary-copy\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5" Apr 16 20:11:50.100926 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.098598 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c470127f-e9ca-44ba-bcef-cc2cd68cdcdc-host\") pod \"node-ca-4fgxn\" (UID: \"c470127f-e9ca-44ba-bcef-cc2cd68cdcdc\") " pod="openshift-image-registry/node-ca-4fgxn" Apr 16 20:11:50.100926 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.098633 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/39673bf6-83cf-45c6-9476-3700a9d91e35-cni-binary-copy\") pod \"multus-additional-cni-plugins-l75p2\" (UID: \"39673bf6-83cf-45c6-9476-3700a9d91e35\") " pod="openshift-multus/multus-additional-cni-plugins-l75p2" Apr 16 20:11:50.100926 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.098643 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-host-var-lib-kubelet\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5" Apr 16 20:11:50.100926 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.098682 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-hostroot\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5" Apr 16 20:11:50.100926 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.098719 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-log-socket\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" Apr 16 20:11:50.100926 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.098754 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-host-run-ovn-kubernetes\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" Apr 16 20:11:50.100926 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.098841 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c470127f-e9ca-44ba-bcef-cc2cd68cdcdc-serviceca\") pod \"node-ca-4fgxn\" (UID: \"c470127f-e9ca-44ba-bcef-cc2cd68cdcdc\") " pod="openshift-image-registry/node-ca-4fgxn" Apr 16 20:11:50.100926 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.098994 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/10232a70-9b5f-414a-8efd-b5cff05a4f12-ovnkube-config\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" Apr 16 20:11:50.100926 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.099040 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/10232a70-9b5f-414a-8efd-b5cff05a4f12-ovnkube-script-lib\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" Apr 16 20:11:50.100926 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.099069 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5f806ba8-a961-4ea6-8cf9-11c7a0ef33eb-host-slash\") pod \"iptables-alerter-rqszv\" (UID: \"5f806ba8-a961-4ea6-8cf9-11c7a0ef33eb\") " pod="openshift-network-operator/iptables-alerter-rqszv" Apr 16 20:11:50.100926 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.099248 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/39673bf6-83cf-45c6-9476-3700a9d91e35-tuning-conf-dir\") pod \"multus-additional-cni-plugins-l75p2\" (UID: \"39673bf6-83cf-45c6-9476-3700a9d91e35\") " pod="openshift-multus/multus-additional-cni-plugins-l75p2" Apr 16 20:11:50.100926 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.099334 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" Apr 16 20:11:50.100926 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.099340 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5f806ba8-a961-4ea6-8cf9-11c7a0ef33eb-iptables-alerter-script\") pod \"iptables-alerter-rqszv\" (UID: \"5f806ba8-a961-4ea6-8cf9-11c7a0ef33eb\") " pod="openshift-network-operator/iptables-alerter-rqszv" Apr 16 20:11:50.101455 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.099349 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9a0139ad-ee3c-4847-b5d9-76270b598854-etc-modprobe-d\") pod \"tuned-np7l6\" (UID: \"9a0139ad-ee3c-4847-b5d9-76270b598854\") " pod="openshift-cluster-node-tuning-operator/tuned-np7l6" Apr 16 20:11:50.101455 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.099392 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9a0139ad-ee3c-4847-b5d9-76270b598854-run\") pod \"tuned-np7l6\" (UID: \"9a0139ad-ee3c-4847-b5d9-76270b598854\") " pod="openshift-cluster-node-tuning-operator/tuned-np7l6" Apr 16 20:11:50.101455 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.099429 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5f806ba8-a961-4ea6-8cf9-11c7a0ef33eb-host-slash\") pod \"iptables-alerter-rqszv\" (UID: \"5f806ba8-a961-4ea6-8cf9-11c7a0ef33eb\") " pod="openshift-network-operator/iptables-alerter-rqszv" Apr 16 20:11:50.101455 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.099440 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a0139ad-ee3c-4847-b5d9-76270b598854-var-lib-kubelet\") pod \"tuned-np7l6\" (UID: \"9a0139ad-ee3c-4847-b5d9-76270b598854\") " pod="openshift-cluster-node-tuning-operator/tuned-np7l6" Apr 16 20:11:50.101455 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.099468 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-host-var-lib-cni-bin\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5" Apr 16 20:11:50.101455 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.099508 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-multus-cni-dir\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5" Apr 16 20:11:50.101455 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.099526 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/10232a70-9b5f-414a-8efd-b5cff05a4f12-ovnkube-script-lib\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" Apr 16 20:11:50.101455 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.099575 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/39673bf6-83cf-45c6-9476-3700a9d91e35-os-release\") pod \"multus-additional-cni-plugins-l75p2\" (UID: \"39673bf6-83cf-45c6-9476-3700a9d91e35\") " pod="openshift-multus/multus-additional-cni-plugins-l75p2" Apr 16 20:11:50.101455 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.099600 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-cnibin\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5" Apr 16 20:11:50.101455 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.099608 2572 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9a0139ad-ee3c-4847-b5d9-76270b598854-lib-modules\") pod \"tuned-np7l6\" (UID: \"9a0139ad-ee3c-4847-b5d9-76270b598854\") " pod="openshift-cluster-node-tuning-operator/tuned-np7l6" Apr 16 20:11:50.101455 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.099634 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-run-ovn\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" Apr 16 20:11:50.101455 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.099658 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-multus-socket-dir-parent\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5" Apr 16 20:11:50.101455 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.099668 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-cnibin\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5" Apr 16 20:11:50.101455 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.099695 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-multus-daemon-config\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5" Apr 16 20:11:50.101455 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.099702 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-run-ovn\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" Apr 16 20:11:50.101455 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.099659 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/39673bf6-83cf-45c6-9476-3700a9d91e35-os-release\") pod \"multus-additional-cni-plugins-l75p2\" (UID: \"39673bf6-83cf-45c6-9476-3700a9d91e35\") " pod="openshift-multus/multus-additional-cni-plugins-l75p2" Apr 16 20:11:50.101455 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.099719 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-run-openvswitch\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" Apr 16 20:11:50.101455 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.099735 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-multus-socket-dir-parent\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5" Apr 16 20:11:50.102197 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.099749 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a880ce58-1475-4673-ae27-ce861b50e3bd-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9z9lw\" (UID: \"a880ce58-1475-4673-ae27-ce861b50e3bd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9z9lw" Apr 16 20:11:50.102197 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.099812 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a880ce58-1475-4673-ae27-ce861b50e3bd-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9z9lw\" (UID: \"a880ce58-1475-4673-ae27-ce861b50e3bd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9z9lw" Apr 16 20:11:50.102197 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.099751 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/10232a70-9b5f-414a-8efd-b5cff05a4f12-run-openvswitch\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" Apr 16 20:11:50.102197 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.100013 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9a0139ad-ee3c-4847-b5d9-76270b598854-tmp\") pod \"tuned-np7l6\" (UID: \"9a0139ad-ee3c-4847-b5d9-76270b598854\") " pod="openshift-cluster-node-tuning-operator/tuned-np7l6" Apr 16 20:11:50.102197 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.100040 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9a0139ad-ee3c-4847-b5d9-76270b598854-etc-tuned\") pod \"tuned-np7l6\" (UID: \"9a0139ad-ee3c-4847-b5d9-76270b598854\") " pod="openshift-cluster-node-tuning-operator/tuned-np7l6" Apr 16 20:11:50.102197 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.100225 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-multus-daemon-config\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5" Apr 16 20:11:50.102197 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.101558 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/10232a70-9b5f-414a-8efd-b5cff05a4f12-ovn-node-metrics-cert\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" Apr 16 20:11:50.102197 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.101705 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b897f3d6-5177-401c-abab-c0301641c018-agent-certs\") pod \"konnectivity-agent-fw6bf\" (UID: \"b897f3d6-5177-401c-abab-c0301641c018\") " pod="kube-system/konnectivity-agent-fw6bf" Apr 16 20:11:50.103617 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:50.103597 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:11:50.103711 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:50.103619 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:11:50.103711 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:50.103632 2572 projected.go:194] Error preparing data for projected volume kube-api-access-sp5lv for pod openshift-network-diagnostics/network-check-target-86kzx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:50.103711 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:50.103703 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9-kube-api-access-sp5lv podName:1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:50.60368677 +0000 UTC m=+2.040675282 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-sp5lv" (UniqueName: "kubernetes.io/projected/1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9-kube-api-access-sp5lv") pod "network-check-target-86kzx" (UID: "1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:50.106158 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.106133 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rwb4\" (UniqueName: \"kubernetes.io/projected/c470127f-e9ca-44ba-bcef-cc2cd68cdcdc-kube-api-access-9rwb4\") pod \"node-ca-4fgxn\" (UID: \"c470127f-e9ca-44ba-bcef-cc2cd68cdcdc\") " pod="openshift-image-registry/node-ca-4fgxn" Apr 16 20:11:50.106791 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.106763 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-frq7m\" (UniqueName: \"kubernetes.io/projected/a880ce58-1475-4673-ae27-ce861b50e3bd-kube-api-access-frq7m\") pod \"aws-ebs-csi-driver-node-9z9lw\" (UID: \"a880ce58-1475-4673-ae27-ce861b50e3bd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9z9lw" Apr 16 20:11:50.107171 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.107146 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bd52\" (UniqueName: \"kubernetes.io/projected/8072a527-0c85-4b4a-a30d-ee0ca50bec0a-kube-api-access-9bd52\") pod \"node-resolver-d277z\" (UID: \"8072a527-0c85-4b4a-a30d-ee0ca50bec0a\") " pod="openshift-dns/node-resolver-d277z" Apr 16 20:11:50.107542 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.107521 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zjhq\" (UniqueName: \"kubernetes.io/projected/10232a70-9b5f-414a-8efd-b5cff05a4f12-kube-api-access-7zjhq\") pod \"ovnkube-node-9gf47\" (UID: \"10232a70-9b5f-414a-8efd-b5cff05a4f12\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" Apr 16 20:11:50.108797 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.108760 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-85hdm\" (UniqueName: \"kubernetes.io/projected/39673bf6-83cf-45c6-9476-3700a9d91e35-kube-api-access-85hdm\") pod \"multus-additional-cni-plugins-l75p2\" (UID: \"39673bf6-83cf-45c6-9476-3700a9d91e35\") " pod="openshift-multus/multus-additional-cni-plugins-l75p2" Apr 16 20:11:50.108901 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.108764 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-85b4z\" (UniqueName: \"kubernetes.io/projected/9a0139ad-ee3c-4847-b5d9-76270b598854-kube-api-access-85b4z\") pod \"tuned-np7l6\" (UID: \"9a0139ad-ee3c-4847-b5d9-76270b598854\") " pod="openshift-cluster-node-tuning-operator/tuned-np7l6" Apr 16 20:11:50.108999 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.108981 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-94kw2\" (UniqueName: \"kubernetes.io/projected/5f806ba8-a961-4ea6-8cf9-11c7a0ef33eb-kube-api-access-94kw2\") pod \"iptables-alerter-rqszv\" (UID: \"5f806ba8-a961-4ea6-8cf9-11c7a0ef33eb\") " pod="openshift-network-operator/iptables-alerter-rqszv" Apr 16 20:11:50.109108 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.109088 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfdjq\" (UniqueName: \"kubernetes.io/projected/b5c961b8-9f06-4e0b-9e96-8bbef36b8380-kube-api-access-lfdjq\") pod \"multus-lc4v5\" (UID: \"b5c961b8-9f06-4e0b-9e96-8bbef36b8380\") " pod="openshift-multus/multus-lc4v5" Apr 16 20:11:50.110045 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.110023 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-96p42\" (UniqueName: \"kubernetes.io/projected/8e680955-60f7-4aaf-9aeb-b5efc9759ed4-kube-api-access-96p42\") pod 
\"network-metrics-daemon-5bd8l\" (UID: \"8e680955-60f7-4aaf-9aeb-b5efc9759ed4\") " pod="openshift-multus/network-metrics-daemon-5bd8l" Apr 16 20:11:50.121048 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:50.121020 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30cfa87f4eb69b9341194428842c4154.slice/crio-8d15d53f2612cc3b12177d9935df2d46a2c322f11c833a58d99437fed1361106 WatchSource:0}: Error finding container 8d15d53f2612cc3b12177d9935df2d46a2c322f11c833a58d99437fed1361106: Status 404 returned error can't find the container with id 8d15d53f2612cc3b12177d9935df2d46a2c322f11c833a58d99437fed1361106 Apr 16 20:11:50.121834 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:50.121817 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod025d9da832e6f15c9967affc15b6b9e5.slice/crio-8fa0d0cb853e95dd1fbdd082ee5fef76a465b590007b8cf30696259fe6f835fb WatchSource:0}: Error finding container 8fa0d0cb853e95dd1fbdd082ee5fef76a465b590007b8cf30696259fe6f835fb: Status 404 returned error can't find the container with id 8fa0d0cb853e95dd1fbdd082ee5fef76a465b590007b8cf30696259fe6f835fb Apr 16 20:11:50.125356 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.125338 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:11:50.146541 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.146508 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-62.ec2.internal" event={"ID":"025d9da832e6f15c9967affc15b6b9e5","Type":"ContainerStarted","Data":"8fa0d0cb853e95dd1fbdd082ee5fef76a465b590007b8cf30696259fe6f835fb"} Apr 16 20:11:50.147496 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.147479 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-62.ec2.internal" 
event={"ID":"30cfa87f4eb69b9341194428842c4154","Type":"ContainerStarted","Data":"8d15d53f2612cc3b12177d9935df2d46a2c322f11c833a58d99437fed1361106"} Apr 16 20:11:50.295612 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.295588 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4fgxn" Apr 16 20:11:50.301586 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:50.301565 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc470127f_e9ca_44ba_bcef_cc2cd68cdcdc.slice/crio-606a7d29cf8f1cdf8a0f957adcd41caa857ff4f80f1c0e61e9f1ac3fe8d39bf6 WatchSource:0}: Error finding container 606a7d29cf8f1cdf8a0f957adcd41caa857ff4f80f1c0e61e9f1ac3fe8d39bf6: Status 404 returned error can't find the container with id 606a7d29cf8f1cdf8a0f957adcd41caa857ff4f80f1c0e61e9f1ac3fe8d39bf6 Apr 16 20:11:50.311694 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.311670 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-rqszv" Apr 16 20:11:50.317391 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:50.317365 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f806ba8_a961_4ea6_8cf9_11c7a0ef33eb.slice/crio-261da3a336cde2205dd6cfc458db981f2027a904a2d2b0458108b8c467250c1b WatchSource:0}: Error finding container 261da3a336cde2205dd6cfc458db981f2027a904a2d2b0458108b8c467250c1b: Status 404 returned error can't find the container with id 261da3a336cde2205dd6cfc458db981f2027a904a2d2b0458108b8c467250c1b Apr 16 20:11:50.320191 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.320178 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-fw6bf" Apr 16 20:11:50.325183 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:50.325164 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb897f3d6_5177_401c_abab_c0301641c018.slice/crio-261fed70ca31477c8235d8eddc1474bd5298eb1985650ecc1ab1c8e01baaee25 WatchSource:0}: Error finding container 261fed70ca31477c8235d8eddc1474bd5298eb1985650ecc1ab1c8e01baaee25: Status 404 returned error can't find the container with id 261fed70ca31477c8235d8eddc1474bd5298eb1985650ecc1ab1c8e01baaee25 Apr 16 20:11:50.334953 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.334939 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9z9lw" Apr 16 20:11:50.341433 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:50.341415 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda880ce58_1475_4673_ae27_ce861b50e3bd.slice/crio-acef1badde73270cbaed688233aff43a50c4709a6efbb91938c99c223081efe5 WatchSource:0}: Error finding container acef1badde73270cbaed688233aff43a50c4709a6efbb91938c99c223081efe5: Status 404 returned error can't find the container with id acef1badde73270cbaed688233aff43a50c4709a6efbb91938c99c223081efe5 Apr 16 20:11:50.348239 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.348224 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-l75p2" Apr 16 20:11:50.362339 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.362316 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-lc4v5" Apr 16 20:11:50.368164 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:50.368148 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5c961b8_9f06_4e0b_9e96_8bbef36b8380.slice/crio-4a12fc4df309558c2ea434e3fa20b58088ba9935a3ee84b174abf56a3798a3a4 WatchSource:0}: Error finding container 4a12fc4df309558c2ea434e3fa20b58088ba9935a3ee84b174abf56a3798a3a4: Status 404 returned error can't find the container with id 4a12fc4df309558c2ea434e3fa20b58088ba9935a3ee84b174abf56a3798a3a4 Apr 16 20:11:50.377261 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.377238 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" Apr 16 20:11:50.382241 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:50.382214 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10232a70_9b5f_414a_8efd_b5cff05a4f12.slice/crio-3983bc85e9b7562dfc00cb8ffd46a2fce962634ce3fda595a7cfd3100db2b5a0 WatchSource:0}: Error finding container 3983bc85e9b7562dfc00cb8ffd46a2fce962634ce3fda595a7cfd3100db2b5a0: Status 404 returned error can't find the container with id 3983bc85e9b7562dfc00cb8ffd46a2fce962634ce3fda595a7cfd3100db2b5a0 Apr 16 20:11:50.394540 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.394521 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-np7l6" Apr 16 20:11:50.399371 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.399352 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-d277z" Apr 16 20:11:50.399682 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:50.399653 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a0139ad_ee3c_4847_b5d9_76270b598854.slice/crio-bad6115e39e6ac3d1375244b182a689c69c1cfb700e3acab02cf4da8897f96e5 WatchSource:0}: Error finding container bad6115e39e6ac3d1375244b182a689c69c1cfb700e3acab02cf4da8897f96e5: Status 404 returned error can't find the container with id bad6115e39e6ac3d1375244b182a689c69c1cfb700e3acab02cf4da8897f96e5 Apr 16 20:11:50.404936 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:11:50.404918 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8072a527_0c85_4b4a_a30d_ee0ca50bec0a.slice/crio-a10046b66e9522437d9c8fd031191d158a096fbaa495aaaad3ccd778c62e7de1 WatchSource:0}: Error finding container a10046b66e9522437d9c8fd031191d158a096fbaa495aaaad3ccd778c62e7de1: Status 404 returned error can't find the container with id a10046b66e9522437d9c8fd031191d158a096fbaa495aaaad3ccd778c62e7de1 Apr 16 20:11:50.605597 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.604916 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sp5lv\" (UniqueName: \"kubernetes.io/projected/1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9-kube-api-access-sp5lv\") pod \"network-check-target-86kzx\" (UID: \"1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9\") " pod="openshift-network-diagnostics/network-check-target-86kzx" Apr 16 20:11:50.605597 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.605034 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8e680955-60f7-4aaf-9aeb-b5efc9759ed4-metrics-certs\") pod \"network-metrics-daemon-5bd8l\" (UID: \"8e680955-60f7-4aaf-9aeb-b5efc9759ed4\") " 
pod="openshift-multus/network-metrics-daemon-5bd8l" Apr 16 20:11:50.605597 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:50.605129 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:11:50.605597 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:50.605151 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:50.605597 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:50.605213 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e680955-60f7-4aaf-9aeb-b5efc9759ed4-metrics-certs podName:8e680955-60f7-4aaf-9aeb-b5efc9759ed4 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:51.605194187 +0000 UTC m=+3.042182688 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8e680955-60f7-4aaf-9aeb-b5efc9759ed4-metrics-certs") pod "network-metrics-daemon-5bd8l" (UID: "8e680955-60f7-4aaf-9aeb-b5efc9759ed4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:50.605597 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:50.605155 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:11:50.605597 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:50.605243 2572 projected.go:194] Error preparing data for projected volume kube-api-access-sp5lv for pod openshift-network-diagnostics/network-check-target-86kzx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:50.605597 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:50.605296 2572 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9-kube-api-access-sp5lv podName:1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:51.605278842 +0000 UTC m=+3.042267350 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-sp5lv" (UniqueName: "kubernetes.io/projected/1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9-kube-api-access-sp5lv") pod "network-check-target-86kzx" (UID: "1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:50.820154 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:50.820121 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:11:51.039671 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:51.039578 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 20:06:50 +0000 UTC" deadline="2027-12-12 07:29:28.765247622 +0000 UTC" Apr 16 20:11:51.039671 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:51.039612 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14507h17m37.725638893s" Apr 16 20:11:51.116023 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:51.115794 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:11:51.146641 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:51.146612 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-86kzx" Apr 16 20:11:51.146792 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:51.146732 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-86kzx" podUID="1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9" Apr 16 20:11:51.174992 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:51.174958 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-np7l6" event={"ID":"9a0139ad-ee3c-4847-b5d9-76270b598854","Type":"ContainerStarted","Data":"bad6115e39e6ac3d1375244b182a689c69c1cfb700e3acab02cf4da8897f96e5"} Apr 16 20:11:51.181548 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:51.181524 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:11:51.198354 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:51.198303 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lc4v5" event={"ID":"b5c961b8-9f06-4e0b-9e96-8bbef36b8380","Type":"ContainerStarted","Data":"4a12fc4df309558c2ea434e3fa20b58088ba9935a3ee84b174abf56a3798a3a4"} Apr 16 20:11:51.200037 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:51.200010 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l75p2" event={"ID":"39673bf6-83cf-45c6-9476-3700a9d91e35","Type":"ContainerStarted","Data":"9502148f8012fe14ccfdac456bd009263e947246fb51acc037f63e021fd7f44f"} Apr 16 20:11:51.208807 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:51.208782 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9z9lw" 
event={"ID":"a880ce58-1475-4673-ae27-ce861b50e3bd","Type":"ContainerStarted","Data":"acef1badde73270cbaed688233aff43a50c4709a6efbb91938c99c223081efe5"} Apr 16 20:11:51.226844 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:51.226819 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-fw6bf" event={"ID":"b897f3d6-5177-401c-abab-c0301641c018","Type":"ContainerStarted","Data":"261fed70ca31477c8235d8eddc1474bd5298eb1985650ecc1ab1c8e01baaee25"} Apr 16 20:11:51.235746 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:51.235688 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rqszv" event={"ID":"5f806ba8-a961-4ea6-8cf9-11c7a0ef33eb","Type":"ContainerStarted","Data":"261da3a336cde2205dd6cfc458db981f2027a904a2d2b0458108b8c467250c1b"} Apr 16 20:11:51.263606 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:51.263534 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-d277z" event={"ID":"8072a527-0c85-4b4a-a30d-ee0ca50bec0a","Type":"ContainerStarted","Data":"a10046b66e9522437d9c8fd031191d158a096fbaa495aaaad3ccd778c62e7de1"} Apr 16 20:11:51.287652 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:51.287594 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" event={"ID":"10232a70-9b5f-414a-8efd-b5cff05a4f12","Type":"ContainerStarted","Data":"3983bc85e9b7562dfc00cb8ffd46a2fce962634ce3fda595a7cfd3100db2b5a0"} Apr 16 20:11:51.297285 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:51.297222 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4fgxn" event={"ID":"c470127f-e9ca-44ba-bcef-cc2cd68cdcdc","Type":"ContainerStarted","Data":"606a7d29cf8f1cdf8a0f957adcd41caa857ff4f80f1c0e61e9f1ac3fe8d39bf6"} Apr 16 20:11:51.615893 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:51.615042 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-sp5lv\" (UniqueName: \"kubernetes.io/projected/1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9-kube-api-access-sp5lv\") pod \"network-check-target-86kzx\" (UID: \"1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9\") " pod="openshift-network-diagnostics/network-check-target-86kzx" Apr 16 20:11:51.615893 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:51.615112 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8e680955-60f7-4aaf-9aeb-b5efc9759ed4-metrics-certs\") pod \"network-metrics-daemon-5bd8l\" (UID: \"8e680955-60f7-4aaf-9aeb-b5efc9759ed4\") " pod="openshift-multus/network-metrics-daemon-5bd8l" Apr 16 20:11:51.615893 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:51.615238 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:51.615893 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:51.615298 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e680955-60f7-4aaf-9aeb-b5efc9759ed4-metrics-certs podName:8e680955-60f7-4aaf-9aeb-b5efc9759ed4 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:53.61527946 +0000 UTC m=+5.052267952 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8e680955-60f7-4aaf-9aeb-b5efc9759ed4-metrics-certs") pod "network-metrics-daemon-5bd8l" (UID: "8e680955-60f7-4aaf-9aeb-b5efc9759ed4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:51.615893 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:51.615705 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:11:51.615893 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:51.615726 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:11:51.615893 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:51.615740 2572 projected.go:194] Error preparing data for projected volume kube-api-access-sp5lv for pod openshift-network-diagnostics/network-check-target-86kzx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:51.615893 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:51.615788 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9-kube-api-access-sp5lv podName:1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:53.615772067 +0000 UTC m=+5.052760560 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-sp5lv" (UniqueName: "kubernetes.io/projected/1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9-kube-api-access-sp5lv") pod "network-check-target-86kzx" (UID: "1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:52.040477 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:52.040407 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 20:06:50 +0000 UTC" deadline="2028-01-17 05:58:32.908249276 +0000 UTC" Apr 16 20:11:52.040477 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:52.040444 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15369h46m40.867809201s" Apr 16 20:11:52.144557 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:52.143995 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bd8l" Apr 16 20:11:52.144557 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:52.144149 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bd8l" podUID="8e680955-60f7-4aaf-9aeb-b5efc9759ed4" Apr 16 20:11:53.146828 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:53.146797 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-86kzx" Apr 16 20:11:53.147295 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:53.146936 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-86kzx" podUID="1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9" Apr 16 20:11:53.630736 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:53.629927 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sp5lv\" (UniqueName: \"kubernetes.io/projected/1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9-kube-api-access-sp5lv\") pod \"network-check-target-86kzx\" (UID: \"1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9\") " pod="openshift-network-diagnostics/network-check-target-86kzx" Apr 16 20:11:53.630736 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:53.629996 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8e680955-60f7-4aaf-9aeb-b5efc9759ed4-metrics-certs\") pod \"network-metrics-daemon-5bd8l\" (UID: \"8e680955-60f7-4aaf-9aeb-b5efc9759ed4\") " pod="openshift-multus/network-metrics-daemon-5bd8l" Apr 16 20:11:53.630736 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:53.630137 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:53.630736 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:53.630202 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e680955-60f7-4aaf-9aeb-b5efc9759ed4-metrics-certs podName:8e680955-60f7-4aaf-9aeb-b5efc9759ed4 nodeName:}" failed. 
No retries permitted until 2026-04-16 20:11:57.630178347 +0000 UTC m=+9.067166839 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8e680955-60f7-4aaf-9aeb-b5efc9759ed4-metrics-certs") pod "network-metrics-daemon-5bd8l" (UID: "8e680955-60f7-4aaf-9aeb-b5efc9759ed4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:53.630736 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:53.630291 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:11:53.630736 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:53.630340 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:11:53.630736 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:53.630353 2572 projected.go:194] Error preparing data for projected volume kube-api-access-sp5lv for pod openshift-network-diagnostics/network-check-target-86kzx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:53.630736 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:53.630397 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9-kube-api-access-sp5lv podName:1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:57.63038414 +0000 UTC m=+9.067372648 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-sp5lv" (UniqueName: "kubernetes.io/projected/1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9-kube-api-access-sp5lv") pod "network-check-target-86kzx" (UID: "1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:54.145239 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:54.144925 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bd8l" Apr 16 20:11:54.145239 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:54.145103 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bd8l" podUID="8e680955-60f7-4aaf-9aeb-b5efc9759ed4" Apr 16 20:11:55.147883 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:55.146994 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-86kzx" Apr 16 20:11:55.148980 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:55.148490 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-86kzx" podUID="1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9" Apr 16 20:11:56.144159 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:56.144115 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bd8l" Apr 16 20:11:56.144327 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:56.144247 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bd8l" podUID="8e680955-60f7-4aaf-9aeb-b5efc9759ed4" Apr 16 20:11:57.144849 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:57.144814 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-86kzx" Apr 16 20:11:57.145314 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:57.144962 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-86kzx" podUID="1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9" Apr 16 20:11:57.664583 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:57.664247 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sp5lv\" (UniqueName: \"kubernetes.io/projected/1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9-kube-api-access-sp5lv\") pod \"network-check-target-86kzx\" (UID: \"1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9\") " pod="openshift-network-diagnostics/network-check-target-86kzx" Apr 16 20:11:57.664583 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:57.664333 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8e680955-60f7-4aaf-9aeb-b5efc9759ed4-metrics-certs\") pod \"network-metrics-daemon-5bd8l\" (UID: \"8e680955-60f7-4aaf-9aeb-b5efc9759ed4\") " pod="openshift-multus/network-metrics-daemon-5bd8l" Apr 16 20:11:57.664583 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:57.664410 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:11:57.664583 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:57.664439 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:11:57.664583 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:57.664462 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:57.664583 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:57.664467 2572 projected.go:194] Error preparing data for projected volume kube-api-access-sp5lv for pod openshift-network-diagnostics/network-check-target-86kzx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:57.664583 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:57.664534 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e680955-60f7-4aaf-9aeb-b5efc9759ed4-metrics-certs podName:8e680955-60f7-4aaf-9aeb-b5efc9759ed4 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:05.664514148 +0000 UTC m=+17.101502641 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8e680955-60f7-4aaf-9aeb-b5efc9759ed4-metrics-certs") pod "network-metrics-daemon-5bd8l" (UID: "8e680955-60f7-4aaf-9aeb-b5efc9759ed4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:57.664583 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:57.664553 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9-kube-api-access-sp5lv podName:1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:05.664544897 +0000 UTC m=+17.101533387 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-sp5lv" (UniqueName: "kubernetes.io/projected/1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9-kube-api-access-sp5lv") pod "network-check-target-86kzx" (UID: "1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:58.144610 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:58.144574 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bd8l" Apr 16 20:11:58.144848 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:58.144722 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bd8l" podUID="8e680955-60f7-4aaf-9aeb-b5efc9759ed4" Apr 16 20:11:59.145242 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:11:59.145214 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-86kzx" Apr 16 20:11:59.145687 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:11:59.145313 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-86kzx" podUID="1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9" Apr 16 20:12:00.144351 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:00.144316 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bd8l" Apr 16 20:12:00.144528 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:00.144453 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5bd8l" podUID="8e680955-60f7-4aaf-9aeb-b5efc9759ed4" Apr 16 20:12:01.145007 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:01.144976 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-86kzx" Apr 16 20:12:01.145468 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:01.145070 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-86kzx" podUID="1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9" Apr 16 20:12:02.144529 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:02.144498 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bd8l" Apr 16 20:12:02.144688 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:02.144621 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bd8l" podUID="8e680955-60f7-4aaf-9aeb-b5efc9759ed4" Apr 16 20:12:03.145108 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:03.145066 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-86kzx" Apr 16 20:12:03.145553 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:03.145204 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-86kzx" podUID="1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9" Apr 16 20:12:04.143961 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:04.143931 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bd8l" Apr 16 20:12:04.144122 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:04.144040 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bd8l" podUID="8e680955-60f7-4aaf-9aeb-b5efc9759ed4" Apr 16 20:12:05.145093 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:05.145053 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-86kzx" Apr 16 20:12:05.145502 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:05.145195 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-86kzx" podUID="1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9" Apr 16 20:12:05.721528 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:05.721498 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8e680955-60f7-4aaf-9aeb-b5efc9759ed4-metrics-certs\") pod \"network-metrics-daemon-5bd8l\" (UID: \"8e680955-60f7-4aaf-9aeb-b5efc9759ed4\") " pod="openshift-multus/network-metrics-daemon-5bd8l" Apr 16 20:12:05.721727 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:05.721556 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sp5lv\" (UniqueName: \"kubernetes.io/projected/1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9-kube-api-access-sp5lv\") pod \"network-check-target-86kzx\" (UID: \"1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9\") " pod="openshift-network-diagnostics/network-check-target-86kzx" Apr 16 20:12:05.721727 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:05.721669 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:12:05.721851 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:05.721744 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e680955-60f7-4aaf-9aeb-b5efc9759ed4-metrics-certs podName:8e680955-60f7-4aaf-9aeb-b5efc9759ed4 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:21.721722101 +0000 UTC m=+33.158710590 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8e680955-60f7-4aaf-9aeb-b5efc9759ed4-metrics-certs") pod "network-metrics-daemon-5bd8l" (UID: "8e680955-60f7-4aaf-9aeb-b5efc9759ed4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:12:05.721851 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:05.721680 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:12:05.721851 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:05.721800 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:12:05.721851 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:05.721817 2572 projected.go:194] Error preparing data for projected volume kube-api-access-sp5lv for pod openshift-network-diagnostics/network-check-target-86kzx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:12:05.722058 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:05.721867 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9-kube-api-access-sp5lv podName:1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:21.721851917 +0000 UTC m=+33.158840414 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-sp5lv" (UniqueName: "kubernetes.io/projected/1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9-kube-api-access-sp5lv") pod "network-check-target-86kzx" (UID: "1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:12:06.144259 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:06.144226 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bd8l" Apr 16 20:12:06.144493 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:06.144333 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bd8l" podUID="8e680955-60f7-4aaf-9aeb-b5efc9759ed4" Apr 16 20:12:07.144721 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:07.144684 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-86kzx" Apr 16 20:12:07.145191 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:07.144815 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-86kzx" podUID="1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9" Apr 16 20:12:08.144337 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:08.144272 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bd8l" Apr 16 20:12:08.144458 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:08.144366 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bd8l" podUID="8e680955-60f7-4aaf-9aeb-b5efc9759ed4" Apr 16 20:12:09.146050 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:09.145618 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-86kzx" Apr 16 20:12:09.146782 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:09.146155 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-86kzx" podUID="1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9" Apr 16 20:12:09.330083 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:09.330053 2572 generic.go:358] "Generic (PLEG): container finished" podID="30cfa87f4eb69b9341194428842c4154" containerID="682fbb3cc04a6f64507b9dee97fa6c0a9e8e1f3e787ccd314dde0a26314f9ebc" exitCode=0 Apr 16 20:12:09.330194 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:09.330126 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-62.ec2.internal" event={"ID":"30cfa87f4eb69b9341194428842c4154","Type":"ContainerDied","Data":"682fbb3cc04a6f64507b9dee97fa6c0a9e8e1f3e787ccd314dde0a26314f9ebc"} Apr 16 20:12:09.331374 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:09.331343 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-np7l6" event={"ID":"9a0139ad-ee3c-4847-b5d9-76270b598854","Type":"ContainerStarted","Data":"a7a51c264dfb461d44370ff28b5e2cf0fe0b33f57f252f6f6c36f7ab60ad02ce"} Apr 16 20:12:09.335850 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:09.335819 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lc4v5" event={"ID":"b5c961b8-9f06-4e0b-9e96-8bbef36b8380","Type":"ContainerStarted","Data":"b2e75521a56c522fea452e217e59edbdd0178c997287404c891128b78c75629e"} Apr 16 20:12:09.337220 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:09.337197 2572 generic.go:358] "Generic (PLEG): container finished" podID="39673bf6-83cf-45c6-9476-3700a9d91e35" containerID="c8d8588c55b7f9f9e6de147ccd4a2fbe5bdeed971db919adec57767effaf3d1f" exitCode=0 Apr 16 20:12:09.337315 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:09.337266 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l75p2" 
event={"ID":"39673bf6-83cf-45c6-9476-3700a9d91e35","Type":"ContainerDied","Data":"c8d8588c55b7f9f9e6de147ccd4a2fbe5bdeed971db919adec57767effaf3d1f"} Apr 16 20:12:09.338544 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:09.338488 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9z9lw" event={"ID":"a880ce58-1475-4673-ae27-ce861b50e3bd","Type":"ContainerStarted","Data":"16b1339db216fad6ea6dea933a505ffb7fe78f23285d3c1967303c3841c8100a"} Apr 16 20:12:09.339669 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:09.339649 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-fw6bf" event={"ID":"b897f3d6-5177-401c-abab-c0301641c018","Type":"ContainerStarted","Data":"aa90d77b7a9a0f6efae2205473fb25e59ac918066a625f93ad86b4a1445a198e"} Apr 16 20:12:09.340980 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:09.340960 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-62.ec2.internal" event={"ID":"025d9da832e6f15c9967affc15b6b9e5","Type":"ContainerStarted","Data":"5dbcf1e5643ee0ab7352328be1be622197b4547b7df79b156c9fb1e8a07a95a1"} Apr 16 20:12:09.342157 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:09.342134 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-d277z" event={"ID":"8072a527-0c85-4b4a-a30d-ee0ca50bec0a","Type":"ContainerStarted","Data":"bdfbb81c9175832386159cbb0ab3858ac572284776fb66bb8b2da50433f089fa"} Apr 16 20:12:09.344470 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:09.344454 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9gf47_10232a70-9b5f-414a-8efd-b5cff05a4f12/ovn-acl-logging/0.log" Apr 16 20:12:09.344762 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:09.344739 2572 generic.go:358] "Generic (PLEG): container finished" podID="10232a70-9b5f-414a-8efd-b5cff05a4f12" 
containerID="6557aab521497cf7a8adc42611089de47b533cfc216332fc012ec12955d03c78" exitCode=1 Apr 16 20:12:09.344829 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:09.344796 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" event={"ID":"10232a70-9b5f-414a-8efd-b5cff05a4f12","Type":"ContainerStarted","Data":"4e4540c2c6e393554d85fd06cc4caa028fc92dafc5fc1f21f9afc4cde01068ad"} Apr 16 20:12:09.344829 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:09.344811 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" event={"ID":"10232a70-9b5f-414a-8efd-b5cff05a4f12","Type":"ContainerStarted","Data":"73343994b74909dddf3611771b241dbe5dc26fe7351b47fd3556a4323b1dd0eb"} Apr 16 20:12:09.344829 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:09.344824 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" event={"ID":"10232a70-9b5f-414a-8efd-b5cff05a4f12","Type":"ContainerStarted","Data":"bb7239b46e25503658124786b165b147e33f6b542c9fac2fa8e14c2875ba5075"} Apr 16 20:12:09.344996 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:09.344838 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" event={"ID":"10232a70-9b5f-414a-8efd-b5cff05a4f12","Type":"ContainerStarted","Data":"05b9f31e7ccbfe22e30280912fae28792d7222a954e901f35edf470e49a4201d"} Apr 16 20:12:09.344996 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:09.344847 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" event={"ID":"10232a70-9b5f-414a-8efd-b5cff05a4f12","Type":"ContainerDied","Data":"6557aab521497cf7a8adc42611089de47b533cfc216332fc012ec12955d03c78"} Apr 16 20:12:09.344996 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:09.344860 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" 
event={"ID":"10232a70-9b5f-414a-8efd-b5cff05a4f12","Type":"ContainerStarted","Data":"615cebdbfc25f09e6657b64a100e85cb7bc58b68a899b9db83c8eb26645a2b9f"} Apr 16 20:12:09.346032 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:09.346011 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4fgxn" event={"ID":"c470127f-e9ca-44ba-bcef-cc2cd68cdcdc","Type":"ContainerStarted","Data":"7fc32d8042676e7b7334492830c1bbe447572ac45a5b3fec4958d533310a9b50"} Apr 16 20:12:09.405034 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:09.404946 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-lc4v5" podStartSLOduration=2.470994056 podStartE2EDuration="20.404932939s" podCreationTimestamp="2026-04-16 20:11:49 +0000 UTC" firstStartedPulling="2026-04-16 20:11:50.369380521 +0000 UTC m=+1.806369008" lastFinishedPulling="2026-04-16 20:12:08.3033194 +0000 UTC m=+19.740307891" observedRunningTime="2026-04-16 20:12:09.404917296 +0000 UTC m=+20.841905805" watchObservedRunningTime="2026-04-16 20:12:09.404932939 +0000 UTC m=+20.841921448" Apr 16 20:12:09.455744 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:09.455707 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-d277z" podStartSLOduration=2.588984035 podStartE2EDuration="20.455693625s" podCreationTimestamp="2026-04-16 20:11:49 +0000 UTC" firstStartedPulling="2026-04-16 20:11:50.406242138 +0000 UTC m=+1.843230624" lastFinishedPulling="2026-04-16 20:12:08.272951713 +0000 UTC m=+19.709940214" observedRunningTime="2026-04-16 20:12:09.432339916 +0000 UTC m=+20.869328427" watchObservedRunningTime="2026-04-16 20:12:09.455693625 +0000 UTC m=+20.892682238" Apr 16 20:12:09.474860 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:09.474776 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-fw6bf" podStartSLOduration=2.556204692 
podStartE2EDuration="20.47476394s" podCreationTimestamp="2026-04-16 20:11:49 +0000 UTC" firstStartedPulling="2026-04-16 20:11:50.326387117 +0000 UTC m=+1.763375615" lastFinishedPulling="2026-04-16 20:12:08.244946363 +0000 UTC m=+19.681934863" observedRunningTime="2026-04-16 20:12:09.474749281 +0000 UTC m=+20.911737789" watchObservedRunningTime="2026-04-16 20:12:09.47476394 +0000 UTC m=+20.911752448" Apr 16 20:12:09.475322 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:09.475294 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-4fgxn" podStartSLOduration=2.5079970080000002 podStartE2EDuration="20.475286061s" podCreationTimestamp="2026-04-16 20:11:49 +0000 UTC" firstStartedPulling="2026-04-16 20:11:50.303048167 +0000 UTC m=+1.740036657" lastFinishedPulling="2026-04-16 20:12:08.270337209 +0000 UTC m=+19.707325710" observedRunningTime="2026-04-16 20:12:09.458324475 +0000 UTC m=+20.895312977" watchObservedRunningTime="2026-04-16 20:12:09.475286061 +0000 UTC m=+20.912274569" Apr 16 20:12:09.495480 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:09.495441 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-np7l6" podStartSLOduration=2.625165538 podStartE2EDuration="20.495429851s" podCreationTimestamp="2026-04-16 20:11:49 +0000 UTC" firstStartedPulling="2026-04-16 20:11:50.402103943 +0000 UTC m=+1.839092435" lastFinishedPulling="2026-04-16 20:12:08.272368251 +0000 UTC m=+19.709356748" observedRunningTime="2026-04-16 20:12:09.495414264 +0000 UTC m=+20.932402772" watchObservedRunningTime="2026-04-16 20:12:09.495429851 +0000 UTC m=+20.932418358" Apr 16 20:12:09.524681 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:09.524648 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-62.ec2.internal" podStartSLOduration=20.524638465 podStartE2EDuration="20.524638465s" 
podCreationTimestamp="2026-04-16 20:11:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:12:09.524571217 +0000 UTC m=+20.961559726" watchObservedRunningTime="2026-04-16 20:12:09.524638465 +0000 UTC m=+20.961626973" Apr 16 20:12:10.144855 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:10.144783 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bd8l" Apr 16 20:12:10.145014 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:10.144949 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bd8l" podUID="8e680955-60f7-4aaf-9aeb-b5efc9759ed4" Apr 16 20:12:10.285220 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:10.285189 2572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 20:12:10.350061 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:10.350029 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-62.ec2.internal" event={"ID":"30cfa87f4eb69b9341194428842c4154","Type":"ContainerStarted","Data":"2763fa2ac79dafd8c8596b387e5497b88ae4f28ed29051cd38bfdcdc255b56e3"} Apr 16 20:12:10.351811 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:10.351784 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9z9lw" event={"ID":"a880ce58-1475-4673-ae27-ce861b50e3bd","Type":"ContainerStarted","Data":"58867401def0ed9dd3de58245804ae23cd24e2737b29c7385184e9e11d60abed"} Apr 16 20:12:10.353282 
ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:10.353190 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rqszv" event={"ID":"5f806ba8-a961-4ea6-8cf9-11c7a0ef33eb","Type":"ContainerStarted","Data":"8884eff4a7f7bd0829695842f302aa3b0ff3abdd94ee010672cb54cd9976cfa9"} Apr 16 20:12:10.369832 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:10.369786 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-62.ec2.internal" podStartSLOduration=21.369772543 podStartE2EDuration="21.369772543s" podCreationTimestamp="2026-04-16 20:11:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:12:10.369744312 +0000 UTC m=+21.806732819" watchObservedRunningTime="2026-04-16 20:12:10.369772543 +0000 UTC m=+21.806761051" Apr 16 20:12:10.387257 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:10.387214 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-rqszv" podStartSLOduration=3.461244347 podStartE2EDuration="21.3871984s" podCreationTimestamp="2026-04-16 20:11:49 +0000 UTC" firstStartedPulling="2026-04-16 20:11:50.318893719 +0000 UTC m=+1.755882223" lastFinishedPulling="2026-04-16 20:12:08.244847776 +0000 UTC m=+19.681836276" observedRunningTime="2026-04-16 20:12:10.3863842 +0000 UTC m=+21.823372699" watchObservedRunningTime="2026-04-16 20:12:10.3871984 +0000 UTC m=+21.824186910" Apr 16 20:12:11.058188 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:11.058080 2572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T20:12:10.285206006Z","UUID":"79fc33b3-10dd-468e-86e2-edba010ce256","Handler":null,"Name":"","Endpoint":""} Apr 16 20:12:11.060084 ip-10-0-138-62 
kubenswrapper[2572]: I0416 20:12:11.060059 2572 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 20:12:11.060216 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:11.060104 2572 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 20:12:11.144635 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:11.144614 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-86kzx"
Apr 16 20:12:11.144733 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:11.144703 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-86kzx" podUID="1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9"
Apr 16 20:12:11.358247 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:11.358028 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9gf47_10232a70-9b5f-414a-8efd-b5cff05a4f12/ovn-acl-logging/0.log"
Apr 16 20:12:11.358692 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:11.358665 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" event={"ID":"10232a70-9b5f-414a-8efd-b5cff05a4f12","Type":"ContainerStarted","Data":"1e041dfa099d0d2a9d367dd5c781a7dbe41fa23d8881e1919e92be652202aed6"}
Apr 16 20:12:11.360644 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:11.360614 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9z9lw" event={"ID":"a880ce58-1475-4673-ae27-ce861b50e3bd","Type":"ContainerStarted","Data":"691cfa475f8e723548740edf9734e4d360623b530ab756d40b56cebe556e0433"}
Apr 16 20:12:11.383352 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:11.383304 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9z9lw" podStartSLOduration=1.557404505 podStartE2EDuration="22.383290733s" podCreationTimestamp="2026-04-16 20:11:49 +0000 UTC" firstStartedPulling="2026-04-16 20:11:50.342830552 +0000 UTC m=+1.779819042" lastFinishedPulling="2026-04-16 20:12:11.168716785 +0000 UTC m=+22.605705270" observedRunningTime="2026-04-16 20:12:11.381909891 +0000 UTC m=+22.818898401" watchObservedRunningTime="2026-04-16 20:12:11.383290733 +0000 UTC m=+22.820279241"
Apr 16 20:12:12.052671 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:12.052635 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-fw6bf"
Apr 16 20:12:12.053475 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:12.053450 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-fw6bf"
Apr 16 20:12:12.144935 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:12.144903 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bd8l"
Apr 16 20:12:12.145110 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:12.145048 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bd8l" podUID="8e680955-60f7-4aaf-9aeb-b5efc9759ed4"
Apr 16 20:12:12.362565 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:12.362485 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-fw6bf"
Apr 16 20:12:12.363073 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:12.362832 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-fw6bf"
Apr 16 20:12:13.144479 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:13.144434 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-86kzx"
Apr 16 20:12:13.144666 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:13.144535 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-86kzx" podUID="1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9"
Apr 16 20:12:14.144385 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:14.144198 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bd8l"
Apr 16 20:12:14.144990 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:14.144463 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bd8l" podUID="8e680955-60f7-4aaf-9aeb-b5efc9759ed4"
Apr 16 20:12:14.367657 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:14.367624 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9gf47_10232a70-9b5f-414a-8efd-b5cff05a4f12/ovn-acl-logging/0.log"
Apr 16 20:12:14.367918 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:14.367900 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" event={"ID":"10232a70-9b5f-414a-8efd-b5cff05a4f12","Type":"ContainerStarted","Data":"4117e07d02d8ddebd7fa918e86c0a31a57fdeba9340a274eab348f2d2c8ea599"}
Apr 16 20:12:14.368250 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:14.368235 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9gf47"
Apr 16 20:12:14.368428 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:14.368408 2572 scope.go:117] "RemoveContainer" containerID="6557aab521497cf7a8adc42611089de47b533cfc216332fc012ec12955d03c78"
Apr 16 20:12:14.369599 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:14.369576 2572 generic.go:358] "Generic (PLEG): container finished" podID="39673bf6-83cf-45c6-9476-3700a9d91e35" containerID="3cfa4da223ffbd23c043f3be478a1f9e80f85dc5419c95f59d52698f9ba34884" exitCode=0
Apr 16 20:12:14.369672 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:14.369656 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l75p2" event={"ID":"39673bf6-83cf-45c6-9476-3700a9d91e35","Type":"ContainerDied","Data":"3cfa4da223ffbd23c043f3be478a1f9e80f85dc5419c95f59d52698f9ba34884"}
Apr 16 20:12:14.386645 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:14.386625 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9gf47"
Apr 16 20:12:15.144988 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:15.144823 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-86kzx"
Apr 16 20:12:15.145343 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:15.145104 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-86kzx" podUID="1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9"
Apr 16 20:12:15.280998 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:15.280967 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5bd8l"]
Apr 16 20:12:15.281129 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:15.281101 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bd8l"
Apr 16 20:12:15.281206 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:15.281189 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bd8l" podUID="8e680955-60f7-4aaf-9aeb-b5efc9759ed4"
Apr 16 20:12:15.283181 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:15.283161 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-86kzx"]
Apr 16 20:12:15.374189 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:15.374122 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9gf47_10232a70-9b5f-414a-8efd-b5cff05a4f12/ovn-acl-logging/0.log"
Apr 16 20:12:15.374458 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:15.374429 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" event={"ID":"10232a70-9b5f-414a-8efd-b5cff05a4f12","Type":"ContainerStarted","Data":"6817853a139ded6a4dc51aeb04a3115829ec034ba2ac1cce48435c2beba64a11"}
Apr 16 20:12:15.374574 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:15.374559 2572 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 20:12:15.374810 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:15.374792 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9gf47"
Apr 16 20:12:15.376361 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:15.376340 2572 generic.go:358] "Generic (PLEG): container finished" podID="39673bf6-83cf-45c6-9476-3700a9d91e35" containerID="acdf6299d827eeb26c21a21f5032d6ae79f085eb4f965495c1d419898cc15548" exitCode=0
Apr 16 20:12:15.376445 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:15.376413 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-86kzx"
Apr 16 20:12:15.376445 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:15.376422 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l75p2" event={"ID":"39673bf6-83cf-45c6-9476-3700a9d91e35","Type":"ContainerDied","Data":"acdf6299d827eeb26c21a21f5032d6ae79f085eb4f965495c1d419898cc15548"}
Apr 16 20:12:15.376599 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:15.376581 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-86kzx" podUID="1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9"
Apr 16 20:12:15.390099 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:15.390066 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9gf47"
Apr 16 20:12:15.448144 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:15.448104 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" podStartSLOduration=8.309865418 podStartE2EDuration="26.448092746s" podCreationTimestamp="2026-04-16 20:11:49 +0000 UTC" firstStartedPulling="2026-04-16 20:11:50.38368232 +0000 UTC m=+1.820670809" lastFinishedPulling="2026-04-16 20:12:08.521909636 +0000 UTC m=+19.958898137" observedRunningTime="2026-04-16 20:12:15.411127889 +0000 UTC m=+26.848116398" watchObservedRunningTime="2026-04-16 20:12:15.448092746 +0000 UTC m=+26.885081248"
Apr 16 20:12:16.379920 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:16.379890 2572 generic.go:358] "Generic (PLEG): container finished" podID="39673bf6-83cf-45c6-9476-3700a9d91e35" containerID="e64436f1403fc6ae4fdc0ed5fda690030cc5fccf9efbfe4d3a740ebecd437b60" exitCode=0
Apr 16 20:12:16.380366 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:16.379971 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l75p2" event={"ID":"39673bf6-83cf-45c6-9476-3700a9d91e35","Type":"ContainerDied","Data":"e64436f1403fc6ae4fdc0ed5fda690030cc5fccf9efbfe4d3a740ebecd437b60"}
Apr 16 20:12:16.380366 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:16.380180 2572 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 20:12:16.512090 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:16.512062 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9gf47"
Apr 16 20:12:17.145001 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:17.144969 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bd8l"
Apr 16 20:12:17.145001 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:17.144992 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-86kzx"
Apr 16 20:12:17.145307 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:17.145112 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bd8l" podUID="8e680955-60f7-4aaf-9aeb-b5efc9759ed4"
Apr 16 20:12:17.145307 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:17.145260 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-86kzx" podUID="1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9"
Apr 16 20:12:18.395501 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:18.395446 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" podUID="10232a70-9b5f-414a-8efd-b5cff05a4f12" containerName="ovnkube-controller" probeResult="failure" output=""
Apr 16 20:12:19.144748 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:19.144718 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bd8l"
Apr 16 20:12:19.144942 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:19.144833 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bd8l" podUID="8e680955-60f7-4aaf-9aeb-b5efc9759ed4"
Apr 16 20:12:19.144942 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:19.144911 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-86kzx"
Apr 16 20:12:19.145067 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:19.144986 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-86kzx" podUID="1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9"
Apr 16 20:12:21.144221 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:21.144189 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bd8l"
Apr 16 20:12:21.144734 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:21.144189 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-86kzx"
Apr 16 20:12:21.144734 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:21.144328 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bd8l" podUID="8e680955-60f7-4aaf-9aeb-b5efc9759ed4"
Apr 16 20:12:21.144734 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:21.144415 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-86kzx" podUID="1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9"
Apr 16 20:12:21.426397 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:21.426318 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-62.ec2.internal" event="NodeReady"
Apr 16 20:12:21.426542 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:21.426457 2572 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 20:12:21.474023 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:21.473980 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-nwtd6"]
Apr 16 20:12:21.507195 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:21.507159 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-tngt6"]
Apr 16 20:12:21.507364 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:21.507344 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-nwtd6"
Apr 16 20:12:21.509797 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:21.509768 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-gq66p\""
Apr 16 20:12:21.509936 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:21.509824 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 20:12:21.509936 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:21.509829 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 20:12:21.521975 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:21.521946 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-tngt6"]
Apr 16 20:12:21.521975 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:21.521969 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nwtd6"]
Apr 16 20:12:21.522127 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:21.522056 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tngt6"
Apr 16 20:12:21.524352 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:21.524332 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 20:12:21.524463 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:21.524375 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 20:12:21.524629 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:21.524602 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 20:12:21.524713 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:21.524638 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-28d82\""
Apr 16 20:12:21.638042 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:21.638003 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqq7w\" (UniqueName: \"kubernetes.io/projected/2559e4d7-87c4-4654-a66a-cf29280da85b-kube-api-access-pqq7w\") pod \"ingress-canary-tngt6\" (UID: \"2559e4d7-87c4-4654-a66a-cf29280da85b\") " pod="openshift-ingress-canary/ingress-canary-tngt6"
Apr 16 20:12:21.638204 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:21.638075 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7621e257-90f3-4f74-a511-c5bfd075ff99-tmp-dir\") pod \"dns-default-nwtd6\" (UID: \"7621e257-90f3-4f74-a511-c5bfd075ff99\") " pod="openshift-dns/dns-default-nwtd6"
Apr 16 20:12:21.638204 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:21.638103 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7621e257-90f3-4f74-a511-c5bfd075ff99-metrics-tls\") pod \"dns-default-nwtd6\" (UID: \"7621e257-90f3-4f74-a511-c5bfd075ff99\") " pod="openshift-dns/dns-default-nwtd6"
Apr 16 20:12:21.638204 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:21.638120 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5pcz\" (UniqueName: \"kubernetes.io/projected/7621e257-90f3-4f74-a511-c5bfd075ff99-kube-api-access-q5pcz\") pod \"dns-default-nwtd6\" (UID: \"7621e257-90f3-4f74-a511-c5bfd075ff99\") " pod="openshift-dns/dns-default-nwtd6"
Apr 16 20:12:21.638204 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:21.638162 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7621e257-90f3-4f74-a511-c5bfd075ff99-config-volume\") pod \"dns-default-nwtd6\" (UID: \"7621e257-90f3-4f74-a511-c5bfd075ff99\") " pod="openshift-dns/dns-default-nwtd6"
Apr 16 20:12:21.638348 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:21.638207 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2559e4d7-87c4-4654-a66a-cf29280da85b-cert\") pod \"ingress-canary-tngt6\" (UID: \"2559e4d7-87c4-4654-a66a-cf29280da85b\") " pod="openshift-ingress-canary/ingress-canary-tngt6"
Apr 16 20:12:21.739432 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:21.739355 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pqq7w\" (UniqueName: \"kubernetes.io/projected/2559e4d7-87c4-4654-a66a-cf29280da85b-kube-api-access-pqq7w\") pod \"ingress-canary-tngt6\" (UID: \"2559e4d7-87c4-4654-a66a-cf29280da85b\") " pod="openshift-ingress-canary/ingress-canary-tngt6"
Apr 16 20:12:21.739432 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:21.739399 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8e680955-60f7-4aaf-9aeb-b5efc9759ed4-metrics-certs\") pod \"network-metrics-daemon-5bd8l\" (UID: \"8e680955-60f7-4aaf-9aeb-b5efc9759ed4\") " pod="openshift-multus/network-metrics-daemon-5bd8l"
Apr 16 20:12:21.739632 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:21.739514 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:12:21.739632 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:21.739553 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7621e257-90f3-4f74-a511-c5bfd075ff99-tmp-dir\") pod \"dns-default-nwtd6\" (UID: \"7621e257-90f3-4f74-a511-c5bfd075ff99\") " pod="openshift-dns/dns-default-nwtd6"
Apr 16 20:12:21.739632 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:21.739578 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e680955-60f7-4aaf-9aeb-b5efc9759ed4-metrics-certs podName:8e680955-60f7-4aaf-9aeb-b5efc9759ed4 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:53.739559049 +0000 UTC m=+65.176547542 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8e680955-60f7-4aaf-9aeb-b5efc9759ed4-metrics-certs") pod "network-metrics-daemon-5bd8l" (UID: "8e680955-60f7-4aaf-9aeb-b5efc9759ed4") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:12:21.739632 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:21.739608 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7621e257-90f3-4f74-a511-c5bfd075ff99-metrics-tls\") pod \"dns-default-nwtd6\" (UID: \"7621e257-90f3-4f74-a511-c5bfd075ff99\") " pod="openshift-dns/dns-default-nwtd6"
Apr 16 20:12:21.739632 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:21.739629 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q5pcz\" (UniqueName: \"kubernetes.io/projected/7621e257-90f3-4f74-a511-c5bfd075ff99-kube-api-access-q5pcz\") pod \"dns-default-nwtd6\" (UID: \"7621e257-90f3-4f74-a511-c5bfd075ff99\") " pod="openshift-dns/dns-default-nwtd6"
Apr 16 20:12:21.739912 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:21.739661 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sp5lv\" (UniqueName: \"kubernetes.io/projected/1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9-kube-api-access-sp5lv\") pod \"network-check-target-86kzx\" (UID: \"1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9\") " pod="openshift-network-diagnostics/network-check-target-86kzx"
Apr 16 20:12:21.739912 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:21.739687 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7621e257-90f3-4f74-a511-c5bfd075ff99-config-volume\") pod \"dns-default-nwtd6\" (UID: \"7621e257-90f3-4f74-a511-c5bfd075ff99\") " pod="openshift-dns/dns-default-nwtd6"
Apr 16 20:12:21.739912 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:21.739714 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2559e4d7-87c4-4654-a66a-cf29280da85b-cert\") pod \"ingress-canary-tngt6\" (UID: \"2559e4d7-87c4-4654-a66a-cf29280da85b\") " pod="openshift-ingress-canary/ingress-canary-tngt6"
Apr 16 20:12:21.739912 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:21.739797 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:12:21.739912 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:21.739802 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:12:21.739912 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:21.739813 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 20:12:21.739912 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:21.739832 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 20:12:21.739912 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:21.739842 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2559e4d7-87c4-4654-a66a-cf29280da85b-cert podName:2559e4d7-87c4-4654-a66a-cf29280da85b nodeName:}" failed. No retries permitted until 2026-04-16 20:12:22.239830452 +0000 UTC m=+33.676818941 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2559e4d7-87c4-4654-a66a-cf29280da85b-cert") pod "ingress-canary-tngt6" (UID: "2559e4d7-87c4-4654-a66a-cf29280da85b") : secret "canary-serving-cert" not found
Apr 16 20:12:21.739912 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:21.739846 2572 projected.go:194] Error preparing data for projected volume kube-api-access-sp5lv for pod openshift-network-diagnostics/network-check-target-86kzx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:12:21.739912 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:21.739859 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7621e257-90f3-4f74-a511-c5bfd075ff99-metrics-tls podName:7621e257-90f3-4f74-a511-c5bfd075ff99 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:22.239850343 +0000 UTC m=+33.676838834 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7621e257-90f3-4f74-a511-c5bfd075ff99-metrics-tls") pod "dns-default-nwtd6" (UID: "7621e257-90f3-4f74-a511-c5bfd075ff99") : secret "dns-default-metrics-tls" not found
Apr 16 20:12:21.739912 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:21.739912 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9-kube-api-access-sp5lv podName:1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:53.739896635 +0000 UTC m=+65.176885135 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-sp5lv" (UniqueName: "kubernetes.io/projected/1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9-kube-api-access-sp5lv") pod "network-check-target-86kzx" (UID: "1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:12:21.739912 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:21.739909 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7621e257-90f3-4f74-a511-c5bfd075ff99-tmp-dir\") pod \"dns-default-nwtd6\" (UID: \"7621e257-90f3-4f74-a511-c5bfd075ff99\") " pod="openshift-dns/dns-default-nwtd6"
Apr 16 20:12:21.740393 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:21.740372 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7621e257-90f3-4f74-a511-c5bfd075ff99-config-volume\") pod \"dns-default-nwtd6\" (UID: \"7621e257-90f3-4f74-a511-c5bfd075ff99\") " pod="openshift-dns/dns-default-nwtd6"
Apr 16 20:12:21.749878 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:21.749707 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5pcz\" (UniqueName: \"kubernetes.io/projected/7621e257-90f3-4f74-a511-c5bfd075ff99-kube-api-access-q5pcz\") pod \"dns-default-nwtd6\" (UID: \"7621e257-90f3-4f74-a511-c5bfd075ff99\") " pod="openshift-dns/dns-default-nwtd6"
Apr 16 20:12:21.750018 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:21.749899 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqq7w\" (UniqueName: \"kubernetes.io/projected/2559e4d7-87c4-4654-a66a-cf29280da85b-kube-api-access-pqq7w\") pod \"ingress-canary-tngt6\" (UID: \"2559e4d7-87c4-4654-a66a-cf29280da85b\") " pod="openshift-ingress-canary/ingress-canary-tngt6"
Apr 16 20:12:22.242587 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:22.242565 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2559e4d7-87c4-4654-a66a-cf29280da85b-cert\") pod \"ingress-canary-tngt6\" (UID: \"2559e4d7-87c4-4654-a66a-cf29280da85b\") " pod="openshift-ingress-canary/ingress-canary-tngt6"
Apr 16 20:12:22.242857 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:22.242615 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7621e257-90f3-4f74-a511-c5bfd075ff99-metrics-tls\") pod \"dns-default-nwtd6\" (UID: \"7621e257-90f3-4f74-a511-c5bfd075ff99\") " pod="openshift-dns/dns-default-nwtd6"
Apr 16 20:12:22.242857 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:22.242703 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:12:22.242857 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:22.242705 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:12:22.242857 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:22.242750 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7621e257-90f3-4f74-a511-c5bfd075ff99-metrics-tls podName:7621e257-90f3-4f74-a511-c5bfd075ff99 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:23.24273535 +0000 UTC m=+34.679723841 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7621e257-90f3-4f74-a511-c5bfd075ff99-metrics-tls") pod "dns-default-nwtd6" (UID: "7621e257-90f3-4f74-a511-c5bfd075ff99") : secret "dns-default-metrics-tls" not found
Apr 16 20:12:22.242857 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:22.242762 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2559e4d7-87c4-4654-a66a-cf29280da85b-cert podName:2559e4d7-87c4-4654-a66a-cf29280da85b nodeName:}" failed. No retries permitted until 2026-04-16 20:12:23.242756389 +0000 UTC m=+34.679744874 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2559e4d7-87c4-4654-a66a-cf29280da85b-cert") pod "ingress-canary-tngt6" (UID: "2559e4d7-87c4-4654-a66a-cf29280da85b") : secret "canary-serving-cert" not found
Apr 16 20:12:22.395830 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:22.395804 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l75p2" event={"ID":"39673bf6-83cf-45c6-9476-3700a9d91e35","Type":"ContainerStarted","Data":"5dbf1bf207d462292870bffd08f0b635751b9529d74de33a9cf53d8851ea1493"}
Apr 16 20:12:23.144601 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:23.144570 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bd8l"
Apr 16 20:12:23.144843 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:23.144569 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-86kzx"
Apr 16 20:12:23.147139 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:23.147119 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 20:12:23.147257 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:23.147169 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 20:12:23.147988 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:23.147972 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 20:12:23.148079 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:23.148066 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-8662n\""
Apr 16 20:12:23.148142 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:23.148073 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-fgc2s\""
Apr 16 20:12:23.247800 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:23.247768 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7621e257-90f3-4f74-a511-c5bfd075ff99-metrics-tls\") pod \"dns-default-nwtd6\" (UID: \"7621e257-90f3-4f74-a511-c5bfd075ff99\") " pod="openshift-dns/dns-default-nwtd6"
Apr 16 20:12:23.248238 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:23.247827 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2559e4d7-87c4-4654-a66a-cf29280da85b-cert\") pod \"ingress-canary-tngt6\" (UID: \"2559e4d7-87c4-4654-a66a-cf29280da85b\") " pod="openshift-ingress-canary/ingress-canary-tngt6"
Apr 16 20:12:23.248238 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:23.247939 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:12:23.248238 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:23.247947 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:12:23.248238 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:23.248007 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2559e4d7-87c4-4654-a66a-cf29280da85b-cert podName:2559e4d7-87c4-4654-a66a-cf29280da85b nodeName:}" failed. No retries permitted until 2026-04-16 20:12:25.247987074 +0000 UTC m=+36.684975563 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2559e4d7-87c4-4654-a66a-cf29280da85b-cert") pod "ingress-canary-tngt6" (UID: "2559e4d7-87c4-4654-a66a-cf29280da85b") : secret "canary-serving-cert" not found
Apr 16 20:12:23.248238 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:23.248026 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7621e257-90f3-4f74-a511-c5bfd075ff99-metrics-tls podName:7621e257-90f3-4f74-a511-c5bfd075ff99 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:25.248016235 +0000 UTC m=+36.685004722 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7621e257-90f3-4f74-a511-c5bfd075ff99-metrics-tls") pod "dns-default-nwtd6" (UID: "7621e257-90f3-4f74-a511-c5bfd075ff99") : secret "dns-default-metrics-tls" not found Apr 16 20:12:23.399888 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:23.399792 2572 generic.go:358] "Generic (PLEG): container finished" podID="39673bf6-83cf-45c6-9476-3700a9d91e35" containerID="5dbf1bf207d462292870bffd08f0b635751b9529d74de33a9cf53d8851ea1493" exitCode=0 Apr 16 20:12:23.399888 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:23.399850 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l75p2" event={"ID":"39673bf6-83cf-45c6-9476-3700a9d91e35","Type":"ContainerDied","Data":"5dbf1bf207d462292870bffd08f0b635751b9529d74de33a9cf53d8851ea1493"} Apr 16 20:12:24.403854 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:24.403821 2572 generic.go:358] "Generic (PLEG): container finished" podID="39673bf6-83cf-45c6-9476-3700a9d91e35" containerID="f68562b94dc0f8a79344c5a254c87901c9fa957895b392c1b922b3169aa8413b" exitCode=0 Apr 16 20:12:24.404247 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:24.403910 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l75p2" event={"ID":"39673bf6-83cf-45c6-9476-3700a9d91e35","Type":"ContainerDied","Data":"f68562b94dc0f8a79344c5a254c87901c9fa957895b392c1b922b3169aa8413b"} Apr 16 20:12:25.263220 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:25.263192 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7621e257-90f3-4f74-a511-c5bfd075ff99-metrics-tls\") pod \"dns-default-nwtd6\" (UID: \"7621e257-90f3-4f74-a511-c5bfd075ff99\") " pod="openshift-dns/dns-default-nwtd6" Apr 16 20:12:25.263358 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:25.263244 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2559e4d7-87c4-4654-a66a-cf29280da85b-cert\") pod \"ingress-canary-tngt6\" (UID: \"2559e4d7-87c4-4654-a66a-cf29280da85b\") " pod="openshift-ingress-canary/ingress-canary-tngt6" Apr 16 20:12:25.263358 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:25.263345 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:12:25.263430 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:25.263396 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2559e4d7-87c4-4654-a66a-cf29280da85b-cert podName:2559e4d7-87c4-4654-a66a-cf29280da85b nodeName:}" failed. No retries permitted until 2026-04-16 20:12:29.263382899 +0000 UTC m=+40.700371385 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2559e4d7-87c4-4654-a66a-cf29280da85b-cert") pod "ingress-canary-tngt6" (UID: "2559e4d7-87c4-4654-a66a-cf29280da85b") : secret "canary-serving-cert" not found Apr 16 20:12:25.263430 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:25.263346 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:12:25.263508 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:25.263461 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7621e257-90f3-4f74-a511-c5bfd075ff99-metrics-tls podName:7621e257-90f3-4f74-a511-c5bfd075ff99 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:29.263449943 +0000 UTC m=+40.700438433 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7621e257-90f3-4f74-a511-c5bfd075ff99-metrics-tls") pod "dns-default-nwtd6" (UID: "7621e257-90f3-4f74-a511-c5bfd075ff99") : secret "dns-default-metrics-tls" not found Apr 16 20:12:25.410200 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:25.410165 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l75p2" event={"ID":"39673bf6-83cf-45c6-9476-3700a9d91e35","Type":"ContainerStarted","Data":"d754df2b2069093b3ba0633107cb6222b810207fbee60b10526659a92db56cc8"} Apr 16 20:12:25.431388 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:25.431343 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-l75p2" podStartSLOduration=4.552605197 podStartE2EDuration="36.431330209s" podCreationTimestamp="2026-04-16 20:11:49 +0000 UTC" firstStartedPulling="2026-04-16 20:11:50.356433505 +0000 UTC m=+1.793422007" lastFinishedPulling="2026-04-16 20:12:22.235158518 +0000 UTC m=+33.672147019" observedRunningTime="2026-04-16 20:12:25.43122435 +0000 UTC m=+36.868212859" watchObservedRunningTime="2026-04-16 20:12:25.431330209 +0000 UTC m=+36.868318716" Apr 16 20:12:29.289038 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:29.288998 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7621e257-90f3-4f74-a511-c5bfd075ff99-metrics-tls\") pod \"dns-default-nwtd6\" (UID: \"7621e257-90f3-4f74-a511-c5bfd075ff99\") " pod="openshift-dns/dns-default-nwtd6" Apr 16 20:12:29.289500 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:29.289050 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2559e4d7-87c4-4654-a66a-cf29280da85b-cert\") pod \"ingress-canary-tngt6\" (UID: \"2559e4d7-87c4-4654-a66a-cf29280da85b\") " 
pod="openshift-ingress-canary/ingress-canary-tngt6" Apr 16 20:12:29.289500 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:29.289135 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:12:29.289500 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:29.289139 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:12:29.289500 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:29.289185 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2559e4d7-87c4-4654-a66a-cf29280da85b-cert podName:2559e4d7-87c4-4654-a66a-cf29280da85b nodeName:}" failed. No retries permitted until 2026-04-16 20:12:37.289170453 +0000 UTC m=+48.726158939 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2559e4d7-87c4-4654-a66a-cf29280da85b-cert") pod "ingress-canary-tngt6" (UID: "2559e4d7-87c4-4654-a66a-cf29280da85b") : secret "canary-serving-cert" not found Apr 16 20:12:29.289500 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:29.289199 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7621e257-90f3-4f74-a511-c5bfd075ff99-metrics-tls podName:7621e257-90f3-4f74-a511-c5bfd075ff99 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:37.289192742 +0000 UTC m=+48.726181228 (durationBeforeRetry 8s). 
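The retry schedule visible in these entries (durationBeforeRetry of 1s, 2s, 4s, 8s, and climbing) is the kubelet's per-volume exponential backoff in `nestedpendingoperations`: each failed `MountVolume.SetUp` doubles the delay before the next attempt. Later entries in this log (1m4s, then 2m2s for the metrics-certs volume) suggest the delay is capped at 2m2s. A minimal sketch of that policy — the 122s cap is inferred from this log, not taken from the kubelet source:

```python
def backoff_schedule(retries: int, initial: float = 1.0,
                     factor: float = 2.0, cap: float = 122.0) -> list[float]:
    """Per-retry delays in seconds: double each time, clamp at the cap.

    The 122s (2m2s) cap is an assumption inferred from the
    durationBeforeRetry values observed in this log.
    """
    delays = []
    delay = initial
    for _ in range(retries):
        delays.append(min(delay, cap))
        delay *= factor
    return delays

# The first eight steps match the logged retries for the dns-default
# and ingress-canary secret mounts: 1s, 2s, 4s, 8s, 16s, 32s, 1m4s, 2m2s.
print(backoff_schedule(8))
```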
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7621e257-90f3-4f74-a511-c5bfd075ff99-metrics-tls") pod "dns-default-nwtd6" (UID: "7621e257-90f3-4f74-a511-c5bfd075ff99") : secret "dns-default-metrics-tls" not found Apr 16 20:12:37.340644 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:37.340607 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7621e257-90f3-4f74-a511-c5bfd075ff99-metrics-tls\") pod \"dns-default-nwtd6\" (UID: \"7621e257-90f3-4f74-a511-c5bfd075ff99\") " pod="openshift-dns/dns-default-nwtd6" Apr 16 20:12:37.341043 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:37.340670 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2559e4d7-87c4-4654-a66a-cf29280da85b-cert\") pod \"ingress-canary-tngt6\" (UID: \"2559e4d7-87c4-4654-a66a-cf29280da85b\") " pod="openshift-ingress-canary/ingress-canary-tngt6" Apr 16 20:12:37.341043 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:37.340772 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:12:37.341043 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:37.340825 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2559e4d7-87c4-4654-a66a-cf29280da85b-cert podName:2559e4d7-87c4-4654-a66a-cf29280da85b nodeName:}" failed. No retries permitted until 2026-04-16 20:12:53.34081194 +0000 UTC m=+64.777800425 (durationBeforeRetry 16s). 
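Interleaved with these mount retries, the kubelet logs SyncLoop (PLEG) container lifecycle events (the ContainerStarted/ContainerDied entries for multus-additional-cni-plugins-l75p2 above). The `event=` payload in those lines happens to be valid JSON, so it can be pulled apart directly — an illustrative sketch that relies on klog's quoting of structured values in exactly the form shown here:

```python
import json

# A PLEG event line copied from this log; everything after "event="
# is a JSON object with ID (pod UID), Type, and Data (container ID).
line = ('I0416 20:12:23.399850 2572 kubelet.go:2569] "SyncLoop (PLEG): '
        'event for pod" pod="openshift-multus/multus-additional-cni-plugins-l75p2" '
        'event={"ID":"39673bf6-83cf-45c6-9476-3700a9d91e35","Type":"ContainerDied",'
        '"Data":"5dbf1bf207d462292870bffd08f0b635751b9529d74de33a9cf53d8851ea1493"}')
event = json.loads(line.split("event=", 1)[1])
print(event["Type"], event["Data"][:12])
```

The exitCode=0 on the preceding "Generic (PLEG): container finished" lines shows these ContainerDied events are init containers completing normally, not crashes.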
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2559e4d7-87c4-4654-a66a-cf29280da85b-cert") pod "ingress-canary-tngt6" (UID: "2559e4d7-87c4-4654-a66a-cf29280da85b") : secret "canary-serving-cert" not found Apr 16 20:12:37.341043 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:37.340772 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:12:37.341043 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:37.340922 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7621e257-90f3-4f74-a511-c5bfd075ff99-metrics-tls podName:7621e257-90f3-4f74-a511-c5bfd075ff99 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:53.340908264 +0000 UTC m=+64.777896756 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7621e257-90f3-4f74-a511-c5bfd075ff99-metrics-tls") pod "dns-default-nwtd6" (UID: "7621e257-90f3-4f74-a511-c5bfd075ff99") : secret "dns-default-metrics-tls" not found Apr 16 20:12:48.392743 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:48.392707 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9gf47" Apr 16 20:12:53.347171 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:53.347128 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7621e257-90f3-4f74-a511-c5bfd075ff99-metrics-tls\") pod \"dns-default-nwtd6\" (UID: \"7621e257-90f3-4f74-a511-c5bfd075ff99\") " pod="openshift-dns/dns-default-nwtd6" Apr 16 20:12:53.347560 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:53.347185 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2559e4d7-87c4-4654-a66a-cf29280da85b-cert\") pod \"ingress-canary-tngt6\" (UID: 
\"2559e4d7-87c4-4654-a66a-cf29280da85b\") " pod="openshift-ingress-canary/ingress-canary-tngt6" Apr 16 20:12:53.347560 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:53.347280 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:12:53.347560 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:53.347329 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2559e4d7-87c4-4654-a66a-cf29280da85b-cert podName:2559e4d7-87c4-4654-a66a-cf29280da85b nodeName:}" failed. No retries permitted until 2026-04-16 20:13:25.347315317 +0000 UTC m=+96.784303806 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2559e4d7-87c4-4654-a66a-cf29280da85b-cert") pod "ingress-canary-tngt6" (UID: "2559e4d7-87c4-4654-a66a-cf29280da85b") : secret "canary-serving-cert" not found Apr 16 20:12:53.347560 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:53.347280 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:12:53.347560 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:53.347420 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7621e257-90f3-4f74-a511-c5bfd075ff99-metrics-tls podName:7621e257-90f3-4f74-a511-c5bfd075ff99 nodeName:}" failed. No retries permitted until 2026-04-16 20:13:25.347405325 +0000 UTC m=+96.784393815 (durationBeforeRetry 32s). 
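Each `secret.go:189` failure names the missing secret twice: once as `namespace/name` and again in the quoted not-found message. A small parser (illustrative only; the regex is written against the exact wording of these lines) can summarize which secrets are blocking which volume mounts:

```python
import re

# Matches: Couldn't get secret <ns>/<name>: secret "<name>" not found
ERR_RE = re.compile(
    r"Couldn't get secret (?P<ns>[\w-]+)/(?P<name>[\w-]+): "
    r'secret "(?P<quoted>[\w-]+)" not found'
)

def missing_secrets(log_text: str) -> set[tuple[str, str]]:
    """Return the distinct (namespace, name) pairs of missing secrets."""
    return {(m["ns"], m["name"]) for m in ERR_RE.finditer(log_text)}

sample = (
    "E0416 20:12:53.347280 2572 secret.go:189] Couldn't get secret "
    'openshift-ingress-canary/canary-serving-cert: secret '
    '"canary-serving-cert" not found\n'
    "E0416 20:12:53.347280 2572 secret.go:189] Couldn't get secret "
    'openshift-dns/dns-default-metrics-tls: secret '
    '"dns-default-metrics-tls" not found\n'
)
print(missing_secrets(sample))
```

Run over the whole section, this yields just three pairs — `canary-serving-cert`, `dns-default-metrics-tls`, and `metrics-daemon-secret` — which points at the controllers that should have created those secrets rather than at the kubelet itself.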
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7621e257-90f3-4f74-a511-c5bfd075ff99-metrics-tls") pod "dns-default-nwtd6" (UID: "7621e257-90f3-4f74-a511-c5bfd075ff99") : secret "dns-default-metrics-tls" not found Apr 16 20:12:53.751473 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:53.751437 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8e680955-60f7-4aaf-9aeb-b5efc9759ed4-metrics-certs\") pod \"network-metrics-daemon-5bd8l\" (UID: \"8e680955-60f7-4aaf-9aeb-b5efc9759ed4\") " pod="openshift-multus/network-metrics-daemon-5bd8l" Apr 16 20:12:53.751626 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:53.751502 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sp5lv\" (UniqueName: \"kubernetes.io/projected/1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9-kube-api-access-sp5lv\") pod \"network-check-target-86kzx\" (UID: \"1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9\") " pod="openshift-network-diagnostics/network-check-target-86kzx" Apr 16 20:12:53.753997 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:53.753976 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 20:12:53.754050 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:53.753994 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 20:12:53.761977 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:53.761958 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 20:12:53.762031 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:12:53.762017 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e680955-60f7-4aaf-9aeb-b5efc9759ed4-metrics-certs podName:8e680955-60f7-4aaf-9aeb-b5efc9759ed4 
nodeName:}" failed. No retries permitted until 2026-04-16 20:13:57.762001225 +0000 UTC m=+129.198989711 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8e680955-60f7-4aaf-9aeb-b5efc9759ed4-metrics-certs") pod "network-metrics-daemon-5bd8l" (UID: "8e680955-60f7-4aaf-9aeb-b5efc9759ed4") : secret "metrics-daemon-secret" not found Apr 16 20:12:53.764128 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:53.764111 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 20:12:53.776088 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:53.776058 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp5lv\" (UniqueName: \"kubernetes.io/projected/1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9-kube-api-access-sp5lv\") pod \"network-check-target-86kzx\" (UID: \"1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9\") " pod="openshift-network-diagnostics/network-check-target-86kzx" Apr 16 20:12:54.061528 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:54.061452 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-fgc2s\"" Apr 16 20:12:54.068960 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:54.068940 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-86kzx" Apr 16 20:12:54.229844 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:54.229810 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-86kzx"] Apr 16 20:12:54.234234 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:12:54.234206 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d09f25d_7edb_4aa8_a44b_bfe6b932ecf9.slice/crio-dffc1fa4d39c42cadca7eb227239097d13a12403be28725b551cd0fa2c740c43 WatchSource:0}: Error finding container dffc1fa4d39c42cadca7eb227239097d13a12403be28725b551cd0fa2c740c43: Status 404 returned error can't find the container with id dffc1fa4d39c42cadca7eb227239097d13a12403be28725b551cd0fa2c740c43 Apr 16 20:12:54.461861 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:54.461774 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-86kzx" event={"ID":"1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9","Type":"ContainerStarted","Data":"dffc1fa4d39c42cadca7eb227239097d13a12403be28725b551cd0fa2c740c43"} Apr 16 20:12:57.468361 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:57.468328 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-86kzx" event={"ID":"1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9","Type":"ContainerStarted","Data":"38971f86b293e1175075ed91553de75c15957ee78845715206a42c9f813684be"} Apr 16 20:12:57.468750 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:57.468454 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-86kzx" Apr 16 20:12:57.485988 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:12:57.485937 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-86kzx" 
podStartSLOduration=65.808308506 podStartE2EDuration="1m8.485924435s" podCreationTimestamp="2026-04-16 20:11:49 +0000 UTC" firstStartedPulling="2026-04-16 20:12:54.236084696 +0000 UTC m=+65.673073199" lastFinishedPulling="2026-04-16 20:12:56.91370063 +0000 UTC m=+68.350689128" observedRunningTime="2026-04-16 20:12:57.485749692 +0000 UTC m=+68.922738201" watchObservedRunningTime="2026-04-16 20:12:57.485924435 +0000 UTC m=+68.922912959" Apr 16 20:13:25.369525 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:25.369483 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7621e257-90f3-4f74-a511-c5bfd075ff99-metrics-tls\") pod \"dns-default-nwtd6\" (UID: \"7621e257-90f3-4f74-a511-c5bfd075ff99\") " pod="openshift-dns/dns-default-nwtd6" Apr 16 20:13:25.370067 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:25.369539 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2559e4d7-87c4-4654-a66a-cf29280da85b-cert\") pod \"ingress-canary-tngt6\" (UID: \"2559e4d7-87c4-4654-a66a-cf29280da85b\") " pod="openshift-ingress-canary/ingress-canary-tngt6" Apr 16 20:13:25.370067 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:13:25.369621 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:13:25.370067 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:13:25.369661 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:13:25.370067 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:13:25.369693 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7621e257-90f3-4f74-a511-c5bfd075ff99-metrics-tls podName:7621e257-90f3-4f74-a511-c5bfd075ff99 nodeName:}" failed. 
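The `pod_startup_latency_tracker` line above encodes how its two durations relate: podStartE2EDuration appears to be watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration additionally subtracts the image-pull window (lastFinishedPulling minus firstStartedPulling). That relationship can be checked against the logged wall-clock timestamps — a sketch, with nanoseconds truncated to microseconds since Python's `datetime` cannot hold them:

```python
from datetime import datetime

def parse_ts(s: str) -> datetime:
    # klog prints e.g. "2026-04-16 20:12:54.236084696 +0000 UTC";
    # drop the zone suffix and truncate the fraction to microseconds.
    s = s.replace(" +0000 UTC", "")
    if "." in s:
        head, frac = s.split(".")
        s = f"{head}.{frac[:6]}"
    return datetime.fromisoformat(s)

# Timestamps from the network-check-target-86kzx latency line above.
created    = parse_ts("2026-04-16 20:11:49 +0000 UTC")
first_pull = parse_ts("2026-04-16 20:12:54.236084696 +0000 UTC")
last_pull  = parse_ts("2026-04-16 20:12:56.91370063 +0000 UTC")
observed   = parse_ts("2026-04-16 20:12:57.485924435 +0000 UTC")

e2e = (observed - created).total_seconds()
slo = e2e - (last_pull - first_pull).total_seconds()
# Close to the logged 1m8.485924435s / 65.808308506s (ns were truncated).
print(f"E2E {e2e:.3f}s, SLO {slo:.3f}s")
```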
No retries permitted until 2026-04-16 20:14:29.369678194 +0000 UTC m=+160.806666684 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7621e257-90f3-4f74-a511-c5bfd075ff99-metrics-tls") pod "dns-default-nwtd6" (UID: "7621e257-90f3-4f74-a511-c5bfd075ff99") : secret "dns-default-metrics-tls" not found Apr 16 20:13:25.370067 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:13:25.369723 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2559e4d7-87c4-4654-a66a-cf29280da85b-cert podName:2559e4d7-87c4-4654-a66a-cf29280da85b nodeName:}" failed. No retries permitted until 2026-04-16 20:14:29.369707146 +0000 UTC m=+160.806695631 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2559e4d7-87c4-4654-a66a-cf29280da85b-cert") pod "ingress-canary-tngt6" (UID: "2559e4d7-87c4-4654-a66a-cf29280da85b") : secret "canary-serving-cert" not found Apr 16 20:13:28.473574 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:28.473537 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-86kzx" Apr 16 20:13:57.788620 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:57.788562 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8e680955-60f7-4aaf-9aeb-b5efc9759ed4-metrics-certs\") pod \"network-metrics-daemon-5bd8l\" (UID: \"8e680955-60f7-4aaf-9aeb-b5efc9759ed4\") " pod="openshift-multus/network-metrics-daemon-5bd8l" Apr 16 20:13:57.789125 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:13:57.788712 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 20:13:57.789125 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:13:57.788790 2572 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/8e680955-60f7-4aaf-9aeb-b5efc9759ed4-metrics-certs podName:8e680955-60f7-4aaf-9aeb-b5efc9759ed4 nodeName:}" failed. No retries permitted until 2026-04-16 20:15:59.788771868 +0000 UTC m=+251.225760373 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8e680955-60f7-4aaf-9aeb-b5efc9759ed4-metrics-certs") pod "network-metrics-daemon-5bd8l" (UID: "8e680955-60f7-4aaf-9aeb-b5efc9759ed4") : secret "metrics-daemon-secret" not found Apr 16 20:13:59.779789 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:59.779752 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-9gbgs"] Apr 16 20:13:59.782472 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:59.782457 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-9gbgs" Apr 16 20:13:59.784942 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:59.784916 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 16 20:13:59.785043 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:59.784961 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 16 20:13:59.785590 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:59.785575 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-xn4wk\"" Apr 16 20:13:59.791388 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:59.791369 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-9gbgs"] Apr 16 20:13:59.879285 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:59.879253 2572 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-vv4gp"] Apr 16 20:13:59.881843 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:59.881827 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vv4gp" Apr 16 20:13:59.885132 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:59.885109 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-sd4gs\"" Apr 16 20:13:59.887338 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:59.887316 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-9455768c8-dnlpz"] Apr 16 20:13:59.890006 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:59.889989 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-9455768c8-dnlpz" Apr 16 20:13:59.893991 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:59.893974 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 20:13:59.894192 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:59.894176 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 20:13:59.896195 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:59.896176 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 20:13:59.896384 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:59.896359 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-h42dm\"" Apr 16 20:13:59.900646 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:59.900625 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-vv4gp"] Apr 16 20:13:59.901612 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:59.901595 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 20:13:59.901769 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:59.901752 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8358b0b1-4c45-4118-bd03-a851a409b99e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9gbgs\" (UID: \"8358b0b1-4c45-4118-bd03-a851a409b99e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9gbgs" Apr 16 20:13:59.901823 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:59.901812 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8358b0b1-4c45-4118-bd03-a851a409b99e-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-9gbgs\" (UID: \"8358b0b1-4c45-4118-bd03-a851a409b99e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9gbgs" Apr 16 20:13:59.911149 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:59.911127 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-9455768c8-dnlpz"] Apr 16 20:13:59.980933 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:59.980904 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tbxfb"] Apr 16 20:13:59.983579 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:59.983563 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tbxfb" Apr 16 20:13:59.986402 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:59.986381 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-tc597"] Apr 16 20:13:59.986979 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:59.986958 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 16 20:13:59.987117 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:59.987094 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 16 20:13:59.987246 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:59.987215 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 16 20:13:59.988210 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:59.988189 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-jvvc5\"" Apr 16 20:13:59.988288 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:59.988250 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 16 20:13:59.989034 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:59.989018 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-n994t"] Apr 16 20:13:59.989165 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:59.989147 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tc597" Apr 16 20:13:59.990809 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:59.990793 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 16 20:13:59.992057 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:59.992040 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-n994t" Apr 16 20:13:59.992272 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:59.992131 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 16 20:13:59.992453 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:59.992289 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-nz5m9\"" Apr 16 20:13:59.992453 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:59.992354 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 20:13:59.992607 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:59.992374 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 20:13:59.992738 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:59.992718 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tbxfb"] Apr 16 20:13:59.994813 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:59.994796 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 16 20:13:59.994929 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:59.994835 2572 reflector.go:430] 
"Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-r89t9\"" Apr 16 20:13:59.995034 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:59.995021 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 16 20:13:59.995153 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:59.995137 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 16 20:13:59.995597 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:13:59.995577 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 16 20:14:00.002425 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.002401 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/da80589c-6efe-43a7-bd6f-c9394f610209-ca-trust-extracted\") pod \"image-registry-9455768c8-dnlpz\" (UID: \"da80589c-6efe-43a7-bd6f-c9394f610209\") " pod="openshift-image-registry/image-registry-9455768c8-dnlpz" Apr 16 20:14:00.002517 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.002465 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/da80589c-6efe-43a7-bd6f-c9394f610209-image-registry-private-configuration\") pod \"image-registry-9455768c8-dnlpz\" (UID: \"da80589c-6efe-43a7-bd6f-c9394f610209\") " pod="openshift-image-registry/image-registry-9455768c8-dnlpz" Apr 16 20:14:00.002572 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.002517 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da80589c-6efe-43a7-bd6f-c9394f610209-bound-sa-token\") pod \"image-registry-9455768c8-dnlpz\" (UID: \"da80589c-6efe-43a7-bd6f-c9394f610209\") " pod="openshift-image-registry/image-registry-9455768c8-dnlpz" Apr 16 20:14:00.002646 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.002629 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/da80589c-6efe-43a7-bd6f-c9394f610209-registry-tls\") pod \"image-registry-9455768c8-dnlpz\" (UID: \"da80589c-6efe-43a7-bd6f-c9394f610209\") " pod="openshift-image-registry/image-registry-9455768c8-dnlpz" Apr 16 20:14:00.002704 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.002661 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/da80589c-6efe-43a7-bd6f-c9394f610209-registry-certificates\") pod \"image-registry-9455768c8-dnlpz\" (UID: \"da80589c-6efe-43a7-bd6f-c9394f610209\") " pod="openshift-image-registry/image-registry-9455768c8-dnlpz" Apr 16 20:14:00.002761 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.002733 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8358b0b1-4c45-4118-bd03-a851a409b99e-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-9gbgs\" (UID: \"8358b0b1-4c45-4118-bd03-a851a409b99e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9gbgs" Apr 16 20:14:00.002810 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.002761 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/da80589c-6efe-43a7-bd6f-c9394f610209-trusted-ca\") pod \"image-registry-9455768c8-dnlpz\" (UID: \"da80589c-6efe-43a7-bd6f-c9394f610209\") " 
pod="openshift-image-registry/image-registry-9455768c8-dnlpz" Apr 16 20:14:00.002908 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.002881 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/da80589c-6efe-43a7-bd6f-c9394f610209-installation-pull-secrets\") pod \"image-registry-9455768c8-dnlpz\" (UID: \"da80589c-6efe-43a7-bd6f-c9394f610209\") " pod="openshift-image-registry/image-registry-9455768c8-dnlpz" Apr 16 20:14:00.002955 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.002940 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hprqt\" (UniqueName: \"kubernetes.io/projected/9c2c7589-5734-4bc8-8907-ab28995f2fbc-kube-api-access-hprqt\") pod \"network-check-source-8894fc9bd-vv4gp\" (UID: \"9c2c7589-5734-4bc8-8907-ab28995f2fbc\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vv4gp" Apr 16 20:14:00.003007 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.002971 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8358b0b1-4c45-4118-bd03-a851a409b99e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9gbgs\" (UID: \"8358b0b1-4c45-4118-bd03-a851a409b99e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9gbgs" Apr 16 20:14:00.003056 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.002995 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsrxw\" (UniqueName: \"kubernetes.io/projected/da80589c-6efe-43a7-bd6f-c9394f610209-kube-api-access-qsrxw\") pod \"image-registry-9455768c8-dnlpz\" (UID: \"da80589c-6efe-43a7-bd6f-c9394f610209\") " pod="openshift-image-registry/image-registry-9455768c8-dnlpz" Apr 16 20:14:00.003105 ip-10-0-138-62 kubenswrapper[2572]: 
E0416 20:14:00.003075 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 20:14:00.003152 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:00.003139 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8358b0b1-4c45-4118-bd03-a851a409b99e-networking-console-plugin-cert podName:8358b0b1-4c45-4118-bd03-a851a409b99e nodeName:}" failed. No retries permitted until 2026-04-16 20:14:00.50312129 +0000 UTC m=+131.940109777 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8358b0b1-4c45-4118-bd03-a851a409b99e-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-9gbgs" (UID: "8358b0b1-4c45-4118-bd03-a851a409b99e") : secret "networking-console-plugin-cert" not found Apr 16 20:14:00.003937 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.003920 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8358b0b1-4c45-4118-bd03-a851a409b99e-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-9gbgs\" (UID: \"8358b0b1-4c45-4118-bd03-a851a409b99e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9gbgs" Apr 16 20:14:00.007306 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.007288 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-tc597"] Apr 16 20:14:00.007912 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.007895 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-n994t"] Apr 16 20:14:00.103899 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.103811 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/da80589c-6efe-43a7-bd6f-c9394f610209-ca-trust-extracted\") pod \"image-registry-9455768c8-dnlpz\" (UID: \"da80589c-6efe-43a7-bd6f-c9394f610209\") " pod="openshift-image-registry/image-registry-9455768c8-dnlpz" Apr 16 20:14:00.103899 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.103858 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/da80589c-6efe-43a7-bd6f-c9394f610209-image-registry-private-configuration\") pod \"image-registry-9455768c8-dnlpz\" (UID: \"da80589c-6efe-43a7-bd6f-c9394f610209\") " pod="openshift-image-registry/image-registry-9455768c8-dnlpz" Apr 16 20:14:00.103899 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.103899 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da80589c-6efe-43a7-bd6f-c9394f610209-bound-sa-token\") pod \"image-registry-9455768c8-dnlpz\" (UID: \"da80589c-6efe-43a7-bd6f-c9394f610209\") " pod="openshift-image-registry/image-registry-9455768c8-dnlpz" Apr 16 20:14:00.104152 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.103922 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/280734e1-0b9c-4cc5-9274-f8058780a728-serving-cert\") pod \"service-ca-operator-d6fc45fc5-tbxfb\" (UID: \"280734e1-0b9c-4cc5-9274-f8058780a728\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tbxfb" Apr 16 20:14:00.104152 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.103945 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/da80589c-6efe-43a7-bd6f-c9394f610209-registry-tls\") pod \"image-registry-9455768c8-dnlpz\" (UID: \"da80589c-6efe-43a7-bd6f-c9394f610209\") " 
pod="openshift-image-registry/image-registry-9455768c8-dnlpz" Apr 16 20:14:00.104152 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.103967 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/da80589c-6efe-43a7-bd6f-c9394f610209-registry-certificates\") pod \"image-registry-9455768c8-dnlpz\" (UID: \"da80589c-6efe-43a7-bd6f-c9394f610209\") " pod="openshift-image-registry/image-registry-9455768c8-dnlpz" Apr 16 20:14:00.104152 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.104022 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bl4n\" (UniqueName: \"kubernetes.io/projected/611adbad-0f1f-4c66-8f17-17f5a789a903-kube-api-access-7bl4n\") pod \"cluster-monitoring-operator-75587bd455-tc597\" (UID: \"611adbad-0f1f-4c66-8f17-17f5a789a903\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tc597" Apr 16 20:14:00.104152 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:00.104102 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 20:14:00.104152 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:00.104124 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9455768c8-dnlpz: secret "image-registry-tls" not found Apr 16 20:14:00.104152 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.104117 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6b38707-68a6-4045-bbe5-f614a88439b1-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-n994t\" (UID: \"b6b38707-68a6-4045-bbe5-f614a88439b1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-n994t" Apr 16 20:14:00.104461 ip-10-0-138-62 
kubenswrapper[2572]: E0416 20:14:00.104185 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/da80589c-6efe-43a7-bd6f-c9394f610209-registry-tls podName:da80589c-6efe-43a7-bd6f-c9394f610209 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:00.604164242 +0000 UTC m=+132.041152738 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/da80589c-6efe-43a7-bd6f-c9394f610209-registry-tls") pod "image-registry-9455768c8-dnlpz" (UID: "da80589c-6efe-43a7-bd6f-c9394f610209") : secret "image-registry-tls" not found Apr 16 20:14:00.104461 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.104229 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/da80589c-6efe-43a7-bd6f-c9394f610209-ca-trust-extracted\") pod \"image-registry-9455768c8-dnlpz\" (UID: \"da80589c-6efe-43a7-bd6f-c9394f610209\") " pod="openshift-image-registry/image-registry-9455768c8-dnlpz" Apr 16 20:14:00.104461 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.104257 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/da80589c-6efe-43a7-bd6f-c9394f610209-trusted-ca\") pod \"image-registry-9455768c8-dnlpz\" (UID: \"da80589c-6efe-43a7-bd6f-c9394f610209\") " pod="openshift-image-registry/image-registry-9455768c8-dnlpz" Apr 16 20:14:00.104461 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.104288 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/da80589c-6efe-43a7-bd6f-c9394f610209-installation-pull-secrets\") pod \"image-registry-9455768c8-dnlpz\" (UID: \"da80589c-6efe-43a7-bd6f-c9394f610209\") " pod="openshift-image-registry/image-registry-9455768c8-dnlpz" Apr 16 20:14:00.104461 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.104357 
2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/280734e1-0b9c-4cc5-9274-f8058780a728-config\") pod \"service-ca-operator-d6fc45fc5-tbxfb\" (UID: \"280734e1-0b9c-4cc5-9274-f8058780a728\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tbxfb" Apr 16 20:14:00.104461 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.104385 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv6cb\" (UniqueName: \"kubernetes.io/projected/280734e1-0b9c-4cc5-9274-f8058780a728-kube-api-access-sv6cb\") pod \"service-ca-operator-d6fc45fc5-tbxfb\" (UID: \"280734e1-0b9c-4cc5-9274-f8058780a728\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tbxfb" Apr 16 20:14:00.104461 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.104410 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6b38707-68a6-4045-bbe5-f614a88439b1-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-n994t\" (UID: \"b6b38707-68a6-4045-bbe5-f614a88439b1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-n994t" Apr 16 20:14:00.104461 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.104444 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/da80589c-6efe-43a7-bd6f-c9394f610209-registry-certificates\") pod \"image-registry-9455768c8-dnlpz\" (UID: \"da80589c-6efe-43a7-bd6f-c9394f610209\") " pod="openshift-image-registry/image-registry-9455768c8-dnlpz" Apr 16 20:14:00.104804 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.104483 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hprqt\" (UniqueName: 
\"kubernetes.io/projected/9c2c7589-5734-4bc8-8907-ab28995f2fbc-kube-api-access-hprqt\") pod \"network-check-source-8894fc9bd-vv4gp\" (UID: \"9c2c7589-5734-4bc8-8907-ab28995f2fbc\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vv4gp" Apr 16 20:14:00.104804 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.104528 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99rc6\" (UniqueName: \"kubernetes.io/projected/b6b38707-68a6-4045-bbe5-f614a88439b1-kube-api-access-99rc6\") pod \"kube-storage-version-migrator-operator-6769c5d45-n994t\" (UID: \"b6b38707-68a6-4045-bbe5-f614a88439b1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-n994t" Apr 16 20:14:00.104804 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.104557 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/611adbad-0f1f-4c66-8f17-17f5a789a903-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-tc597\" (UID: \"611adbad-0f1f-4c66-8f17-17f5a789a903\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tc597" Apr 16 20:14:00.104804 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.104603 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qsrxw\" (UniqueName: \"kubernetes.io/projected/da80589c-6efe-43a7-bd6f-c9394f610209-kube-api-access-qsrxw\") pod \"image-registry-9455768c8-dnlpz\" (UID: \"da80589c-6efe-43a7-bd6f-c9394f610209\") " pod="openshift-image-registry/image-registry-9455768c8-dnlpz" Apr 16 20:14:00.104804 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.104630 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/611adbad-0f1f-4c66-8f17-17f5a789a903-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tc597\" (UID: \"611adbad-0f1f-4c66-8f17-17f5a789a903\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tc597" Apr 16 20:14:00.105109 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.105090 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/da80589c-6efe-43a7-bd6f-c9394f610209-trusted-ca\") pod \"image-registry-9455768c8-dnlpz\" (UID: \"da80589c-6efe-43a7-bd6f-c9394f610209\") " pod="openshift-image-registry/image-registry-9455768c8-dnlpz" Apr 16 20:14:00.106431 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.106414 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/da80589c-6efe-43a7-bd6f-c9394f610209-image-registry-private-configuration\") pod \"image-registry-9455768c8-dnlpz\" (UID: \"da80589c-6efe-43a7-bd6f-c9394f610209\") " pod="openshift-image-registry/image-registry-9455768c8-dnlpz" Apr 16 20:14:00.106503 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.106486 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/da80589c-6efe-43a7-bd6f-c9394f610209-installation-pull-secrets\") pod \"image-registry-9455768c8-dnlpz\" (UID: \"da80589c-6efe-43a7-bd6f-c9394f610209\") " pod="openshift-image-registry/image-registry-9455768c8-dnlpz" Apr 16 20:14:00.112970 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.112944 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hprqt\" (UniqueName: \"kubernetes.io/projected/9c2c7589-5734-4bc8-8907-ab28995f2fbc-kube-api-access-hprqt\") pod \"network-check-source-8894fc9bd-vv4gp\" (UID: \"9c2c7589-5734-4bc8-8907-ab28995f2fbc\") " 
pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vv4gp" Apr 16 20:14:00.113097 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.112983 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da80589c-6efe-43a7-bd6f-c9394f610209-bound-sa-token\") pod \"image-registry-9455768c8-dnlpz\" (UID: \"da80589c-6efe-43a7-bd6f-c9394f610209\") " pod="openshift-image-registry/image-registry-9455768c8-dnlpz" Apr 16 20:14:00.114796 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.114775 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsrxw\" (UniqueName: \"kubernetes.io/projected/da80589c-6efe-43a7-bd6f-c9394f610209-kube-api-access-qsrxw\") pod \"image-registry-9455768c8-dnlpz\" (UID: \"da80589c-6efe-43a7-bd6f-c9394f610209\") " pod="openshift-image-registry/image-registry-9455768c8-dnlpz" Apr 16 20:14:00.190131 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.190085 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vv4gp" Apr 16 20:14:00.205117 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.205088 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7bl4n\" (UniqueName: \"kubernetes.io/projected/611adbad-0f1f-4c66-8f17-17f5a789a903-kube-api-access-7bl4n\") pod \"cluster-monitoring-operator-75587bd455-tc597\" (UID: \"611adbad-0f1f-4c66-8f17-17f5a789a903\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tc597" Apr 16 20:14:00.205226 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.205137 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6b38707-68a6-4045-bbe5-f614a88439b1-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-n994t\" (UID: \"b6b38707-68a6-4045-bbe5-f614a88439b1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-n994t" Apr 16 20:14:00.205226 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.205191 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/280734e1-0b9c-4cc5-9274-f8058780a728-config\") pod \"service-ca-operator-d6fc45fc5-tbxfb\" (UID: \"280734e1-0b9c-4cc5-9274-f8058780a728\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tbxfb" Apr 16 20:14:00.205226 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.205218 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sv6cb\" (UniqueName: \"kubernetes.io/projected/280734e1-0b9c-4cc5-9274-f8058780a728-kube-api-access-sv6cb\") pod \"service-ca-operator-d6fc45fc5-tbxfb\" (UID: \"280734e1-0b9c-4cc5-9274-f8058780a728\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tbxfb" Apr 16 20:14:00.205381 ip-10-0-138-62 kubenswrapper[2572]: I0416 
20:14:00.205244 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6b38707-68a6-4045-bbe5-f614a88439b1-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-n994t\" (UID: \"b6b38707-68a6-4045-bbe5-f614a88439b1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-n994t" Apr 16 20:14:00.205381 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.205294 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-99rc6\" (UniqueName: \"kubernetes.io/projected/b6b38707-68a6-4045-bbe5-f614a88439b1-kube-api-access-99rc6\") pod \"kube-storage-version-migrator-operator-6769c5d45-n994t\" (UID: \"b6b38707-68a6-4045-bbe5-f614a88439b1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-n994t" Apr 16 20:14:00.205381 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.205321 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/611adbad-0f1f-4c66-8f17-17f5a789a903-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-tc597\" (UID: \"611adbad-0f1f-4c66-8f17-17f5a789a903\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tc597" Apr 16 20:14:00.205381 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.205363 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/611adbad-0f1f-4c66-8f17-17f5a789a903-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tc597\" (UID: \"611adbad-0f1f-4c66-8f17-17f5a789a903\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tc597" Apr 16 20:14:00.205571 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.205422 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/280734e1-0b9c-4cc5-9274-f8058780a728-serving-cert\") pod \"service-ca-operator-d6fc45fc5-tbxfb\" (UID: \"280734e1-0b9c-4cc5-9274-f8058780a728\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tbxfb" Apr 16 20:14:00.205571 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:00.205559 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 20:14:00.205666 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:00.205641 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/611adbad-0f1f-4c66-8f17-17f5a789a903-cluster-monitoring-operator-tls podName:611adbad-0f1f-4c66-8f17-17f5a789a903 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:00.705621012 +0000 UTC m=+132.142609512 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/611adbad-0f1f-4c66-8f17-17f5a789a903-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tc597" (UID: "611adbad-0f1f-4c66-8f17-17f5a789a903") : secret "cluster-monitoring-operator-tls" not found Apr 16 20:14:00.205773 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.205750 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6b38707-68a6-4045-bbe5-f614a88439b1-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-n994t\" (UID: \"b6b38707-68a6-4045-bbe5-f614a88439b1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-n994t" Apr 16 20:14:00.206070 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.206050 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: 
\"kubernetes.io/configmap/611adbad-0f1f-4c66-8f17-17f5a789a903-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-tc597\" (UID: \"611adbad-0f1f-4c66-8f17-17f5a789a903\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tc597" Apr 16 20:14:00.206315 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.206290 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/280734e1-0b9c-4cc5-9274-f8058780a728-config\") pod \"service-ca-operator-d6fc45fc5-tbxfb\" (UID: \"280734e1-0b9c-4cc5-9274-f8058780a728\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tbxfb" Apr 16 20:14:00.207449 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.207426 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6b38707-68a6-4045-bbe5-f614a88439b1-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-n994t\" (UID: \"b6b38707-68a6-4045-bbe5-f614a88439b1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-n994t" Apr 16 20:14:00.207580 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.207553 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/280734e1-0b9c-4cc5-9274-f8058780a728-serving-cert\") pod \"service-ca-operator-d6fc45fc5-tbxfb\" (UID: \"280734e1-0b9c-4cc5-9274-f8058780a728\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tbxfb" Apr 16 20:14:00.220472 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.220443 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-99rc6\" (UniqueName: \"kubernetes.io/projected/b6b38707-68a6-4045-bbe5-f614a88439b1-kube-api-access-99rc6\") pod \"kube-storage-version-migrator-operator-6769c5d45-n994t\" (UID: \"b6b38707-68a6-4045-bbe5-f614a88439b1\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-n994t" Apr 16 20:14:00.220928 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.220909 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv6cb\" (UniqueName: \"kubernetes.io/projected/280734e1-0b9c-4cc5-9274-f8058780a728-kube-api-access-sv6cb\") pod \"service-ca-operator-d6fc45fc5-tbxfb\" (UID: \"280734e1-0b9c-4cc5-9274-f8058780a728\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tbxfb" Apr 16 20:14:00.224741 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.224708 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bl4n\" (UniqueName: \"kubernetes.io/projected/611adbad-0f1f-4c66-8f17-17f5a789a903-kube-api-access-7bl4n\") pod \"cluster-monitoring-operator-75587bd455-tc597\" (UID: \"611adbad-0f1f-4c66-8f17-17f5a789a903\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tc597" Apr 16 20:14:00.294409 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.294381 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tbxfb"
Apr 16 20:14:00.302654 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.302627 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-vv4gp"]
Apr 16 20:14:00.305365 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:14:00.305341 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c2c7589_5734_4bc8_8907_ab28995f2fbc.slice/crio-0b43fc5584fe47ea98b9bf25e99b7b30dca33812f82193d91df0d72663e561be WatchSource:0}: Error finding container 0b43fc5584fe47ea98b9bf25e99b7b30dca33812f82193d91df0d72663e561be: Status 404 returned error can't find the container with id 0b43fc5584fe47ea98b9bf25e99b7b30dca33812f82193d91df0d72663e561be
Apr 16 20:14:00.305449 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.305413 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-n994t"
Apr 16 20:14:00.425266 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.425238 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tbxfb"]
Apr 16 20:14:00.427772 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:14:00.427747 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod280734e1_0b9c_4cc5_9274_f8058780a728.slice/crio-1852a21841eb186e560b74e2336f5289bbc946c02fa0dd0e6b1a90f989382324 WatchSource:0}: Error finding container 1852a21841eb186e560b74e2336f5289bbc946c02fa0dd0e6b1a90f989382324: Status 404 returned error can't find the container with id 1852a21841eb186e560b74e2336f5289bbc946c02fa0dd0e6b1a90f989382324
Apr 16 20:14:00.436820 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.436798 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-n994t"]
Apr 16 20:14:00.440050 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:14:00.440022 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6b38707_68a6_4045_bbe5_f614a88439b1.slice/crio-01c593bf42733dba9978bbed2c06d272ca383ae01c1ae97ffa19132ce9fe53bb WatchSource:0}: Error finding container 01c593bf42733dba9978bbed2c06d272ca383ae01c1ae97ffa19132ce9fe53bb: Status 404 returned error can't find the container with id 01c593bf42733dba9978bbed2c06d272ca383ae01c1ae97ffa19132ce9fe53bb
Apr 16 20:14:00.508122 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.508096 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8358b0b1-4c45-4118-bd03-a851a409b99e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9gbgs\" (UID: \"8358b0b1-4c45-4118-bd03-a851a409b99e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9gbgs"
Apr 16 20:14:00.508274 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:00.508253 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 20:14:00.508364 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:00.508353 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8358b0b1-4c45-4118-bd03-a851a409b99e-networking-console-plugin-cert podName:8358b0b1-4c45-4118-bd03-a851a409b99e nodeName:}" failed. No retries permitted until 2026-04-16 20:14:01.508329716 +0000 UTC m=+132.945318219 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8358b0b1-4c45-4118-bd03-a851a409b99e-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-9gbgs" (UID: "8358b0b1-4c45-4118-bd03-a851a409b99e") : secret "networking-console-plugin-cert" not found
Apr 16 20:14:00.588057 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.588028 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-n994t" event={"ID":"b6b38707-68a6-4045-bbe5-f614a88439b1","Type":"ContainerStarted","Data":"01c593bf42733dba9978bbed2c06d272ca383ae01c1ae97ffa19132ce9fe53bb"}
Apr 16 20:14:00.589011 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.588990 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tbxfb" event={"ID":"280734e1-0b9c-4cc5-9274-f8058780a728","Type":"ContainerStarted","Data":"1852a21841eb186e560b74e2336f5289bbc946c02fa0dd0e6b1a90f989382324"}
Apr 16 20:14:00.590118 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.590097 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vv4gp" event={"ID":"9c2c7589-5734-4bc8-8907-ab28995f2fbc","Type":"ContainerStarted","Data":"fd046e69c89fcdb3366b9069e308389ec7856fb81cee92e5fd37e510ac5ad616"}
Apr 16 20:14:00.590178 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.590127 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vv4gp" event={"ID":"9c2c7589-5734-4bc8-8907-ab28995f2fbc","Type":"ContainerStarted","Data":"0b43fc5584fe47ea98b9bf25e99b7b30dca33812f82193d91df0d72663e561be"}
Apr 16 20:14:00.608261 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.608191 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vv4gp" podStartSLOduration=1.6081764509999998 podStartE2EDuration="1.608176451s" podCreationTimestamp="2026-04-16 20:13:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:14:00.607971887 +0000 UTC m=+132.044960421" watchObservedRunningTime="2026-04-16 20:14:00.608176451 +0000 UTC m=+132.045164960"
Apr 16 20:14:00.609156 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.609135 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/da80589c-6efe-43a7-bd6f-c9394f610209-registry-tls\") pod \"image-registry-9455768c8-dnlpz\" (UID: \"da80589c-6efe-43a7-bd6f-c9394f610209\") " pod="openshift-image-registry/image-registry-9455768c8-dnlpz"
Apr 16 20:14:00.609282 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:00.609266 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 20:14:00.609325 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:00.609284 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9455768c8-dnlpz: secret "image-registry-tls" not found
Apr 16 20:14:00.609356 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:00.609325 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/da80589c-6efe-43a7-bd6f-c9394f610209-registry-tls podName:da80589c-6efe-43a7-bd6f-c9394f610209 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:01.609314203 +0000 UTC m=+133.046302689 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/da80589c-6efe-43a7-bd6f-c9394f610209-registry-tls") pod "image-registry-9455768c8-dnlpz" (UID: "da80589c-6efe-43a7-bd6f-c9394f610209") : secret "image-registry-tls" not found
Apr 16 20:14:00.709781 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:00.709754 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/611adbad-0f1f-4c66-8f17-17f5a789a903-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tc597\" (UID: \"611adbad-0f1f-4c66-8f17-17f5a789a903\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tc597"
Apr 16 20:14:00.709932 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:00.709917 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 20:14:00.709993 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:00.709982 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/611adbad-0f1f-4c66-8f17-17f5a789a903-cluster-monitoring-operator-tls podName:611adbad-0f1f-4c66-8f17-17f5a789a903 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:01.709964726 +0000 UTC m=+133.146953218 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/611adbad-0f1f-4c66-8f17-17f5a789a903-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tc597" (UID: "611adbad-0f1f-4c66-8f17-17f5a789a903") : secret "cluster-monitoring-operator-tls" not found
Apr 16 20:14:01.517104 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:01.517056 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8358b0b1-4c45-4118-bd03-a851a409b99e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9gbgs\" (UID: \"8358b0b1-4c45-4118-bd03-a851a409b99e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9gbgs"
Apr 16 20:14:01.517547 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:01.517225 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 20:14:01.517547 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:01.517306 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8358b0b1-4c45-4118-bd03-a851a409b99e-networking-console-plugin-cert podName:8358b0b1-4c45-4118-bd03-a851a409b99e nodeName:}" failed. No retries permitted until 2026-04-16 20:14:03.517285656 +0000 UTC m=+134.954274157 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8358b0b1-4c45-4118-bd03-a851a409b99e-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-9gbgs" (UID: "8358b0b1-4c45-4118-bd03-a851a409b99e") : secret "networking-console-plugin-cert" not found
Apr 16 20:14:01.618243 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:01.618210 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/da80589c-6efe-43a7-bd6f-c9394f610209-registry-tls\") pod \"image-registry-9455768c8-dnlpz\" (UID: \"da80589c-6efe-43a7-bd6f-c9394f610209\") " pod="openshift-image-registry/image-registry-9455768c8-dnlpz"
Apr 16 20:14:01.618409 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:01.618371 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 20:14:01.618409 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:01.618394 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9455768c8-dnlpz: secret "image-registry-tls" not found
Apr 16 20:14:01.618503 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:01.618455 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/da80589c-6efe-43a7-bd6f-c9394f610209-registry-tls podName:da80589c-6efe-43a7-bd6f-c9394f610209 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:03.618436784 +0000 UTC m=+135.055425286 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/da80589c-6efe-43a7-bd6f-c9394f610209-registry-tls") pod "image-registry-9455768c8-dnlpz" (UID: "da80589c-6efe-43a7-bd6f-c9394f610209") : secret "image-registry-tls" not found
Apr 16 20:14:01.719400 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:01.719363 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/611adbad-0f1f-4c66-8f17-17f5a789a903-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tc597\" (UID: \"611adbad-0f1f-4c66-8f17-17f5a789a903\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tc597"
Apr 16 20:14:01.719587 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:01.719566 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 20:14:01.719668 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:01.719651 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/611adbad-0f1f-4c66-8f17-17f5a789a903-cluster-monitoring-operator-tls podName:611adbad-0f1f-4c66-8f17-17f5a789a903 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:03.71962968 +0000 UTC m=+135.156618179 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/611adbad-0f1f-4c66-8f17-17f5a789a903-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tc597" (UID: "611adbad-0f1f-4c66-8f17-17f5a789a903") : secret "cluster-monitoring-operator-tls" not found
Apr 16 20:14:03.535231 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:03.535192 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8358b0b1-4c45-4118-bd03-a851a409b99e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9gbgs\" (UID: \"8358b0b1-4c45-4118-bd03-a851a409b99e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9gbgs"
Apr 16 20:14:03.535810 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:03.535368 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 20:14:03.535810 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:03.535449 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8358b0b1-4c45-4118-bd03-a851a409b99e-networking-console-plugin-cert podName:8358b0b1-4c45-4118-bd03-a851a409b99e nodeName:}" failed. No retries permitted until 2026-04-16 20:14:07.53542961 +0000 UTC m=+138.972418121 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8358b0b1-4c45-4118-bd03-a851a409b99e-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-9gbgs" (UID: "8358b0b1-4c45-4118-bd03-a851a409b99e") : secret "networking-console-plugin-cert" not found
Apr 16 20:14:03.598222 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:03.598190 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-n994t" event={"ID":"b6b38707-68a6-4045-bbe5-f614a88439b1","Type":"ContainerStarted","Data":"6d85974e2faa052248bd38e049bb96138ed9d00ff39a441c69a65c229a96316e"}
Apr 16 20:14:03.599536 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:03.599512 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tbxfb" event={"ID":"280734e1-0b9c-4cc5-9274-f8058780a728","Type":"ContainerStarted","Data":"69581c307ba438089ff13af1d10e7ed24f7585743625fb82bdbc948bea45e9c4"}
Apr 16 20:14:03.636046 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:03.636020 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/da80589c-6efe-43a7-bd6f-c9394f610209-registry-tls\") pod \"image-registry-9455768c8-dnlpz\" (UID: \"da80589c-6efe-43a7-bd6f-c9394f610209\") " pod="openshift-image-registry/image-registry-9455768c8-dnlpz"
Apr 16 20:14:03.636147 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:03.636134 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 20:14:03.636188 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:03.636148 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9455768c8-dnlpz: secret "image-registry-tls" not found
Apr 16 20:14:03.636221 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:03.636201 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/da80589c-6efe-43a7-bd6f-c9394f610209-registry-tls podName:da80589c-6efe-43a7-bd6f-c9394f610209 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:07.636184558 +0000 UTC m=+139.073173049 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/da80589c-6efe-43a7-bd6f-c9394f610209-registry-tls") pod "image-registry-9455768c8-dnlpz" (UID: "da80589c-6efe-43a7-bd6f-c9394f610209") : secret "image-registry-tls" not found
Apr 16 20:14:03.676845 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:03.672548 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-n994t" podStartSLOduration=2.415595759 podStartE2EDuration="4.672533663s" podCreationTimestamp="2026-04-16 20:13:59 +0000 UTC" firstStartedPulling="2026-04-16 20:14:00.4417638 +0000 UTC m=+131.878752286" lastFinishedPulling="2026-04-16 20:14:02.698701688 +0000 UTC m=+134.135690190" observedRunningTime="2026-04-16 20:14:03.635950235 +0000 UTC m=+135.072938743" watchObservedRunningTime="2026-04-16 20:14:03.672533663 +0000 UTC m=+135.109522172"
Apr 16 20:14:03.736691 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:03.736656 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/611adbad-0f1f-4c66-8f17-17f5a789a903-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tc597\" (UID: \"611adbad-0f1f-4c66-8f17-17f5a789a903\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tc597"
Apr 16 20:14:03.736827 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:03.736777 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 20:14:03.736910 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:03.736836 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/611adbad-0f1f-4c66-8f17-17f5a789a903-cluster-monitoring-operator-tls podName:611adbad-0f1f-4c66-8f17-17f5a789a903 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:07.736817625 +0000 UTC m=+139.173806110 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/611adbad-0f1f-4c66-8f17-17f5a789a903-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tc597" (UID: "611adbad-0f1f-4c66-8f17-17f5a789a903") : secret "cluster-monitoring-operator-tls" not found
Apr 16 20:14:06.779574 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:06.779548 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-d277z_8072a527-0c85-4b4a-a30d-ee0ca50bec0a/dns-node-resolver/0.log"
Apr 16 20:14:07.108344 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:07.108244 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tbxfb" podStartSLOduration=5.836039154 podStartE2EDuration="8.108228522s" podCreationTimestamp="2026-04-16 20:13:59 +0000 UTC" firstStartedPulling="2026-04-16 20:14:00.429629696 +0000 UTC m=+131.866618182" lastFinishedPulling="2026-04-16 20:14:02.701819052 +0000 UTC m=+134.138807550" observedRunningTime="2026-04-16 20:14:03.678103584 +0000 UTC m=+135.115092092" watchObservedRunningTime="2026-04-16 20:14:07.108228522 +0000 UTC m=+138.545217033"
Apr 16 20:14:07.108635 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:07.108620 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-pk9rz"]
Apr 16 20:14:07.112506 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:07.112488 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-pk9rz"
Apr 16 20:14:07.119210 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:07.119190 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 16 20:14:07.119302 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:07.119237 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 16 20:14:07.119965 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:07.119936 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 16 20:14:07.120060 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:07.120010 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 16 20:14:07.120311 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:07.120289 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-m6jjp\""
Apr 16 20:14:07.121443 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:07.121420 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-pk9rz"]
Apr 16 20:14:07.264653 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:07.264619 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/30932d52-867e-4990-b961-c1b0a589d6ff-signing-key\") pod \"service-ca-865cb79987-pk9rz\" (UID: \"30932d52-867e-4990-b961-c1b0a589d6ff\") " pod="openshift-service-ca/service-ca-865cb79987-pk9rz"
Apr 16 20:14:07.264653 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:07.264652 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5g4n\" (UniqueName: \"kubernetes.io/projected/30932d52-867e-4990-b961-c1b0a589d6ff-kube-api-access-j5g4n\") pod \"service-ca-865cb79987-pk9rz\" (UID: \"30932d52-867e-4990-b961-c1b0a589d6ff\") " pod="openshift-service-ca/service-ca-865cb79987-pk9rz"
Apr 16 20:14:07.264829 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:07.264670 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/30932d52-867e-4990-b961-c1b0a589d6ff-signing-cabundle\") pod \"service-ca-865cb79987-pk9rz\" (UID: \"30932d52-867e-4990-b961-c1b0a589d6ff\") " pod="openshift-service-ca/service-ca-865cb79987-pk9rz"
Apr 16 20:14:07.366142 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:07.366068 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/30932d52-867e-4990-b961-c1b0a589d6ff-signing-key\") pod \"service-ca-865cb79987-pk9rz\" (UID: \"30932d52-867e-4990-b961-c1b0a589d6ff\") " pod="openshift-service-ca/service-ca-865cb79987-pk9rz"
Apr 16 20:14:07.366142 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:07.366101 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j5g4n\" (UniqueName: \"kubernetes.io/projected/30932d52-867e-4990-b961-c1b0a589d6ff-kube-api-access-j5g4n\") pod \"service-ca-865cb79987-pk9rz\" (UID: \"30932d52-867e-4990-b961-c1b0a589d6ff\") " pod="openshift-service-ca/service-ca-865cb79987-pk9rz"
Apr 16 20:14:07.366142 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:07.366121 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/30932d52-867e-4990-b961-c1b0a589d6ff-signing-cabundle\") pod \"service-ca-865cb79987-pk9rz\" (UID: \"30932d52-867e-4990-b961-c1b0a589d6ff\") " pod="openshift-service-ca/service-ca-865cb79987-pk9rz"
Apr 16 20:14:07.367183 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:07.367165 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/30932d52-867e-4990-b961-c1b0a589d6ff-signing-cabundle\") pod \"service-ca-865cb79987-pk9rz\" (UID: \"30932d52-867e-4990-b961-c1b0a589d6ff\") " pod="openshift-service-ca/service-ca-865cb79987-pk9rz"
Apr 16 20:14:07.368493 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:07.368473 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/30932d52-867e-4990-b961-c1b0a589d6ff-signing-key\") pod \"service-ca-865cb79987-pk9rz\" (UID: \"30932d52-867e-4990-b961-c1b0a589d6ff\") " pod="openshift-service-ca/service-ca-865cb79987-pk9rz"
Apr 16 20:14:07.374630 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:07.374610 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5g4n\" (UniqueName: \"kubernetes.io/projected/30932d52-867e-4990-b961-c1b0a589d6ff-kube-api-access-j5g4n\") pod \"service-ca-865cb79987-pk9rz\" (UID: \"30932d52-867e-4990-b961-c1b0a589d6ff\") " pod="openshift-service-ca/service-ca-865cb79987-pk9rz"
Apr 16 20:14:07.420861 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:07.420841 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-pk9rz"
Apr 16 20:14:07.532022 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:07.531995 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-pk9rz"]
Apr 16 20:14:07.534831 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:14:07.534802 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30932d52_867e_4990_b961_c1b0a589d6ff.slice/crio-5ca0f152f5ceae46daac2fb08d862666297191dbd79530bae1100a72b109ab41 WatchSource:0}: Error finding container 5ca0f152f5ceae46daac2fb08d862666297191dbd79530bae1100a72b109ab41: Status 404 returned error can't find the container with id 5ca0f152f5ceae46daac2fb08d862666297191dbd79530bae1100a72b109ab41
Apr 16 20:14:07.567825 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:07.567779 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8358b0b1-4c45-4118-bd03-a851a409b99e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9gbgs\" (UID: \"8358b0b1-4c45-4118-bd03-a851a409b99e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9gbgs"
Apr 16 20:14:07.567944 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:07.567927 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 20:14:07.567993 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:07.567982 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8358b0b1-4c45-4118-bd03-a851a409b99e-networking-console-plugin-cert podName:8358b0b1-4c45-4118-bd03-a851a409b99e nodeName:}" failed. No retries permitted until 2026-04-16 20:14:15.567968562 +0000 UTC m=+147.004957051 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8358b0b1-4c45-4118-bd03-a851a409b99e-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-9gbgs" (UID: "8358b0b1-4c45-4118-bd03-a851a409b99e") : secret "networking-console-plugin-cert" not found
Apr 16 20:14:07.609445 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:07.609343 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-pk9rz" event={"ID":"30932d52-867e-4990-b961-c1b0a589d6ff","Type":"ContainerStarted","Data":"29d2b6d76da6550a980bf29096d7efa3dfa27d659ee3b145b35b6f81ed0508f7"}
Apr 16 20:14:07.609445 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:07.609383 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-pk9rz" event={"ID":"30932d52-867e-4990-b961-c1b0a589d6ff","Type":"ContainerStarted","Data":"5ca0f152f5ceae46daac2fb08d862666297191dbd79530bae1100a72b109ab41"}
Apr 16 20:14:07.627528 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:07.627187 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-pk9rz" podStartSLOduration=0.627169237 podStartE2EDuration="627.169237ms" podCreationTimestamp="2026-04-16 20:14:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:14:07.626377453 +0000 UTC m=+139.063365960" watchObservedRunningTime="2026-04-16 20:14:07.627169237 +0000 UTC m=+139.064157746"
Apr 16 20:14:07.668953 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:07.668922 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/da80589c-6efe-43a7-bd6f-c9394f610209-registry-tls\") pod \"image-registry-9455768c8-dnlpz\" (UID: \"da80589c-6efe-43a7-bd6f-c9394f610209\") " pod="openshift-image-registry/image-registry-9455768c8-dnlpz"
Apr 16 20:14:07.669186 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:07.669165 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 20:14:07.669186 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:07.669189 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9455768c8-dnlpz: secret "image-registry-tls" not found
Apr 16 20:14:07.669280 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:07.669249 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/da80589c-6efe-43a7-bd6f-c9394f610209-registry-tls podName:da80589c-6efe-43a7-bd6f-c9394f610209 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:15.669229841 +0000 UTC m=+147.106218343 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/da80589c-6efe-43a7-bd6f-c9394f610209-registry-tls") pod "image-registry-9455768c8-dnlpz" (UID: "da80589c-6efe-43a7-bd6f-c9394f610209") : secret "image-registry-tls" not found
Apr 16 20:14:07.769830 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:07.769800 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/611adbad-0f1f-4c66-8f17-17f5a789a903-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tc597\" (UID: \"611adbad-0f1f-4c66-8f17-17f5a789a903\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tc597"
Apr 16 20:14:07.769971 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:07.769950 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 20:14:07.770030 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:07.770020 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/611adbad-0f1f-4c66-8f17-17f5a789a903-cluster-monitoring-operator-tls podName:611adbad-0f1f-4c66-8f17-17f5a789a903 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:15.770006022 +0000 UTC m=+147.206994508 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/611adbad-0f1f-4c66-8f17-17f5a789a903-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tc597" (UID: "611adbad-0f1f-4c66-8f17-17f5a789a903") : secret "cluster-monitoring-operator-tls" not found
Apr 16 20:14:07.785166 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:07.785141 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4fgxn_c470127f-e9ca-44ba-bcef-cc2cd68cdcdc/node-ca/0.log"
Apr 16 20:14:09.586834 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:09.586804 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-n994t_b6b38707-68a6-4045-bbe5-f614a88439b1/kube-storage-version-migrator-operator/0.log"
Apr 16 20:14:15.633041 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:15.633005 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8358b0b1-4c45-4118-bd03-a851a409b99e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9gbgs\" (UID: \"8358b0b1-4c45-4118-bd03-a851a409b99e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9gbgs"
Apr 16 20:14:15.633458 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:15.633164 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 20:14:15.633458 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:15.633239 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8358b0b1-4c45-4118-bd03-a851a409b99e-networking-console-plugin-cert podName:8358b0b1-4c45-4118-bd03-a851a409b99e nodeName:}" failed. No retries permitted until 2026-04-16 20:14:31.633221872 +0000 UTC m=+163.070210362 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8358b0b1-4c45-4118-bd03-a851a409b99e-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-9gbgs" (UID: "8358b0b1-4c45-4118-bd03-a851a409b99e") : secret "networking-console-plugin-cert" not found
Apr 16 20:14:15.733421 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:15.733391 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/da80589c-6efe-43a7-bd6f-c9394f610209-registry-tls\") pod \"image-registry-9455768c8-dnlpz\" (UID: \"da80589c-6efe-43a7-bd6f-c9394f610209\") " pod="openshift-image-registry/image-registry-9455768c8-dnlpz"
Apr 16 20:14:15.735662 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:15.735633 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/da80589c-6efe-43a7-bd6f-c9394f610209-registry-tls\") pod \"image-registry-9455768c8-dnlpz\" (UID: \"da80589c-6efe-43a7-bd6f-c9394f610209\") " pod="openshift-image-registry/image-registry-9455768c8-dnlpz"
Apr 16 20:14:15.797658 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:15.797630 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-9455768c8-dnlpz"
Apr 16 20:14:15.833863 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:15.833819 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/611adbad-0f1f-4c66-8f17-17f5a789a903-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tc597\" (UID: \"611adbad-0f1f-4c66-8f17-17f5a789a903\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tc597"
Apr 16 20:14:15.834018 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:15.833974 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 20:14:15.834095 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:15.834083 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/611adbad-0f1f-4c66-8f17-17f5a789a903-cluster-monitoring-operator-tls podName:611adbad-0f1f-4c66-8f17-17f5a789a903 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:31.834062364 +0000 UTC m=+163.271050863 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/611adbad-0f1f-4c66-8f17-17f5a789a903-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tc597" (UID: "611adbad-0f1f-4c66-8f17-17f5a789a903") : secret "cluster-monitoring-operator-tls" not found Apr 16 20:14:15.915715 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:15.915646 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-9455768c8-dnlpz"] Apr 16 20:14:15.918749 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:14:15.918710 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda80589c_6efe_43a7_bd6f_c9394f610209.slice/crio-cfc07d1fd8c41fcfbd0c6b003953324e5d454b4ca9e46768ee1dfa32e7cf6a3e WatchSource:0}: Error finding container cfc07d1fd8c41fcfbd0c6b003953324e5d454b4ca9e46768ee1dfa32e7cf6a3e: Status 404 returned error can't find the container with id cfc07d1fd8c41fcfbd0c6b003953324e5d454b4ca9e46768ee1dfa32e7cf6a3e Apr 16 20:14:16.637885 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:16.637850 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-9455768c8-dnlpz" event={"ID":"da80589c-6efe-43a7-bd6f-c9394f610209","Type":"ContainerStarted","Data":"bec66e737aedb07fe4df8a1d77f4fd35c3982e89cc1c2d8ae8691c50906cd823"} Apr 16 20:14:16.638253 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:16.637896 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-9455768c8-dnlpz" event={"ID":"da80589c-6efe-43a7-bd6f-c9394f610209","Type":"ContainerStarted","Data":"cfc07d1fd8c41fcfbd0c6b003953324e5d454b4ca9e46768ee1dfa32e7cf6a3e"} Apr 16 20:14:16.638253 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:16.637989 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-image-registry/image-registry-9455768c8-dnlpz" Apr 16 20:14:16.656784 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:16.656740 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-9455768c8-dnlpz" podStartSLOduration=17.656725086 podStartE2EDuration="17.656725086s" podCreationTimestamp="2026-04-16 20:13:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:14:16.656009994 +0000 UTC m=+148.092998513" watchObservedRunningTime="2026-04-16 20:14:16.656725086 +0000 UTC m=+148.093713593" Apr 16 20:14:24.518893 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:24.518832 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-nwtd6" podUID="7621e257-90f3-4f74-a511-c5bfd075ff99" Apr 16 20:14:24.531993 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:24.531960 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-tngt6" podUID="2559e4d7-87c4-4654-a66a-cf29280da85b" Apr 16 20:14:24.657576 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:24.657550 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-nwtd6" Apr 16 20:14:26.154525 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:26.154483 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-5bd8l" podUID="8e680955-60f7-4aaf-9aeb-b5efc9759ed4" Apr 16 20:14:29.437622 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.437585 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2559e4d7-87c4-4654-a66a-cf29280da85b-cert\") pod \"ingress-canary-tngt6\" (UID: \"2559e4d7-87c4-4654-a66a-cf29280da85b\") " pod="openshift-ingress-canary/ingress-canary-tngt6" Apr 16 20:14:29.438205 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.437657 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7621e257-90f3-4f74-a511-c5bfd075ff99-metrics-tls\") pod \"dns-default-nwtd6\" (UID: \"7621e257-90f3-4f74-a511-c5bfd075ff99\") " pod="openshift-dns/dns-default-nwtd6" Apr 16 20:14:29.440027 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.439991 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7621e257-90f3-4f74-a511-c5bfd075ff99-metrics-tls\") pod \"dns-default-nwtd6\" (UID: \"7621e257-90f3-4f74-a511-c5bfd075ff99\") " pod="openshift-dns/dns-default-nwtd6" Apr 16 20:14:29.440174 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.440053 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2559e4d7-87c4-4654-a66a-cf29280da85b-cert\") pod \"ingress-canary-tngt6\" (UID: \"2559e4d7-87c4-4654-a66a-cf29280da85b\") " pod="openshift-ingress-canary/ingress-canary-tngt6" Apr 16 20:14:29.463320 ip-10-0-138-62 kubenswrapper[2572]: I0416 
20:14:29.463295 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-gq66p\"" Apr 16 20:14:29.468071 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.468054 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-nwtd6" Apr 16 20:14:29.595584 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.595555 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nwtd6"] Apr 16 20:14:29.599071 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:14:29.599035 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7621e257_90f3_4f74_a511_c5bfd075ff99.slice/crio-6b67336f6c649a7eb9b23eef5defe58d6612bee9d486cff912c0288b5b744713 WatchSource:0}: Error finding container 6b67336f6c649a7eb9b23eef5defe58d6612bee9d486cff912c0288b5b744713: Status 404 returned error can't find the container with id 6b67336f6c649a7eb9b23eef5defe58d6612bee9d486cff912c0288b5b744713 Apr 16 20:14:29.609889 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.609846 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-84qmv"] Apr 16 20:14:29.613983 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.613968 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-84qmv" Apr 16 20:14:29.617613 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.617595 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 20:14:29.617850 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.617833 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-gwklr\"" Apr 16 20:14:29.618089 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.618076 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 20:14:29.618627 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.618611 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 20:14:29.618696 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.618680 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 20:14:29.629827 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.629797 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-84qmv"] Apr 16 20:14:29.665088 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.665068 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-9455768c8-dnlpz"] Apr 16 20:14:29.669474 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.669449 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nwtd6" event={"ID":"7621e257-90f3-4f74-a511-c5bfd075ff99","Type":"ContainerStarted","Data":"6b67336f6c649a7eb9b23eef5defe58d6612bee9d486cff912c0288b5b744713"} Apr 16 20:14:29.703075 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.703016 2572 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["openshift-image-registry/image-registry-6676cf5bc-962mq"] Apr 16 20:14:29.705771 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.705755 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6676cf5bc-962mq" Apr 16 20:14:29.718788 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.718770 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6676cf5bc-962mq"] Apr 16 20:14:29.739235 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.739215 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b2a87320-5da2-4e40-93d7-fffcb5c0c165-data-volume\") pod \"insights-runtime-extractor-84qmv\" (UID: \"b2a87320-5da2-4e40-93d7-fffcb5c0c165\") " pod="openshift-insights/insights-runtime-extractor-84qmv" Apr 16 20:14:29.739334 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.739242 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b2a87320-5da2-4e40-93d7-fffcb5c0c165-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-84qmv\" (UID: \"b2a87320-5da2-4e40-93d7-fffcb5c0c165\") " pod="openshift-insights/insights-runtime-extractor-84qmv" Apr 16 20:14:29.739334 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.739274 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tcvr\" (UniqueName: \"kubernetes.io/projected/b2a87320-5da2-4e40-93d7-fffcb5c0c165-kube-api-access-4tcvr\") pod \"insights-runtime-extractor-84qmv\" (UID: \"b2a87320-5da2-4e40-93d7-fffcb5c0c165\") " pod="openshift-insights/insights-runtime-extractor-84qmv" Apr 16 20:14:29.739334 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.739297 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b2a87320-5da2-4e40-93d7-fffcb5c0c165-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-84qmv\" (UID: \"b2a87320-5da2-4e40-93d7-fffcb5c0c165\") " pod="openshift-insights/insights-runtime-extractor-84qmv" Apr 16 20:14:29.739444 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.739382 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b2a87320-5da2-4e40-93d7-fffcb5c0c165-crio-socket\") pod \"insights-runtime-extractor-84qmv\" (UID: \"b2a87320-5da2-4e40-93d7-fffcb5c0c165\") " pod="openshift-insights/insights-runtime-extractor-84qmv" Apr 16 20:14:29.839897 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.839847 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4tcvr\" (UniqueName: \"kubernetes.io/projected/b2a87320-5da2-4e40-93d7-fffcb5c0c165-kube-api-access-4tcvr\") pod \"insights-runtime-extractor-84qmv\" (UID: \"b2a87320-5da2-4e40-93d7-fffcb5c0c165\") " pod="openshift-insights/insights-runtime-extractor-84qmv" Apr 16 20:14:29.840067 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.839902 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-594ln\" (UniqueName: \"kubernetes.io/projected/93ae2cda-12dd-4710-b8f7-478ee5f42cf1-kube-api-access-594ln\") pod \"image-registry-6676cf5bc-962mq\" (UID: \"93ae2cda-12dd-4710-b8f7-478ee5f42cf1\") " pod="openshift-image-registry/image-registry-6676cf5bc-962mq" Apr 16 20:14:29.840067 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.839936 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b2a87320-5da2-4e40-93d7-fffcb5c0c165-insights-runtime-extractor-tls\") 
pod \"insights-runtime-extractor-84qmv\" (UID: \"b2a87320-5da2-4e40-93d7-fffcb5c0c165\") " pod="openshift-insights/insights-runtime-extractor-84qmv" Apr 16 20:14:29.840067 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.839954 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/93ae2cda-12dd-4710-b8f7-478ee5f42cf1-installation-pull-secrets\") pod \"image-registry-6676cf5bc-962mq\" (UID: \"93ae2cda-12dd-4710-b8f7-478ee5f42cf1\") " pod="openshift-image-registry/image-registry-6676cf5bc-962mq" Apr 16 20:14:29.840067 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.839983 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/93ae2cda-12dd-4710-b8f7-478ee5f42cf1-image-registry-private-configuration\") pod \"image-registry-6676cf5bc-962mq\" (UID: \"93ae2cda-12dd-4710-b8f7-478ee5f42cf1\") " pod="openshift-image-registry/image-registry-6676cf5bc-962mq" Apr 16 20:14:29.840067 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.840015 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b2a87320-5da2-4e40-93d7-fffcb5c0c165-crio-socket\") pod \"insights-runtime-extractor-84qmv\" (UID: \"b2a87320-5da2-4e40-93d7-fffcb5c0c165\") " pod="openshift-insights/insights-runtime-extractor-84qmv" Apr 16 20:14:29.840245 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.840070 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/93ae2cda-12dd-4710-b8f7-478ee5f42cf1-trusted-ca\") pod \"image-registry-6676cf5bc-962mq\" (UID: \"93ae2cda-12dd-4710-b8f7-478ee5f42cf1\") " pod="openshift-image-registry/image-registry-6676cf5bc-962mq" Apr 16 20:14:29.840245 ip-10-0-138-62 
kubenswrapper[2572]: I0416 20:14:29.840123 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b2a87320-5da2-4e40-93d7-fffcb5c0c165-crio-socket\") pod \"insights-runtime-extractor-84qmv\" (UID: \"b2a87320-5da2-4e40-93d7-fffcb5c0c165\") " pod="openshift-insights/insights-runtime-extractor-84qmv" Apr 16 20:14:29.840245 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.840149 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b2a87320-5da2-4e40-93d7-fffcb5c0c165-data-volume\") pod \"insights-runtime-extractor-84qmv\" (UID: \"b2a87320-5da2-4e40-93d7-fffcb5c0c165\") " pod="openshift-insights/insights-runtime-extractor-84qmv" Apr 16 20:14:29.840245 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.840187 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/93ae2cda-12dd-4710-b8f7-478ee5f42cf1-ca-trust-extracted\") pod \"image-registry-6676cf5bc-962mq\" (UID: \"93ae2cda-12dd-4710-b8f7-478ee5f42cf1\") " pod="openshift-image-registry/image-registry-6676cf5bc-962mq" Apr 16 20:14:29.840245 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.840219 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b2a87320-5da2-4e40-93d7-fffcb5c0c165-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-84qmv\" (UID: \"b2a87320-5da2-4e40-93d7-fffcb5c0c165\") " pod="openshift-insights/insights-runtime-extractor-84qmv" Apr 16 20:14:29.840245 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.840245 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/93ae2cda-12dd-4710-b8f7-478ee5f42cf1-bound-sa-token\") pod 
\"image-registry-6676cf5bc-962mq\" (UID: \"93ae2cda-12dd-4710-b8f7-478ee5f42cf1\") " pod="openshift-image-registry/image-registry-6676cf5bc-962mq" Apr 16 20:14:29.840460 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.840283 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/93ae2cda-12dd-4710-b8f7-478ee5f42cf1-registry-tls\") pod \"image-registry-6676cf5bc-962mq\" (UID: \"93ae2cda-12dd-4710-b8f7-478ee5f42cf1\") " pod="openshift-image-registry/image-registry-6676cf5bc-962mq" Apr 16 20:14:29.840460 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.840368 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/93ae2cda-12dd-4710-b8f7-478ee5f42cf1-registry-certificates\") pod \"image-registry-6676cf5bc-962mq\" (UID: \"93ae2cda-12dd-4710-b8f7-478ee5f42cf1\") " pod="openshift-image-registry/image-registry-6676cf5bc-962mq" Apr 16 20:14:29.840460 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.840441 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b2a87320-5da2-4e40-93d7-fffcb5c0c165-data-volume\") pod \"insights-runtime-extractor-84qmv\" (UID: \"b2a87320-5da2-4e40-93d7-fffcb5c0c165\") " pod="openshift-insights/insights-runtime-extractor-84qmv" Apr 16 20:14:29.840678 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.840660 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b2a87320-5da2-4e40-93d7-fffcb5c0c165-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-84qmv\" (UID: \"b2a87320-5da2-4e40-93d7-fffcb5c0c165\") " pod="openshift-insights/insights-runtime-extractor-84qmv" Apr 16 20:14:29.842219 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.842203 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b2a87320-5da2-4e40-93d7-fffcb5c0c165-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-84qmv\" (UID: \"b2a87320-5da2-4e40-93d7-fffcb5c0c165\") " pod="openshift-insights/insights-runtime-extractor-84qmv" Apr 16 20:14:29.850763 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.850744 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tcvr\" (UniqueName: \"kubernetes.io/projected/b2a87320-5da2-4e40-93d7-fffcb5c0c165-kube-api-access-4tcvr\") pod \"insights-runtime-extractor-84qmv\" (UID: \"b2a87320-5da2-4e40-93d7-fffcb5c0c165\") " pod="openshift-insights/insights-runtime-extractor-84qmv" Apr 16 20:14:29.922856 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.922823 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-84qmv" Apr 16 20:14:29.941719 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.941690 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/93ae2cda-12dd-4710-b8f7-478ee5f42cf1-ca-trust-extracted\") pod \"image-registry-6676cf5bc-962mq\" (UID: \"93ae2cda-12dd-4710-b8f7-478ee5f42cf1\") " pod="openshift-image-registry/image-registry-6676cf5bc-962mq" Apr 16 20:14:29.941814 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.941725 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/93ae2cda-12dd-4710-b8f7-478ee5f42cf1-bound-sa-token\") pod \"image-registry-6676cf5bc-962mq\" (UID: \"93ae2cda-12dd-4710-b8f7-478ee5f42cf1\") " pod="openshift-image-registry/image-registry-6676cf5bc-962mq" Apr 16 20:14:29.941814 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.941759 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"registry-tls\" (UniqueName: \"kubernetes.io/projected/93ae2cda-12dd-4710-b8f7-478ee5f42cf1-registry-tls\") pod \"image-registry-6676cf5bc-962mq\" (UID: \"93ae2cda-12dd-4710-b8f7-478ee5f42cf1\") " pod="openshift-image-registry/image-registry-6676cf5bc-962mq" Apr 16 20:14:29.941814 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.941784 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/93ae2cda-12dd-4710-b8f7-478ee5f42cf1-registry-certificates\") pod \"image-registry-6676cf5bc-962mq\" (UID: \"93ae2cda-12dd-4710-b8f7-478ee5f42cf1\") " pod="openshift-image-registry/image-registry-6676cf5bc-962mq" Apr 16 20:14:29.941981 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.941818 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-594ln\" (UniqueName: \"kubernetes.io/projected/93ae2cda-12dd-4710-b8f7-478ee5f42cf1-kube-api-access-594ln\") pod \"image-registry-6676cf5bc-962mq\" (UID: \"93ae2cda-12dd-4710-b8f7-478ee5f42cf1\") " pod="openshift-image-registry/image-registry-6676cf5bc-962mq" Apr 16 20:14:29.941981 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.941855 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/93ae2cda-12dd-4710-b8f7-478ee5f42cf1-installation-pull-secrets\") pod \"image-registry-6676cf5bc-962mq\" (UID: \"93ae2cda-12dd-4710-b8f7-478ee5f42cf1\") " pod="openshift-image-registry/image-registry-6676cf5bc-962mq" Apr 16 20:14:29.941981 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.941906 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/93ae2cda-12dd-4710-b8f7-478ee5f42cf1-image-registry-private-configuration\") pod \"image-registry-6676cf5bc-962mq\" (UID: \"93ae2cda-12dd-4710-b8f7-478ee5f42cf1\") " 
pod="openshift-image-registry/image-registry-6676cf5bc-962mq" Apr 16 20:14:29.941981 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.941969 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/93ae2cda-12dd-4710-b8f7-478ee5f42cf1-trusted-ca\") pod \"image-registry-6676cf5bc-962mq\" (UID: \"93ae2cda-12dd-4710-b8f7-478ee5f42cf1\") " pod="openshift-image-registry/image-registry-6676cf5bc-962mq" Apr 16 20:14:29.942204 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.942148 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/93ae2cda-12dd-4710-b8f7-478ee5f42cf1-ca-trust-extracted\") pod \"image-registry-6676cf5bc-962mq\" (UID: \"93ae2cda-12dd-4710-b8f7-478ee5f42cf1\") " pod="openshift-image-registry/image-registry-6676cf5bc-962mq" Apr 16 20:14:29.943071 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.942949 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/93ae2cda-12dd-4710-b8f7-478ee5f42cf1-trusted-ca\") pod \"image-registry-6676cf5bc-962mq\" (UID: \"93ae2cda-12dd-4710-b8f7-478ee5f42cf1\") " pod="openshift-image-registry/image-registry-6676cf5bc-962mq" Apr 16 20:14:29.943260 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.943241 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/93ae2cda-12dd-4710-b8f7-478ee5f42cf1-registry-certificates\") pod \"image-registry-6676cf5bc-962mq\" (UID: \"93ae2cda-12dd-4710-b8f7-478ee5f42cf1\") " pod="openshift-image-registry/image-registry-6676cf5bc-962mq" Apr 16 20:14:29.944528 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.944505 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/93ae2cda-12dd-4710-b8f7-478ee5f42cf1-registry-tls\") pod \"image-registry-6676cf5bc-962mq\" (UID: \"93ae2cda-12dd-4710-b8f7-478ee5f42cf1\") " pod="openshift-image-registry/image-registry-6676cf5bc-962mq" Apr 16 20:14:29.944796 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.944778 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/93ae2cda-12dd-4710-b8f7-478ee5f42cf1-installation-pull-secrets\") pod \"image-registry-6676cf5bc-962mq\" (UID: \"93ae2cda-12dd-4710-b8f7-478ee5f42cf1\") " pod="openshift-image-registry/image-registry-6676cf5bc-962mq" Apr 16 20:14:29.944883 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.944851 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/93ae2cda-12dd-4710-b8f7-478ee5f42cf1-image-registry-private-configuration\") pod \"image-registry-6676cf5bc-962mq\" (UID: \"93ae2cda-12dd-4710-b8f7-478ee5f42cf1\") " pod="openshift-image-registry/image-registry-6676cf5bc-962mq" Apr 16 20:14:29.955905 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.955795 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-594ln\" (UniqueName: \"kubernetes.io/projected/93ae2cda-12dd-4710-b8f7-478ee5f42cf1-kube-api-access-594ln\") pod \"image-registry-6676cf5bc-962mq\" (UID: \"93ae2cda-12dd-4710-b8f7-478ee5f42cf1\") " pod="openshift-image-registry/image-registry-6676cf5bc-962mq" Apr 16 20:14:29.957895 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:29.957862 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/93ae2cda-12dd-4710-b8f7-478ee5f42cf1-bound-sa-token\") pod \"image-registry-6676cf5bc-962mq\" (UID: \"93ae2cda-12dd-4710-b8f7-478ee5f42cf1\") " pod="openshift-image-registry/image-registry-6676cf5bc-962mq" Apr 16 20:14:30.013425 
ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:30.013387 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6676cf5bc-962mq" Apr 16 20:14:30.040222 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:30.040186 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-84qmv"] Apr 16 20:14:30.044245 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:14:30.044194 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2a87320_5da2_4e40_93d7_fffcb5c0c165.slice/crio-870ec1f731f9a962da526a6bd7e69ed90e11a8177cd0a5c829cb40d4a8f02b0f WatchSource:0}: Error finding container 870ec1f731f9a962da526a6bd7e69ed90e11a8177cd0a5c829cb40d4a8f02b0f: Status 404 returned error can't find the container with id 870ec1f731f9a962da526a6bd7e69ed90e11a8177cd0a5c829cb40d4a8f02b0f Apr 16 20:14:30.149692 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:30.149651 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6676cf5bc-962mq"] Apr 16 20:14:30.152701 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:14:30.152674 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93ae2cda_12dd_4710_b8f7_478ee5f42cf1.slice/crio-3b6744eb6de56c7077b0adf15ccc767e46965841b59633bb8f31b870184919dd WatchSource:0}: Error finding container 3b6744eb6de56c7077b0adf15ccc767e46965841b59633bb8f31b870184919dd: Status 404 returned error can't find the container with id 3b6744eb6de56c7077b0adf15ccc767e46965841b59633bb8f31b870184919dd Apr 16 20:14:30.674423 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:30.674385 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-84qmv" 
event={"ID":"b2a87320-5da2-4e40-93d7-fffcb5c0c165","Type":"ContainerStarted","Data":"9686b24e64eda774b6a354875d8bca228164d844b1dee1ab0b9adfa51daf650f"}
Apr 16 20:14:30.674423 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:30.674429 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-84qmv" event={"ID":"b2a87320-5da2-4e40-93d7-fffcb5c0c165","Type":"ContainerStarted","Data":"870ec1f731f9a962da526a6bd7e69ed90e11a8177cd0a5c829cb40d4a8f02b0f"}
Apr 16 20:14:30.675917 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:30.675887 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6676cf5bc-962mq" event={"ID":"93ae2cda-12dd-4710-b8f7-478ee5f42cf1","Type":"ContainerStarted","Data":"89a74ba0b6046730ee5d6284881f77fb18b9ff6d02636df7aff2202f76d5e8c2"}
Apr 16 20:14:30.676026 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:30.675924 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6676cf5bc-962mq" event={"ID":"93ae2cda-12dd-4710-b8f7-478ee5f42cf1","Type":"ContainerStarted","Data":"3b6744eb6de56c7077b0adf15ccc767e46965841b59633bb8f31b870184919dd"}
Apr 16 20:14:30.676080 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:30.676057 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6676cf5bc-962mq"
Apr 16 20:14:30.695470 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:30.695432 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6676cf5bc-962mq" podStartSLOduration=1.695419126 podStartE2EDuration="1.695419126s" podCreationTimestamp="2026-04-16 20:14:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:14:30.693894995 +0000 UTC m=+162.130883503" watchObservedRunningTime="2026-04-16 20:14:30.695419126 +0000 UTC m=+162.132407627"
Apr 16 20:14:31.655333 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:31.655283 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8358b0b1-4c45-4118-bd03-a851a409b99e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9gbgs\" (UID: \"8358b0b1-4c45-4118-bd03-a851a409b99e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9gbgs"
Apr 16 20:14:31.658143 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:31.658114 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8358b0b1-4c45-4118-bd03-a851a409b99e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9gbgs\" (UID: \"8358b0b1-4c45-4118-bd03-a851a409b99e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9gbgs"
Apr 16 20:14:31.681302 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:31.681254 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nwtd6" event={"ID":"7621e257-90f3-4f74-a511-c5bfd075ff99","Type":"ContainerStarted","Data":"1935520a1b8e322257fd989ef1bf5ffbb6f06475cbb30188b82dfb66b3f9ef25"}
Apr 16 20:14:31.681302 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:31.681292 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nwtd6" event={"ID":"7621e257-90f3-4f74-a511-c5bfd075ff99","Type":"ContainerStarted","Data":"b9d27c8071ecfaf8244bdc3815bca4b59a9e0ae847c231ec51779877929b59c2"}
Apr 16 20:14:31.681781 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:31.681470 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-nwtd6"
Apr 16 20:14:31.682922 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:31.682898 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-84qmv" event={"ID":"b2a87320-5da2-4e40-93d7-fffcb5c0c165","Type":"ContainerStarted","Data":"3a7bcf5ef020e960f479a5cd2aaea9bf5769be07da26b92320581143eefb2838"}
Apr 16 20:14:31.856860 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:31.856812 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/611adbad-0f1f-4c66-8f17-17f5a789a903-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tc597\" (UID: \"611adbad-0f1f-4c66-8f17-17f5a789a903\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tc597"
Apr 16 20:14:31.859735 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:31.859712 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/611adbad-0f1f-4c66-8f17-17f5a789a903-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tc597\" (UID: \"611adbad-0f1f-4c66-8f17-17f5a789a903\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tc597"
Apr 16 20:14:31.890709 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:31.890681 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-9gbgs"
Apr 16 20:14:32.011863 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:32.011810 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-nwtd6" podStartSLOduration=129.623500787 podStartE2EDuration="2m11.011792721s" podCreationTimestamp="2026-04-16 20:12:21 +0000 UTC" firstStartedPulling="2026-04-16 20:14:29.600919014 +0000 UTC m=+161.037907503" lastFinishedPulling="2026-04-16 20:14:30.989210942 +0000 UTC m=+162.426199437" observedRunningTime="2026-04-16 20:14:31.701223761 +0000 UTC m=+163.138212270" watchObservedRunningTime="2026-04-16 20:14:32.011792721 +0000 UTC m=+163.448781244"
Apr 16 20:14:32.012045 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:32.012030 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-9gbgs"]
Apr 16 20:14:32.100753 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:32.100726 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tc597"
Apr 16 20:14:32.320345 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:14:32.320289 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8358b0b1_4c45_4118_bd03_a851a409b99e.slice/crio-bed3b2d2d31b0852ba8623e165184e246fcbdd0c0d8f1b5aa4f8fa9b7e92d08e WatchSource:0}: Error finding container bed3b2d2d31b0852ba8623e165184e246fcbdd0c0d8f1b5aa4f8fa9b7e92d08e: Status 404 returned error can't find the container with id bed3b2d2d31b0852ba8623e165184e246fcbdd0c0d8f1b5aa4f8fa9b7e92d08e
Apr 16 20:14:32.440075 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:32.440047 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-tc597"]
Apr 16 20:14:32.443613 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:14:32.443589 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod611adbad_0f1f_4c66_8f17_17f5a789a903.slice/crio-4be9ebb5c0b5b5fe5d5464c2215c42effa3d851f19df06e90d0532822368b541 WatchSource:0}: Error finding container 4be9ebb5c0b5b5fe5d5464c2215c42effa3d851f19df06e90d0532822368b541: Status 404 returned error can't find the container with id 4be9ebb5c0b5b5fe5d5464c2215c42effa3d851f19df06e90d0532822368b541
Apr 16 20:14:32.686454 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:32.686378 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-9gbgs" event={"ID":"8358b0b1-4c45-4118-bd03-a851a409b99e","Type":"ContainerStarted","Data":"bed3b2d2d31b0852ba8623e165184e246fcbdd0c0d8f1b5aa4f8fa9b7e92d08e"}
Apr 16 20:14:32.687395 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:32.687369 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tc597" event={"ID":"611adbad-0f1f-4c66-8f17-17f5a789a903","Type":"ContainerStarted","Data":"4be9ebb5c0b5b5fe5d5464c2215c42effa3d851f19df06e90d0532822368b541"}
Apr 16 20:14:32.688996 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:32.688975 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-84qmv" event={"ID":"b2a87320-5da2-4e40-93d7-fffcb5c0c165","Type":"ContainerStarted","Data":"b1f122c4a068970d41ff856a0b14dc0106aa1daa3e3e840bf94e4cb9d65587e5"}
Apr 16 20:14:32.708658 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:32.708606 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-84qmv" podStartSLOduration=1.440237894 podStartE2EDuration="3.708593882s" podCreationTimestamp="2026-04-16 20:14:29 +0000 UTC" firstStartedPulling="2026-04-16 20:14:30.107100808 +0000 UTC m=+161.544089308" lastFinishedPulling="2026-04-16 20:14:32.375456798 +0000 UTC m=+163.812445296" observedRunningTime="2026-04-16 20:14:32.707968384 +0000 UTC m=+164.144956891" watchObservedRunningTime="2026-04-16 20:14:32.708593882 +0000 UTC m=+164.145582380"
Apr 16 20:14:33.696461 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:33.696421 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-9gbgs" event={"ID":"8358b0b1-4c45-4118-bd03-a851a409b99e","Type":"ContainerStarted","Data":"436bdc2c0d28126a412b51393f04ec365e55d6c0c75c22b33bd8bd6a2d8911b0"}
Apr 16 20:14:33.715349 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:33.715307 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-9gbgs" podStartSLOduration=33.815559179 podStartE2EDuration="34.715294608s" podCreationTimestamp="2026-04-16 20:13:59 +0000 UTC" firstStartedPulling="2026-04-16 20:14:32.325175273 +0000 UTC m=+163.762163759" lastFinishedPulling="2026-04-16 20:14:33.2249107 +0000 UTC m=+164.661899188" observedRunningTime="2026-04-16 20:14:33.712904851 +0000 UTC m=+165.149893362" watchObservedRunningTime="2026-04-16 20:14:33.715294608 +0000 UTC m=+165.152283115"
Apr 16 20:14:34.594345 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:34.594312 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6rfzr"]
Apr 16 20:14:34.597033 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:34.597014 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6rfzr"
Apr 16 20:14:34.599610 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:34.599588 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 16 20:14:34.599716 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:34.599666 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-96bcg\""
Apr 16 20:14:34.605344 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:34.605316 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6rfzr"]
Apr 16 20:14:34.699890 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:34.699839 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tc597" event={"ID":"611adbad-0f1f-4c66-8f17-17f5a789a903","Type":"ContainerStarted","Data":"5697c7a26c418c8b35858d0fed47bb9b5a6e40f50f671f87fef6448e02c6ec5e"}
Apr 16 20:14:34.716366 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:34.716314 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tc597" podStartSLOduration=34.073104975 podStartE2EDuration="35.716297239s" podCreationTimestamp="2026-04-16 20:13:59 +0000 UTC" firstStartedPulling="2026-04-16 20:14:32.445367975 +0000 UTC m=+163.882356462" lastFinishedPulling="2026-04-16 20:14:34.088560237 +0000 UTC m=+165.525548726" observedRunningTime="2026-04-16 20:14:34.715167559 +0000 UTC m=+166.152156079" watchObservedRunningTime="2026-04-16 20:14:34.716297239 +0000 UTC m=+166.153285748"
Apr 16 20:14:34.781422 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:34.781391 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/3958a2b6-30b2-4633-b471-4e059b8de73a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-6rfzr\" (UID: \"3958a2b6-30b2-4633-b471-4e059b8de73a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6rfzr"
Apr 16 20:14:34.882792 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:34.882716 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/3958a2b6-30b2-4633-b471-4e059b8de73a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-6rfzr\" (UID: \"3958a2b6-30b2-4633-b471-4e059b8de73a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6rfzr"
Apr 16 20:14:34.885235 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:34.885213 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/3958a2b6-30b2-4633-b471-4e059b8de73a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-6rfzr\" (UID: \"3958a2b6-30b2-4633-b471-4e059b8de73a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6rfzr"
Apr 16 20:14:34.906635 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:34.906613 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6rfzr"
Apr 16 20:14:35.014992 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:35.014964 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6rfzr"]
Apr 16 20:14:35.017974 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:14:35.017933 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3958a2b6_30b2_4633_b471_4e059b8de73a.slice/crio-4daf8cca95c4c5b0d64b562b1bf84215899cb9506ab4b8ec3e01ba2c2202c3df WatchSource:0}: Error finding container 4daf8cca95c4c5b0d64b562b1bf84215899cb9506ab4b8ec3e01ba2c2202c3df: Status 404 returned error can't find the container with id 4daf8cca95c4c5b0d64b562b1bf84215899cb9506ab4b8ec3e01ba2c2202c3df
Apr 16 20:14:35.704134 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:35.704100 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6rfzr" event={"ID":"3958a2b6-30b2-4633-b471-4e059b8de73a","Type":"ContainerStarted","Data":"4daf8cca95c4c5b0d64b562b1bf84215899cb9506ab4b8ec3e01ba2c2202c3df"}
Apr 16 20:14:36.144589 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:36.144553 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tngt6"
Apr 16 20:14:36.147101 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:36.147082 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-28d82\""
Apr 16 20:14:36.154879 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:36.154854 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tngt6"
Apr 16 20:14:36.267845 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:36.267814 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-tngt6"]
Apr 16 20:14:36.270960 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:14:36.270929 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2559e4d7_87c4_4654_a66a_cf29280da85b.slice/crio-3c57a625f5b97ac33ffa8011a7fdee30c705da763a269063780c79b78603684d WatchSource:0}: Error finding container 3c57a625f5b97ac33ffa8011a7fdee30c705da763a269063780c79b78603684d: Status 404 returned error can't find the container with id 3c57a625f5b97ac33ffa8011a7fdee30c705da763a269063780c79b78603684d
Apr 16 20:14:36.708428 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:36.708396 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-tngt6" event={"ID":"2559e4d7-87c4-4654-a66a-cf29280da85b","Type":"ContainerStarted","Data":"3c57a625f5b97ac33ffa8011a7fdee30c705da763a269063780c79b78603684d"}
Apr 16 20:14:36.710047 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:36.710017 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6rfzr" event={"ID":"3958a2b6-30b2-4633-b471-4e059b8de73a","Type":"ContainerStarted","Data":"dcc7bbaaaa1be42cc25ea2d48b7c2384a7c85746cba73451c323be2c07c4b05a"}
Apr 16 20:14:36.710247 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:36.710210 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6rfzr"
Apr 16 20:14:36.716288 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:36.716255 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6rfzr"
Apr 16 20:14:36.724880 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:36.724827 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6rfzr" podStartSLOduration=1.78394109 podStartE2EDuration="2.724816031s" podCreationTimestamp="2026-04-16 20:14:34 +0000 UTC" firstStartedPulling="2026-04-16 20:14:35.019911786 +0000 UTC m=+166.456900271" lastFinishedPulling="2026-04-16 20:14:35.960786727 +0000 UTC m=+167.397775212" observedRunningTime="2026-04-16 20:14:36.724386203 +0000 UTC m=+168.161374712" watchObservedRunningTime="2026-04-16 20:14:36.724816031 +0000 UTC m=+168.161804539"
Apr 16 20:14:37.144468 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:37.144433 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bd8l"
Apr 16 20:14:37.662712 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:37.662676 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-v98ll"]
Apr 16 20:14:37.666433 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:37.666414 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-v98ll"
Apr 16 20:14:37.670121 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:37.670102 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 16 20:14:37.670240 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:37.670123 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 16 20:14:37.670499 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:37.670455 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-26z2g\""
Apr 16 20:14:37.671128 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:37.671036 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 20:14:37.677990 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:37.677809 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-v98ll"]
Apr 16 20:14:37.703306 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:37.703285 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/37613547-0ef2-4819-bd32-36b4865cf714-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-v98ll\" (UID: \"37613547-0ef2-4819-bd32-36b4865cf714\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v98ll"
Apr 16 20:14:37.703413 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:37.703326 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/37613547-0ef2-4819-bd32-36b4865cf714-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-v98ll\" (UID: \"37613547-0ef2-4819-bd32-36b4865cf714\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v98ll"
Apr 16 20:14:37.703413 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:37.703356 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fcws\" (UniqueName: \"kubernetes.io/projected/37613547-0ef2-4819-bd32-36b4865cf714-kube-api-access-9fcws\") pod \"prometheus-operator-5676c8c784-v98ll\" (UID: \"37613547-0ef2-4819-bd32-36b4865cf714\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v98ll"
Apr 16 20:14:37.703413 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:37.703408 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/37613547-0ef2-4819-bd32-36b4865cf714-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-v98ll\" (UID: \"37613547-0ef2-4819-bd32-36b4865cf714\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v98ll"
Apr 16 20:14:37.804551 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:37.804516 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/37613547-0ef2-4819-bd32-36b4865cf714-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-v98ll\" (UID: \"37613547-0ef2-4819-bd32-36b4865cf714\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v98ll"
Apr 16 20:14:37.804959 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:37.804639 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/37613547-0ef2-4819-bd32-36b4865cf714-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-v98ll\" (UID: \"37613547-0ef2-4819-bd32-36b4865cf714\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v98ll"
Apr 16 20:14:37.804959 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:37.804680 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/37613547-0ef2-4819-bd32-36b4865cf714-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-v98ll\" (UID: \"37613547-0ef2-4819-bd32-36b4865cf714\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v98ll"
Apr 16 20:14:37.804959 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:37.804708 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9fcws\" (UniqueName: \"kubernetes.io/projected/37613547-0ef2-4819-bd32-36b4865cf714-kube-api-access-9fcws\") pod \"prometheus-operator-5676c8c784-v98ll\" (UID: \"37613547-0ef2-4819-bd32-36b4865cf714\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v98ll"
Apr 16 20:14:37.804959 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:37.804762 2572 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Apr 16 20:14:37.804959 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:37.804835 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37613547-0ef2-4819-bd32-36b4865cf714-prometheus-operator-tls podName:37613547-0ef2-4819-bd32-36b4865cf714 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:38.30481295 +0000 UTC m=+169.741801449 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/37613547-0ef2-4819-bd32-36b4865cf714-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-v98ll" (UID: "37613547-0ef2-4819-bd32-36b4865cf714") : secret "prometheus-operator-tls" not found
Apr 16 20:14:37.805365 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:37.805339 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/37613547-0ef2-4819-bd32-36b4865cf714-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-v98ll\" (UID: \"37613547-0ef2-4819-bd32-36b4865cf714\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v98ll"
Apr 16 20:14:37.807919 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:37.807892 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/37613547-0ef2-4819-bd32-36b4865cf714-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-v98ll\" (UID: \"37613547-0ef2-4819-bd32-36b4865cf714\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v98ll"
Apr 16 20:14:37.814258 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:37.814236 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fcws\" (UniqueName: \"kubernetes.io/projected/37613547-0ef2-4819-bd32-36b4865cf714-kube-api-access-9fcws\") pod \"prometheus-operator-5676c8c784-v98ll\" (UID: \"37613547-0ef2-4819-bd32-36b4865cf714\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v98ll"
Apr 16 20:14:38.307518 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:38.307479 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/37613547-0ef2-4819-bd32-36b4865cf714-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-v98ll\" (UID: \"37613547-0ef2-4819-bd32-36b4865cf714\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v98ll"
Apr 16 20:14:38.309823 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:38.309800 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/37613547-0ef2-4819-bd32-36b4865cf714-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-v98ll\" (UID: \"37613547-0ef2-4819-bd32-36b4865cf714\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v98ll"
Apr 16 20:14:38.581037 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:38.580953 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-v98ll"
Apr 16 20:14:38.711411 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:38.711378 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-v98ll"]
Apr 16 20:14:38.714532 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:14:38.714508 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37613547_0ef2_4819_bd32_36b4865cf714.slice/crio-58f50aa777566c20308f0fb35914e6f89d19e25a67e171145d453f64752ccebe WatchSource:0}: Error finding container 58f50aa777566c20308f0fb35914e6f89d19e25a67e171145d453f64752ccebe: Status 404 returned error can't find the container with id 58f50aa777566c20308f0fb35914e6f89d19e25a67e171145d453f64752ccebe
Apr 16 20:14:38.716616 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:38.716586 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-tngt6" event={"ID":"2559e4d7-87c4-4654-a66a-cf29280da85b","Type":"ContainerStarted","Data":"dccfb9893afe3adb616cab28ba760824a32facada56387dd4807f62874acb355"}
Apr 16 20:14:38.748830 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:38.748784 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-tngt6" podStartSLOduration=136.308174237 podStartE2EDuration="2m17.748771081s" podCreationTimestamp="2026-04-16 20:12:21 +0000 UTC" firstStartedPulling="2026-04-16 20:14:36.272917243 +0000 UTC m=+167.709905730" lastFinishedPulling="2026-04-16 20:14:37.713514088 +0000 UTC m=+169.150502574" observedRunningTime="2026-04-16 20:14:38.74778263 +0000 UTC m=+170.184771135" watchObservedRunningTime="2026-04-16 20:14:38.748771081 +0000 UTC m=+170.185759589"
Apr 16 20:14:39.671675 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:39.671647 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-9455768c8-dnlpz"
Apr 16 20:14:39.720832 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:39.720797 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-v98ll" event={"ID":"37613547-0ef2-4819-bd32-36b4865cf714","Type":"ContainerStarted","Data":"58f50aa777566c20308f0fb35914e6f89d19e25a67e171145d453f64752ccebe"}
Apr 16 20:14:40.724887 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:40.724841 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-v98ll" event={"ID":"37613547-0ef2-4819-bd32-36b4865cf714","Type":"ContainerStarted","Data":"8fc60bfa23e070fdf42704b1b841b50dc8d283fb05d0e7a56156c30bcf38f26e"}
Apr 16 20:14:40.725233 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:40.724899 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-v98ll" event={"ID":"37613547-0ef2-4819-bd32-36b4865cf714","Type":"ContainerStarted","Data":"89d8bd494eee5923c5df90650d3b3d7814069fa4b712b1242b0e5b26debae775"}
Apr 16 20:14:40.741372 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:40.741325 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-v98ll" podStartSLOduration=2.629094675 podStartE2EDuration="3.741311282s" podCreationTimestamp="2026-04-16 20:14:37 +0000 UTC" firstStartedPulling="2026-04-16 20:14:38.716361928 +0000 UTC m=+170.153350415" lastFinishedPulling="2026-04-16 20:14:39.828578526 +0000 UTC m=+171.265567022" observedRunningTime="2026-04-16 20:14:40.740662273 +0000 UTC m=+172.177650784" watchObservedRunningTime="2026-04-16 20:14:40.741311282 +0000 UTC m=+172.178299791"
Apr 16 20:14:41.691014 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:41.690986 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-nwtd6"
Apr 16 20:14:43.018942 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:43.018908 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-g5wtx"]
Apr 16 20:14:43.022459 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:43.022435 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-g5wtx"
Apr 16 20:14:43.025972 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:43.025946 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 20:14:43.026139 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:43.025971 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 20:14:43.026299 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:43.026018 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-xns2x\""
Apr 16 20:14:43.026387 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:43.026042 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 20:14:43.040698 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:43.040672 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/87150cb8-5f8b-431e-9a3e-04dc45ef494c-node-exporter-textfile\") pod \"node-exporter-g5wtx\" (UID: \"87150cb8-5f8b-431e-9a3e-04dc45ef494c\") " pod="openshift-monitoring/node-exporter-g5wtx"
Apr 16 20:14:43.040818 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:43.040734 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/87150cb8-5f8b-431e-9a3e-04dc45ef494c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-g5wtx\" (UID: \"87150cb8-5f8b-431e-9a3e-04dc45ef494c\") " pod="openshift-monitoring/node-exporter-g5wtx"
Apr 16 20:14:43.040895 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:43.040839 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/87150cb8-5f8b-431e-9a3e-04dc45ef494c-metrics-client-ca\") pod \"node-exporter-g5wtx\" (UID: \"87150cb8-5f8b-431e-9a3e-04dc45ef494c\") " pod="openshift-monitoring/node-exporter-g5wtx"
Apr 16 20:14:43.040945 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:43.040901 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqwjn\" (UniqueName: \"kubernetes.io/projected/87150cb8-5f8b-431e-9a3e-04dc45ef494c-kube-api-access-fqwjn\") pod \"node-exporter-g5wtx\" (UID: \"87150cb8-5f8b-431e-9a3e-04dc45ef494c\") " pod="openshift-monitoring/node-exporter-g5wtx"
Apr 16 20:14:43.041009 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:43.040940 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/87150cb8-5f8b-431e-9a3e-04dc45ef494c-node-exporter-tls\") pod \"node-exporter-g5wtx\" (UID: \"87150cb8-5f8b-431e-9a3e-04dc45ef494c\") " pod="openshift-monitoring/node-exporter-g5wtx"
Apr 16 20:14:43.041009 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:43.041003 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/87150cb8-5f8b-431e-9a3e-04dc45ef494c-sys\") pod \"node-exporter-g5wtx\" (UID: \"87150cb8-5f8b-431e-9a3e-04dc45ef494c\") " pod="openshift-monitoring/node-exporter-g5wtx"
Apr 16 20:14:43.041105 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:43.041052 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/87150cb8-5f8b-431e-9a3e-04dc45ef494c-node-exporter-wtmp\") pod \"node-exporter-g5wtx\" (UID: \"87150cb8-5f8b-431e-9a3e-04dc45ef494c\") " pod="openshift-monitoring/node-exporter-g5wtx"
Apr 16 20:14:43.041105 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:43.041086 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/87150cb8-5f8b-431e-9a3e-04dc45ef494c-root\") pod \"node-exporter-g5wtx\" (UID: \"87150cb8-5f8b-431e-9a3e-04dc45ef494c\") " pod="openshift-monitoring/node-exporter-g5wtx"
Apr 16 20:14:43.041196 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:43.041139 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/87150cb8-5f8b-431e-9a3e-04dc45ef494c-node-exporter-accelerators-collector-config\") pod \"node-exporter-g5wtx\" (UID: \"87150cb8-5f8b-431e-9a3e-04dc45ef494c\") " pod="openshift-monitoring/node-exporter-g5wtx"
Apr 16 20:14:43.142148 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:43.142096 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/87150cb8-5f8b-431e-9a3e-04dc45ef494c-metrics-client-ca\") pod \"node-exporter-g5wtx\" (UID: \"87150cb8-5f8b-431e-9a3e-04dc45ef494c\") " pod="openshift-monitoring/node-exporter-g5wtx"
Apr 16 20:14:43.142750 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:43.142723 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fqwjn\" (UniqueName: \"kubernetes.io/projected/87150cb8-5f8b-431e-9a3e-04dc45ef494c-kube-api-access-fqwjn\") pod \"node-exporter-g5wtx\" (UID: \"87150cb8-5f8b-431e-9a3e-04dc45ef494c\") " pod="openshift-monitoring/node-exporter-g5wtx"
Apr 16 20:14:43.143043 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:43.142845 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/87150cb8-5f8b-431e-9a3e-04dc45ef494c-metrics-client-ca\") pod \"node-exporter-g5wtx\" (UID: \"87150cb8-5f8b-431e-9a3e-04dc45ef494c\") " pod="openshift-monitoring/node-exporter-g5wtx"
Apr 16 20:14:43.143206 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:43.143010 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/87150cb8-5f8b-431e-9a3e-04dc45ef494c-node-exporter-tls\") pod \"node-exporter-g5wtx\" (UID: \"87150cb8-5f8b-431e-9a3e-04dc45ef494c\") " pod="openshift-monitoring/node-exporter-g5wtx"
Apr 16 20:14:43.143344 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:43.143321 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/87150cb8-5f8b-431e-9a3e-04dc45ef494c-sys\") pod \"node-exporter-g5wtx\" (UID: \"87150cb8-5f8b-431e-9a3e-04dc45ef494c\") " pod="openshift-monitoring/node-exporter-g5wtx"
Apr 16 20:14:43.143454 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:43.143196 2572 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 16 20:14:43.143510 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:43.143464 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87150cb8-5f8b-431e-9a3e-04dc45ef494c-node-exporter-tls podName:87150cb8-5f8b-431e-9a3e-04dc45ef494c nodeName:}" failed. No retries permitted until 2026-04-16 20:14:43.643426539 +0000 UTC m=+175.080415040 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/87150cb8-5f8b-431e-9a3e-04dc45ef494c-node-exporter-tls") pod "node-exporter-g5wtx" (UID: "87150cb8-5f8b-431e-9a3e-04dc45ef494c") : secret "node-exporter-tls" not found Apr 16 20:14:43.143767 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:43.143737 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/87150cb8-5f8b-431e-9a3e-04dc45ef494c-sys\") pod \"node-exporter-g5wtx\" (UID: \"87150cb8-5f8b-431e-9a3e-04dc45ef494c\") " pod="openshift-monitoring/node-exporter-g5wtx" Apr 16 20:14:43.144278 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:43.144245 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/87150cb8-5f8b-431e-9a3e-04dc45ef494c-node-exporter-wtmp\") pod \"node-exporter-g5wtx\" (UID: \"87150cb8-5f8b-431e-9a3e-04dc45ef494c\") " pod="openshift-monitoring/node-exporter-g5wtx" Apr 16 20:14:43.144383 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:43.144310 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/87150cb8-5f8b-431e-9a3e-04dc45ef494c-root\") pod \"node-exporter-g5wtx\" (UID: \"87150cb8-5f8b-431e-9a3e-04dc45ef494c\") " pod="openshift-monitoring/node-exporter-g5wtx" Apr 16 20:14:43.144383 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:43.144340 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/87150cb8-5f8b-431e-9a3e-04dc45ef494c-node-exporter-accelerators-collector-config\") pod \"node-exporter-g5wtx\" (UID: \"87150cb8-5f8b-431e-9a3e-04dc45ef494c\") " pod="openshift-monitoring/node-exporter-g5wtx" Apr 16 20:14:43.144491 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:43.144393 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/87150cb8-5f8b-431e-9a3e-04dc45ef494c-node-exporter-textfile\") pod \"node-exporter-g5wtx\" (UID: \"87150cb8-5f8b-431e-9a3e-04dc45ef494c\") " pod="openshift-monitoring/node-exporter-g5wtx" Apr 16 20:14:43.144491 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:43.144406 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/87150cb8-5f8b-431e-9a3e-04dc45ef494c-node-exporter-wtmp\") pod \"node-exporter-g5wtx\" (UID: \"87150cb8-5f8b-431e-9a3e-04dc45ef494c\") " pod="openshift-monitoring/node-exporter-g5wtx" Apr 16 20:14:43.144491 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:43.144444 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/87150cb8-5f8b-431e-9a3e-04dc45ef494c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-g5wtx\" (UID: \"87150cb8-5f8b-431e-9a3e-04dc45ef494c\") " pod="openshift-monitoring/node-exporter-g5wtx" Apr 16 20:14:43.145495 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:43.144973 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/87150cb8-5f8b-431e-9a3e-04dc45ef494c-node-exporter-accelerators-collector-config\") pod \"node-exporter-g5wtx\" (UID: \"87150cb8-5f8b-431e-9a3e-04dc45ef494c\") " pod="openshift-monitoring/node-exporter-g5wtx" Apr 16 20:14:43.145495 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:43.145093 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/87150cb8-5f8b-431e-9a3e-04dc45ef494c-root\") pod \"node-exporter-g5wtx\" (UID: \"87150cb8-5f8b-431e-9a3e-04dc45ef494c\") " pod="openshift-monitoring/node-exporter-g5wtx" Apr 16 20:14:43.145495 
ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:43.145322 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/87150cb8-5f8b-431e-9a3e-04dc45ef494c-node-exporter-textfile\") pod \"node-exporter-g5wtx\" (UID: \"87150cb8-5f8b-431e-9a3e-04dc45ef494c\") " pod="openshift-monitoring/node-exporter-g5wtx" Apr 16 20:14:43.147163 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:43.147136 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/87150cb8-5f8b-431e-9a3e-04dc45ef494c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-g5wtx\" (UID: \"87150cb8-5f8b-431e-9a3e-04dc45ef494c\") " pod="openshift-monitoring/node-exporter-g5wtx" Apr 16 20:14:43.156790 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:43.156762 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqwjn\" (UniqueName: \"kubernetes.io/projected/87150cb8-5f8b-431e-9a3e-04dc45ef494c-kube-api-access-fqwjn\") pod \"node-exporter-g5wtx\" (UID: \"87150cb8-5f8b-431e-9a3e-04dc45ef494c\") " pod="openshift-monitoring/node-exporter-g5wtx" Apr 16 20:14:43.646713 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:43.646661 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/87150cb8-5f8b-431e-9a3e-04dc45ef494c-node-exporter-tls\") pod \"node-exporter-g5wtx\" (UID: \"87150cb8-5f8b-431e-9a3e-04dc45ef494c\") " pod="openshift-monitoring/node-exporter-g5wtx" Apr 16 20:14:43.649279 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:43.649251 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/87150cb8-5f8b-431e-9a3e-04dc45ef494c-node-exporter-tls\") pod \"node-exporter-g5wtx\" (UID: \"87150cb8-5f8b-431e-9a3e-04dc45ef494c\") " 
pod="openshift-monitoring/node-exporter-g5wtx" Apr 16 20:14:43.932457 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:43.932378 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-g5wtx" Apr 16 20:14:43.944359 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:14:43.944330 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87150cb8_5f8b_431e_9a3e_04dc45ef494c.slice/crio-7e6eb80909d0308998cdc290c6d468e14528877049b5787400837b13c3aaaf74 WatchSource:0}: Error finding container 7e6eb80909d0308998cdc290c6d468e14528877049b5787400837b13c3aaaf74: Status 404 returned error can't find the container with id 7e6eb80909d0308998cdc290c6d468e14528877049b5787400837b13c3aaaf74 Apr 16 20:14:44.080473 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.080436 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 20:14:44.085668 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.085651 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:44.088250 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.088228 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 20:14:44.088368 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.088257 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 20:14:44.088505 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.088485 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 20:14:44.088624 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.088572 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 20:14:44.089060 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.088978 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 20:14:44.089060 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.089013 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 20:14:44.089060 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.089030 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 20:14:44.089060 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.089050 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-vqcll\"" Apr 16 20:14:44.089301 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.089169 2572 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 20:14:44.089301 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.089182 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 20:14:44.098796 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.098776 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 20:14:44.151735 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.151699 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/47a5398c-31b3-4a80-8e04-268cd527c1f4-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:44.151735 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.151737 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/47a5398c-31b3-4a80-8e04-268cd527c1f4-web-config\") pod \"alertmanager-main-0\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:44.151986 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.151805 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/47a5398c-31b3-4a80-8e04-268cd527c1f4-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:44.151986 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.151900 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/47a5398c-31b3-4a80-8e04-268cd527c1f4-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:44.151986 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.151963 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/47a5398c-31b3-4a80-8e04-268cd527c1f4-tls-assets\") pod \"alertmanager-main-0\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:44.152089 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.151989 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/47a5398c-31b3-4a80-8e04-268cd527c1f4-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:44.152089 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.152034 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/47a5398c-31b3-4a80-8e04-268cd527c1f4-config-out\") pod \"alertmanager-main-0\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:44.152089 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.152079 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/47a5398c-31b3-4a80-8e04-268cd527c1f4-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:44.152190 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.152116 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/47a5398c-31b3-4a80-8e04-268cd527c1f4-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:44.152190 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.152134 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcwfm\" (UniqueName: \"kubernetes.io/projected/47a5398c-31b3-4a80-8e04-268cd527c1f4-kube-api-access-mcwfm\") pod \"alertmanager-main-0\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:44.152190 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.152159 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/47a5398c-31b3-4a80-8e04-268cd527c1f4-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:44.152190 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.152173 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47a5398c-31b3-4a80-8e04-268cd527c1f4-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:44.152305 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.152192 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/secret/47a5398c-31b3-4a80-8e04-268cd527c1f4-config-volume\") pod \"alertmanager-main-0\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:44.252924 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.252895 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/47a5398c-31b3-4a80-8e04-268cd527c1f4-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:44.253078 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.252941 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/47a5398c-31b3-4a80-8e04-268cd527c1f4-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:44.253135 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.253070 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mcwfm\" (UniqueName: \"kubernetes.io/projected/47a5398c-31b3-4a80-8e04-268cd527c1f4-kube-api-access-mcwfm\") pod \"alertmanager-main-0\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:44.253135 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.253116 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/47a5398c-31b3-4a80-8e04-268cd527c1f4-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:44.253299 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.253145 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47a5398c-31b3-4a80-8e04-268cd527c1f4-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:44.253299 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.253178 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/47a5398c-31b3-4a80-8e04-268cd527c1f4-config-volume\") pod \"alertmanager-main-0\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:44.253299 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.253257 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/47a5398c-31b3-4a80-8e04-268cd527c1f4-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:44.253299 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.253285 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/47a5398c-31b3-4a80-8e04-268cd527c1f4-web-config\") pod \"alertmanager-main-0\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:44.253480 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.253313 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/47a5398c-31b3-4a80-8e04-268cd527c1f4-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:44.253480 
ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.253363 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/47a5398c-31b3-4a80-8e04-268cd527c1f4-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:44.253480 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.253404 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/47a5398c-31b3-4a80-8e04-268cd527c1f4-tls-assets\") pod \"alertmanager-main-0\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:44.253480 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:14:44.253423 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/47a5398c-31b3-4a80-8e04-268cd527c1f4-alertmanager-trusted-ca-bundle podName:47a5398c-31b3-4a80-8e04-268cd527c1f4 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:44.753401141 +0000 UTC m=+176.190389642 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/47a5398c-31b3-4a80-8e04-268cd527c1f4-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "47a5398c-31b3-4a80-8e04-268cd527c1f4") : configmap references non-existent config key: ca-bundle.crt Apr 16 20:14:44.253480 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.253456 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/47a5398c-31b3-4a80-8e04-268cd527c1f4-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:44.253927 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.253903 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/47a5398c-31b3-4a80-8e04-268cd527c1f4-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:44.254034 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.253961 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/47a5398c-31b3-4a80-8e04-268cd527c1f4-config-out\") pod \"alertmanager-main-0\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:44.255388 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.255343 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/47a5398c-31b3-4a80-8e04-268cd527c1f4-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:44.256070 ip-10-0-138-62 kubenswrapper[2572]: 
I0416 20:14:44.256048 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/47a5398c-31b3-4a80-8e04-268cd527c1f4-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:44.256280 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.256206 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/47a5398c-31b3-4a80-8e04-268cd527c1f4-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:44.256472 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.256449 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/47a5398c-31b3-4a80-8e04-268cd527c1f4-web-config\") pod \"alertmanager-main-0\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:44.257032 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.257010 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/47a5398c-31b3-4a80-8e04-268cd527c1f4-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:44.257409 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.257294 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/47a5398c-31b3-4a80-8e04-268cd527c1f4-config-out\") pod \"alertmanager-main-0\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:44.257910 
ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.257882 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/47a5398c-31b3-4a80-8e04-268cd527c1f4-config-volume\") pod \"alertmanager-main-0\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:44.257986 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.257953 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/47a5398c-31b3-4a80-8e04-268cd527c1f4-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:44.257986 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.257965 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/47a5398c-31b3-4a80-8e04-268cd527c1f4-tls-assets\") pod \"alertmanager-main-0\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:44.258656 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.258641 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/47a5398c-31b3-4a80-8e04-268cd527c1f4-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:44.261765 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.261746 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcwfm\" (UniqueName: \"kubernetes.io/projected/47a5398c-31b3-4a80-8e04-268cd527c1f4-kube-api-access-mcwfm\") pod \"alertmanager-main-0\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:44.737213 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.737144 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-g5wtx" event={"ID":"87150cb8-5f8b-431e-9a3e-04dc45ef494c","Type":"ContainerStarted","Data":"7e6eb80909d0308998cdc290c6d468e14528877049b5787400837b13c3aaaf74"} Apr 16 20:14:44.758607 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.758583 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47a5398c-31b3-4a80-8e04-268cd527c1f4-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:44.762018 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.760362 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47a5398c-31b3-4a80-8e04-268cd527c1f4-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:44.995280 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:44.995259 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:45.145689 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:14:45.145657 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47a5398c_31b3_4a80_8e04_268cd527c1f4.slice/crio-8a46f573eda403fdc888acdd19ebd555e57a6c190cc5f6033d7da051b66c09fc WatchSource:0}: Error finding container 8a46f573eda403fdc888acdd19ebd555e57a6c190cc5f6033d7da051b66c09fc: Status 404 returned error can't find the container with id 8a46f573eda403fdc888acdd19ebd555e57a6c190cc5f6033d7da051b66c09fc Apr 16 20:14:45.149419 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:45.149392 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 20:14:45.740889 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:45.740834 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"47a5398c-31b3-4a80-8e04-268cd527c1f4","Type":"ContainerStarted","Data":"8a46f573eda403fdc888acdd19ebd555e57a6c190cc5f6033d7da051b66c09fc"} Apr 16 20:14:45.742491 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:45.742454 2572 generic.go:358] "Generic (PLEG): container finished" podID="87150cb8-5f8b-431e-9a3e-04dc45ef494c" containerID="3a64e705537688620120ddaee7b7220144c282a0d07d45573c3174113f61de3f" exitCode=0 Apr 16 20:14:45.742659 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:45.742495 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-g5wtx" event={"ID":"87150cb8-5f8b-431e-9a3e-04dc45ef494c","Type":"ContainerDied","Data":"3a64e705537688620120ddaee7b7220144c282a0d07d45573c3174113f61de3f"} Apr 16 20:14:46.747378 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:46.747345 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-g5wtx" 
event={"ID":"87150cb8-5f8b-431e-9a3e-04dc45ef494c","Type":"ContainerStarted","Data":"9cb30752e9f10d9c422ebdae818121f4012e5e357321a710a5c37ea588066478"} Apr 16 20:14:46.747378 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:46.747382 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-g5wtx" event={"ID":"87150cb8-5f8b-431e-9a3e-04dc45ef494c","Type":"ContainerStarted","Data":"d5f9744f02422e01553a1f6555ae857aecd6a96d7d339210d3dd973e059dc69c"} Apr 16 20:14:46.748631 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:46.748611 2572 generic.go:358] "Generic (PLEG): container finished" podID="47a5398c-31b3-4a80-8e04-268cd527c1f4" containerID="45e4215d6d76ebf40a947a7539a86cebba4896cef826298a3712e5bfab119657" exitCode=0 Apr 16 20:14:46.748700 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:46.748650 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"47a5398c-31b3-4a80-8e04-268cd527c1f4","Type":"ContainerDied","Data":"45e4215d6d76ebf40a947a7539a86cebba4896cef826298a3712e5bfab119657"} Apr 16 20:14:46.768814 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:46.768772 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-g5wtx" podStartSLOduration=3.971326781 podStartE2EDuration="4.768757967s" podCreationTimestamp="2026-04-16 20:14:42 +0000 UTC" firstStartedPulling="2026-04-16 20:14:43.946659425 +0000 UTC m=+175.383647911" lastFinishedPulling="2026-04-16 20:14:44.744090608 +0000 UTC m=+176.181079097" observedRunningTime="2026-04-16 20:14:46.768556375 +0000 UTC m=+178.205544895" watchObservedRunningTime="2026-04-16 20:14:46.768757967 +0000 UTC m=+178.205746474" Apr 16 20:14:47.422712 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:47.422678 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-6b8899fd8-cjcz2"] Apr 16 20:14:47.427342 ip-10-0-138-62 kubenswrapper[2572]: I0416 
20:14:47.427313 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6b8899fd8-cjcz2" Apr 16 20:14:47.429610 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:47.429583 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 16 20:14:47.430303 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:47.430279 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 16 20:14:47.430427 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:47.430309 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-ad2j4rg1j95kh\"" Apr 16 20:14:47.430427 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:47.430388 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 16 20:14:47.430540 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:47.430528 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 20:14:47.430540 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:47.430534 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-w7k7x\"" Apr 16 20:14:47.437826 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:47.437801 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6b8899fd8-cjcz2"] Apr 16 20:14:47.482809 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:47.482774 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/39ed6675-59d4-42e2-ab00-f407f8c1db04-secret-metrics-server-tls\") pod \"metrics-server-6b8899fd8-cjcz2\" 
(UID: \"39ed6675-59d4-42e2-ab00-f407f8c1db04\") " pod="openshift-monitoring/metrics-server-6b8899fd8-cjcz2" Apr 16 20:14:47.482809 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:47.482808 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/39ed6675-59d4-42e2-ab00-f407f8c1db04-audit-log\") pod \"metrics-server-6b8899fd8-cjcz2\" (UID: \"39ed6675-59d4-42e2-ab00-f407f8c1db04\") " pod="openshift-monitoring/metrics-server-6b8899fd8-cjcz2" Apr 16 20:14:47.483025 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:47.482828 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39ed6675-59d4-42e2-ab00-f407f8c1db04-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6b8899fd8-cjcz2\" (UID: \"39ed6675-59d4-42e2-ab00-f407f8c1db04\") " pod="openshift-monitoring/metrics-server-6b8899fd8-cjcz2" Apr 16 20:14:47.483025 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:47.482901 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/39ed6675-59d4-42e2-ab00-f407f8c1db04-secret-metrics-server-client-certs\") pod \"metrics-server-6b8899fd8-cjcz2\" (UID: \"39ed6675-59d4-42e2-ab00-f407f8c1db04\") " pod="openshift-monitoring/metrics-server-6b8899fd8-cjcz2" Apr 16 20:14:47.483107 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:47.483021 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/39ed6675-59d4-42e2-ab00-f407f8c1db04-metrics-server-audit-profiles\") pod \"metrics-server-6b8899fd8-cjcz2\" (UID: \"39ed6675-59d4-42e2-ab00-f407f8c1db04\") " pod="openshift-monitoring/metrics-server-6b8899fd8-cjcz2" Apr 16 20:14:47.483107 
ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:47.483055 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbjs5\" (UniqueName: \"kubernetes.io/projected/39ed6675-59d4-42e2-ab00-f407f8c1db04-kube-api-access-vbjs5\") pod \"metrics-server-6b8899fd8-cjcz2\" (UID: \"39ed6675-59d4-42e2-ab00-f407f8c1db04\") " pod="openshift-monitoring/metrics-server-6b8899fd8-cjcz2" Apr 16 20:14:47.483107 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:47.483088 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39ed6675-59d4-42e2-ab00-f407f8c1db04-client-ca-bundle\") pod \"metrics-server-6b8899fd8-cjcz2\" (UID: \"39ed6675-59d4-42e2-ab00-f407f8c1db04\") " pod="openshift-monitoring/metrics-server-6b8899fd8-cjcz2" Apr 16 20:14:47.583927 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:47.583892 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/39ed6675-59d4-42e2-ab00-f407f8c1db04-metrics-server-audit-profiles\") pod \"metrics-server-6b8899fd8-cjcz2\" (UID: \"39ed6675-59d4-42e2-ab00-f407f8c1db04\") " pod="openshift-monitoring/metrics-server-6b8899fd8-cjcz2" Apr 16 20:14:47.583927 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:47.583933 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vbjs5\" (UniqueName: \"kubernetes.io/projected/39ed6675-59d4-42e2-ab00-f407f8c1db04-kube-api-access-vbjs5\") pod \"metrics-server-6b8899fd8-cjcz2\" (UID: \"39ed6675-59d4-42e2-ab00-f407f8c1db04\") " pod="openshift-monitoring/metrics-server-6b8899fd8-cjcz2" Apr 16 20:14:47.584149 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:47.583955 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/39ed6675-59d4-42e2-ab00-f407f8c1db04-client-ca-bundle\") pod \"metrics-server-6b8899fd8-cjcz2\" (UID: \"39ed6675-59d4-42e2-ab00-f407f8c1db04\") " pod="openshift-monitoring/metrics-server-6b8899fd8-cjcz2" Apr 16 20:14:47.584149 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:47.583993 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/39ed6675-59d4-42e2-ab00-f407f8c1db04-secret-metrics-server-tls\") pod \"metrics-server-6b8899fd8-cjcz2\" (UID: \"39ed6675-59d4-42e2-ab00-f407f8c1db04\") " pod="openshift-monitoring/metrics-server-6b8899fd8-cjcz2" Apr 16 20:14:47.584149 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:47.584013 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/39ed6675-59d4-42e2-ab00-f407f8c1db04-audit-log\") pod \"metrics-server-6b8899fd8-cjcz2\" (UID: \"39ed6675-59d4-42e2-ab00-f407f8c1db04\") " pod="openshift-monitoring/metrics-server-6b8899fd8-cjcz2" Apr 16 20:14:47.584149 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:47.584033 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39ed6675-59d4-42e2-ab00-f407f8c1db04-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6b8899fd8-cjcz2\" (UID: \"39ed6675-59d4-42e2-ab00-f407f8c1db04\") " pod="openshift-monitoring/metrics-server-6b8899fd8-cjcz2" Apr 16 20:14:47.584149 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:47.584088 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/39ed6675-59d4-42e2-ab00-f407f8c1db04-secret-metrics-server-client-certs\") pod \"metrics-server-6b8899fd8-cjcz2\" (UID: \"39ed6675-59d4-42e2-ab00-f407f8c1db04\") " 
pod="openshift-monitoring/metrics-server-6b8899fd8-cjcz2" Apr 16 20:14:47.584484 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:47.584454 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/39ed6675-59d4-42e2-ab00-f407f8c1db04-audit-log\") pod \"metrics-server-6b8899fd8-cjcz2\" (UID: \"39ed6675-59d4-42e2-ab00-f407f8c1db04\") " pod="openshift-monitoring/metrics-server-6b8899fd8-cjcz2" Apr 16 20:14:47.584987 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:47.584960 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39ed6675-59d4-42e2-ab00-f407f8c1db04-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6b8899fd8-cjcz2\" (UID: \"39ed6675-59d4-42e2-ab00-f407f8c1db04\") " pod="openshift-monitoring/metrics-server-6b8899fd8-cjcz2" Apr 16 20:14:47.585378 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:47.585350 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/39ed6675-59d4-42e2-ab00-f407f8c1db04-metrics-server-audit-profiles\") pod \"metrics-server-6b8899fd8-cjcz2\" (UID: \"39ed6675-59d4-42e2-ab00-f407f8c1db04\") " pod="openshift-monitoring/metrics-server-6b8899fd8-cjcz2" Apr 16 20:14:47.586761 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:47.586738 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/39ed6675-59d4-42e2-ab00-f407f8c1db04-secret-metrics-server-tls\") pod \"metrics-server-6b8899fd8-cjcz2\" (UID: \"39ed6675-59d4-42e2-ab00-f407f8c1db04\") " pod="openshift-monitoring/metrics-server-6b8899fd8-cjcz2" Apr 16 20:14:47.586962 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:47.586941 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/39ed6675-59d4-42e2-ab00-f407f8c1db04-client-ca-bundle\") pod \"metrics-server-6b8899fd8-cjcz2\" (UID: \"39ed6675-59d4-42e2-ab00-f407f8c1db04\") " pod="openshift-monitoring/metrics-server-6b8899fd8-cjcz2" Apr 16 20:14:47.587026 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:47.586941 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/39ed6675-59d4-42e2-ab00-f407f8c1db04-secret-metrics-server-client-certs\") pod \"metrics-server-6b8899fd8-cjcz2\" (UID: \"39ed6675-59d4-42e2-ab00-f407f8c1db04\") " pod="openshift-monitoring/metrics-server-6b8899fd8-cjcz2" Apr 16 20:14:47.592479 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:47.592452 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbjs5\" (UniqueName: \"kubernetes.io/projected/39ed6675-59d4-42e2-ab00-f407f8c1db04-kube-api-access-vbjs5\") pod \"metrics-server-6b8899fd8-cjcz2\" (UID: \"39ed6675-59d4-42e2-ab00-f407f8c1db04\") " pod="openshift-monitoring/metrics-server-6b8899fd8-cjcz2" Apr 16 20:14:47.740418 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:47.740327 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-6b8899fd8-cjcz2" Apr 16 20:14:47.877607 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:47.877579 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6b8899fd8-cjcz2"] Apr 16 20:14:48.154744 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:14:48.154714 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39ed6675_59d4_42e2_ab00_f407f8c1db04.slice/crio-e2c39c9471873682e7f025ff7f8e1e32793aba31b4e708f0243112dd2a269238 WatchSource:0}: Error finding container e2c39c9471873682e7f025ff7f8e1e32793aba31b4e708f0243112dd2a269238: Status 404 returned error can't find the container with id e2c39c9471873682e7f025ff7f8e1e32793aba31b4e708f0243112dd2a269238 Apr 16 20:14:48.764535 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:48.764500 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"47a5398c-31b3-4a80-8e04-268cd527c1f4","Type":"ContainerStarted","Data":"a15219b0e50e63a3403208ae22e1319d180527d2d7a55290e3623e6c760eeab4"} Apr 16 20:14:48.764535 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:48.764539 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"47a5398c-31b3-4a80-8e04-268cd527c1f4","Type":"ContainerStarted","Data":"3b0c5439e835f8f4a73d71a32d586f3873d99bd13f8e9099a3a5fb89d8c4a476"} Apr 16 20:14:48.764765 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:48.764552 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"47a5398c-31b3-4a80-8e04-268cd527c1f4","Type":"ContainerStarted","Data":"122ab79f926574451855730acd38c67f504f24a4d4c32a7351416e3a13253e9a"} Apr 16 20:14:48.764765 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:48.764565 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"47a5398c-31b3-4a80-8e04-268cd527c1f4","Type":"ContainerStarted","Data":"82f398d770176f2a1bd226d97de74c0bde0c0acd9148fca7b5aa9ae0abf88bdc"} Apr 16 20:14:48.764765 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:48.764579 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"47a5398c-31b3-4a80-8e04-268cd527c1f4","Type":"ContainerStarted","Data":"6fd5a488bce842167d02bd603ce91afa2d1b472acb25ce338d5c7324f9fe8c56"} Apr 16 20:14:48.765764 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:48.765737 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6b8899fd8-cjcz2" event={"ID":"39ed6675-59d4-42e2-ab00-f407f8c1db04","Type":"ContainerStarted","Data":"e2c39c9471873682e7f025ff7f8e1e32793aba31b4e708f0243112dd2a269238"} Apr 16 20:14:49.356075 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.353958 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 20:14:49.358818 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.358789 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.361151 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.361125 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 20:14:49.361432 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.361412 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 20:14:49.361545 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.361528 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 20:14:49.361608 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.361567 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-4h7pohcsqshto\"" Apr 16 20:14:49.361896 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.361852 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 20:14:49.361975 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.361933 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 20:14:49.362214 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.362177 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 20:14:49.362298 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.362240 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 20:14:49.362298 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.362202 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 20:14:49.362402 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.362202 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 20:14:49.365017 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.362839 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-gcm88\"" Apr 16 20:14:49.365017 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.363514 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 20:14:49.367242 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.367215 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 20:14:49.379556 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.378862 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 20:14:49.382507 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.382456 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 20:14:49.399421 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.399381 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.399689 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.399661 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.399820 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.399706 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.399820 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.399769 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-config-out\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.399820 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.399812 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.399997 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.399839 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 
20:14:49.399997 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.399895 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.399997 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.399926 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-web-config\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.399997 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.399953 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.400205 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.400034 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.400205 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.400098 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.400205 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.400168 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.400336 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.400267 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.400336 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.400312 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.400428 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.400339 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzmft\" (UniqueName: \"kubernetes.io/projected/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-kube-api-access-xzmft\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.400428 ip-10-0-138-62 
kubenswrapper[2572]: I0416 20:14:49.400367 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-config\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.400428 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.400419 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.401101 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.401017 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.502070 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.502034 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-config-out\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.502070 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.502072 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 
20:14:49.502293 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.502090 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.502293 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.502112 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.502293 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.502142 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-web-config\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.502293 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.502167 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.502293 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.502193 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.502293 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.502220 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.502293 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.502275 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.502609 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.502300 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.502609 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.502340 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.502609 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.502367 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xzmft\" (UniqueName: 
\"kubernetes.io/projected/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-kube-api-access-xzmft\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.502609 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.502394 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-config\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.502609 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.502416 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.502609 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.502447 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.502609 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.502484 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.502609 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.502523 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.502609 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.502554 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.503152 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.503095 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.504139 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.504109 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.504741 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.504712 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.505209 ip-10-0-138-62 
kubenswrapper[2572]: I0416 20:14:49.505170 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-config-out\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.505510 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.505485 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.505666 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.505641 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.506208 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.506182 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.506758 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.506467 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.508473 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.508150 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.508473 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.508168 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-web-config\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.509177 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.509135 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.509260 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.509161 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.509557 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.509514 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 
20:14:49.510690 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.510664 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.511972 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.511939 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.512611 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.512204 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-config\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.512701 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.512687 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.513609 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.513577 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzmft\" (UniqueName: \"kubernetes.io/projected/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-kube-api-access-xzmft\") pod \"prometheus-k8s-0\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.675172 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.675152 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:49.775796 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.775704 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"47a5398c-31b3-4a80-8e04-268cd527c1f4","Type":"ContainerStarted","Data":"17c4cb71cb6905eddbbce5a9e869041738296bba9363402fb66afac806e4925b"} Apr 16 20:14:49.780743 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.780270 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6b8899fd8-cjcz2" event={"ID":"39ed6675-59d4-42e2-ab00-f407f8c1db04","Type":"ContainerStarted","Data":"5f43fbdefb28d958a1b252ebde2067bd9ea0c4876d2ba17c5dc5f7b117544037"} Apr 16 20:14:49.808821 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.807247 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.328204202 podStartE2EDuration="5.807230291s" podCreationTimestamp="2026-04-16 20:14:44 +0000 UTC" firstStartedPulling="2026-04-16 20:14:45.147912526 +0000 UTC m=+176.584901012" lastFinishedPulling="2026-04-16 20:14:49.626938606 +0000 UTC m=+181.063927101" observedRunningTime="2026-04-16 20:14:49.80497002 +0000 UTC m=+181.241958526" watchObservedRunningTime="2026-04-16 20:14:49.807230291 +0000 UTC m=+181.244218799" Apr 16 20:14:49.827635 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.827412 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-6b8899fd8-cjcz2" podStartSLOduration=1.310923107 podStartE2EDuration="2.827397401s" podCreationTimestamp="2026-04-16 20:14:47 +0000 UTC" firstStartedPulling="2026-04-16 20:14:48.156657171 +0000 UTC m=+179.593645657" 
lastFinishedPulling="2026-04-16 20:14:49.673131463 +0000 UTC m=+181.110119951" observedRunningTime="2026-04-16 20:14:49.825509716 +0000 UTC m=+181.262498257" watchObservedRunningTime="2026-04-16 20:14:49.827397401 +0000 UTC m=+181.264385905" Apr 16 20:14:49.835151 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:49.835125 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 20:14:50.784503 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:50.784468 2572 generic.go:358] "Generic (PLEG): container finished" podID="7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" containerID="08aeb652e17b8ac770418cdeb0dcdbb0bd030687398be025423024f871986e7c" exitCode=0 Apr 16 20:14:50.784893 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:50.784553 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70","Type":"ContainerDied","Data":"08aeb652e17b8ac770418cdeb0dcdbb0bd030687398be025423024f871986e7c"} Apr 16 20:14:50.784893 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:50.784587 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70","Type":"ContainerStarted","Data":"0a944920e243fc1f95b02f5ccf7e84f161ea3c9b46b3d4311e05c547f4f6709a"} Apr 16 20:14:51.687651 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:51.687619 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6676cf5bc-962mq" Apr 16 20:14:53.798864 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:53.798822 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70","Type":"ContainerStarted","Data":"81845c62ce865d3fc388a94ba9a651cffb0c8e4b451dc1f682628c88d0e5f5ca"} Apr 16 20:14:53.799347 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:53.798893 2572 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70","Type":"ContainerStarted","Data":"b04dbdc36a4d9c8bfc1dd84ff14054966613e78f4361ff750c64ab5f702d212a"} Apr 16 20:14:54.683721 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:54.683649 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-9455768c8-dnlpz" podUID="da80589c-6efe-43a7-bd6f-c9394f610209" containerName="registry" containerID="cri-o://bec66e737aedb07fe4df8a1d77f4fd35c3982e89cc1c2d8ae8691c50906cd823" gracePeriod=30 Apr 16 20:14:54.803213 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:54.803168 2572 generic.go:358] "Generic (PLEG): container finished" podID="da80589c-6efe-43a7-bd6f-c9394f610209" containerID="bec66e737aedb07fe4df8a1d77f4fd35c3982e89cc1c2d8ae8691c50906cd823" exitCode=0 Apr 16 20:14:54.803516 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:54.803236 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-9455768c8-dnlpz" event={"ID":"da80589c-6efe-43a7-bd6f-c9394f610209","Type":"ContainerDied","Data":"bec66e737aedb07fe4df8a1d77f4fd35c3982e89cc1c2d8ae8691c50906cd823"} Apr 16 20:14:55.506315 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:55.506288 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-9455768c8-dnlpz" Apr 16 20:14:55.572058 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:55.572035 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da80589c-6efe-43a7-bd6f-c9394f610209-bound-sa-token\") pod \"da80589c-6efe-43a7-bd6f-c9394f610209\" (UID: \"da80589c-6efe-43a7-bd6f-c9394f610209\") " Apr 16 20:14:55.572160 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:55.572069 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/da80589c-6efe-43a7-bd6f-c9394f610209-registry-tls\") pod \"da80589c-6efe-43a7-bd6f-c9394f610209\" (UID: \"da80589c-6efe-43a7-bd6f-c9394f610209\") " Apr 16 20:14:55.572160 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:55.572088 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/da80589c-6efe-43a7-bd6f-c9394f610209-trusted-ca\") pod \"da80589c-6efe-43a7-bd6f-c9394f610209\" (UID: \"da80589c-6efe-43a7-bd6f-c9394f610209\") " Apr 16 20:14:55.572160 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:55.572110 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsrxw\" (UniqueName: \"kubernetes.io/projected/da80589c-6efe-43a7-bd6f-c9394f610209-kube-api-access-qsrxw\") pod \"da80589c-6efe-43a7-bd6f-c9394f610209\" (UID: \"da80589c-6efe-43a7-bd6f-c9394f610209\") " Apr 16 20:14:55.572160 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:55.572140 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/da80589c-6efe-43a7-bd6f-c9394f610209-installation-pull-secrets\") pod \"da80589c-6efe-43a7-bd6f-c9394f610209\" (UID: \"da80589c-6efe-43a7-bd6f-c9394f610209\") " Apr 16 20:14:55.572369 
ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:55.572209 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/da80589c-6efe-43a7-bd6f-c9394f610209-image-registry-private-configuration\") pod \"da80589c-6efe-43a7-bd6f-c9394f610209\" (UID: \"da80589c-6efe-43a7-bd6f-c9394f610209\") " Apr 16 20:14:55.572369 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:55.572263 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/da80589c-6efe-43a7-bd6f-c9394f610209-registry-certificates\") pod \"da80589c-6efe-43a7-bd6f-c9394f610209\" (UID: \"da80589c-6efe-43a7-bd6f-c9394f610209\") " Apr 16 20:14:55.572369 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:55.572287 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/da80589c-6efe-43a7-bd6f-c9394f610209-ca-trust-extracted\") pod \"da80589c-6efe-43a7-bd6f-c9394f610209\" (UID: \"da80589c-6efe-43a7-bd6f-c9394f610209\") " Apr 16 20:14:55.572615 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:55.572583 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da80589c-6efe-43a7-bd6f-c9394f610209-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "da80589c-6efe-43a7-bd6f-c9394f610209" (UID: "da80589c-6efe-43a7-bd6f-c9394f610209"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:14:55.573002 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:55.572934 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da80589c-6efe-43a7-bd6f-c9394f610209-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "da80589c-6efe-43a7-bd6f-c9394f610209" (UID: "da80589c-6efe-43a7-bd6f-c9394f610209"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:14:55.574434 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:55.574407 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da80589c-6efe-43a7-bd6f-c9394f610209-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "da80589c-6efe-43a7-bd6f-c9394f610209" (UID: "da80589c-6efe-43a7-bd6f-c9394f610209"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:14:55.574632 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:55.574602 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da80589c-6efe-43a7-bd6f-c9394f610209-kube-api-access-qsrxw" (OuterVolumeSpecName: "kube-api-access-qsrxw") pod "da80589c-6efe-43a7-bd6f-c9394f610209" (UID: "da80589c-6efe-43a7-bd6f-c9394f610209"). InnerVolumeSpecName "kube-api-access-qsrxw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:14:55.574725 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:55.574657 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da80589c-6efe-43a7-bd6f-c9394f610209-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "da80589c-6efe-43a7-bd6f-c9394f610209" (UID: "da80589c-6efe-43a7-bd6f-c9394f610209"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:14:55.574833 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:55.574804 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da80589c-6efe-43a7-bd6f-c9394f610209-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "da80589c-6efe-43a7-bd6f-c9394f610209" (UID: "da80589c-6efe-43a7-bd6f-c9394f610209"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:14:55.574954 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:55.574889 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da80589c-6efe-43a7-bd6f-c9394f610209-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "da80589c-6efe-43a7-bd6f-c9394f610209" (UID: "da80589c-6efe-43a7-bd6f-c9394f610209"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:14:55.583381 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:55.583238 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da80589c-6efe-43a7-bd6f-c9394f610209-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "da80589c-6efe-43a7-bd6f-c9394f610209" (UID: "da80589c-6efe-43a7-bd6f-c9394f610209"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:14:55.673210 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:55.673075 2572 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da80589c-6efe-43a7-bd6f-c9394f610209-bound-sa-token\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:14:55.673210 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:55.673104 2572 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/da80589c-6efe-43a7-bd6f-c9394f610209-registry-tls\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:14:55.673210 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:55.673118 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/da80589c-6efe-43a7-bd6f-c9394f610209-trusted-ca\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:14:55.673210 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:55.673132 2572 
reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qsrxw\" (UniqueName: \"kubernetes.io/projected/da80589c-6efe-43a7-bd6f-c9394f610209-kube-api-access-qsrxw\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:14:55.673210 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:55.673147 2572 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/da80589c-6efe-43a7-bd6f-c9394f610209-installation-pull-secrets\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:14:55.673210 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:55.673161 2572 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/da80589c-6efe-43a7-bd6f-c9394f610209-image-registry-private-configuration\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:14:55.673210 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:55.673175 2572 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/da80589c-6efe-43a7-bd6f-c9394f610209-registry-certificates\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:14:55.673210 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:55.673188 2572 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/da80589c-6efe-43a7-bd6f-c9394f610209-ca-trust-extracted\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:14:55.806765 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:55.806740 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-9455768c8-dnlpz" Apr 16 20:14:55.806765 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:55.806740 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-9455768c8-dnlpz" event={"ID":"da80589c-6efe-43a7-bd6f-c9394f610209","Type":"ContainerDied","Data":"cfc07d1fd8c41fcfbd0c6b003953324e5d454b4ca9e46768ee1dfa32e7cf6a3e"} Apr 16 20:14:55.807181 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:55.806780 2572 scope.go:117] "RemoveContainer" containerID="bec66e737aedb07fe4df8a1d77f4fd35c3982e89cc1c2d8ae8691c50906cd823" Apr 16 20:14:55.809586 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:55.809563 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70","Type":"ContainerStarted","Data":"d4ebe45ae76dd749b487a49f3da3975492a995dacf80fb95710c866aaeb76ea4"} Apr 16 20:14:55.809745 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:55.809702 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70","Type":"ContainerStarted","Data":"cb7215972689cc6c71521f2d70f82e751e1ee7e79a5f129eb2e1caf96fea7175"} Apr 16 20:14:55.809745 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:55.809715 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70","Type":"ContainerStarted","Data":"0d5f5b226e087df52a35f3f5486025bf68c782d469d540dc86a9a91ff0fa6f44"} Apr 16 20:14:55.832155 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:55.832124 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-9455768c8-dnlpz"] Apr 16 20:14:55.835446 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:55.835418 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-image-registry/image-registry-9455768c8-dnlpz"] Apr 16 20:14:56.815604 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:56.815574 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70","Type":"ContainerStarted","Data":"c02375bb669ac5c088aeedd4a36bfed892009be2b049bfa67f6b439c2385bb67"} Apr 16 20:14:56.846148 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:56.846089 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=3.088700858 podStartE2EDuration="7.846074081s" podCreationTimestamp="2026-04-16 20:14:49 +0000 UTC" firstStartedPulling="2026-04-16 20:14:50.785647064 +0000 UTC m=+182.222635551" lastFinishedPulling="2026-04-16 20:14:55.543020288 +0000 UTC m=+186.980008774" observedRunningTime="2026-04-16 20:14:56.845260487 +0000 UTC m=+188.282248998" watchObservedRunningTime="2026-04-16 20:14:56.846074081 +0000 UTC m=+188.283062590" Apr 16 20:14:57.148539 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:57.148462 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da80589c-6efe-43a7-bd6f-c9394f610209" path="/var/lib/kubelet/pods/da80589c-6efe-43a7-bd6f-c9394f610209/volumes" Apr 16 20:14:59.675559 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:14:59.675522 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:15:07.741294 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:15:07.741256 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-6b8899fd8-cjcz2" Apr 16 20:15:07.741294 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:15:07.741294 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-6b8899fd8-cjcz2" Apr 16 20:15:15.681097 ip-10-0-138-62 kubenswrapper[2572]: 
I0416 20:15:15.681069 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_47a5398c-31b3-4a80-8e04-268cd527c1f4/init-config-reloader/0.log"
Apr 16 20:15:15.688254 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:15:15.688233 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_47a5398c-31b3-4a80-8e04-268cd527c1f4/alertmanager/0.log"
Apr 16 20:15:15.781983 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:15:15.781958 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_47a5398c-31b3-4a80-8e04-268cd527c1f4/config-reloader/0.log"
Apr 16 20:15:15.982834 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:15:15.982760 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_47a5398c-31b3-4a80-8e04-268cd527c1f4/kube-rbac-proxy-web/0.log"
Apr 16 20:15:16.184521 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:15:16.184482 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_47a5398c-31b3-4a80-8e04-268cd527c1f4/kube-rbac-proxy/0.log"
Apr 16 20:15:16.386919 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:15:16.386866 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_47a5398c-31b3-4a80-8e04-268cd527c1f4/kube-rbac-proxy-metric/0.log"
Apr 16 20:15:16.583432 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:15:16.583407 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_47a5398c-31b3-4a80-8e04-268cd527c1f4/prom-label-proxy/0.log"
Apr 16 20:15:16.782767 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:15:16.782736 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-tc597_611adbad-0f1f-4c66-8f17-17f5a789a903/cluster-monitoring-operator/0.log"
Apr 16 20:15:17.582544 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:15:17.582511 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-6b8899fd8-cjcz2_39ed6675-59d4-42e2-ab00-f407f8c1db04/metrics-server/0.log"
Apr 16 20:15:17.981976 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:15:17.981907 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-g5wtx_87150cb8-5f8b-431e-9a3e-04dc45ef494c/init-textfile/0.log"
Apr 16 20:15:18.182725 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:15:18.182698 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-g5wtx_87150cb8-5f8b-431e-9a3e-04dc45ef494c/node-exporter/0.log"
Apr 16 20:15:18.382362 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:15:18.382332 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-g5wtx_87150cb8-5f8b-431e-9a3e-04dc45ef494c/kube-rbac-proxy/0.log"
Apr 16 20:15:20.383371 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:15:20.383341 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70/init-config-reloader/0.log"
Apr 16 20:15:20.583799 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:15:20.583763 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70/prometheus/0.log"
Apr 16 20:15:20.782634 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:15:20.782602 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70/config-reloader/0.log"
Apr 16 20:15:20.982260 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:15:20.982232 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70/thanos-sidecar/0.log"
Apr 16 20:15:21.181752 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:15:21.181678 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70/kube-rbac-proxy-web/0.log"
Apr 16 20:15:21.381830 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:15:21.381804 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70/kube-rbac-proxy/0.log"
Apr 16 20:15:21.581704 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:15:21.581678 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70/kube-rbac-proxy-thanos/0.log"
Apr 16 20:15:21.783402 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:15:21.783370 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-v98ll_37613547-0ef2-4819-bd32-36b4865cf714/prometheus-operator/0.log"
Apr 16 20:15:21.982683 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:15:21.982607 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-v98ll_37613547-0ef2-4819-bd32-36b4865cf714/kube-rbac-proxy/0.log"
Apr 16 20:15:22.182057 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:15:22.182027 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-6rfzr_3958a2b6-30b2-4633-b471-4e059b8de73a/prometheus-operator-admission-webhook/0.log"
Apr 16 20:15:23.891563 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:15:23.891526 2572 generic.go:358] "Generic (PLEG): container finished" podID="280734e1-0b9c-4cc5-9274-f8058780a728" containerID="69581c307ba438089ff13af1d10e7ed24f7585743625fb82bdbc948bea45e9c4" exitCode=0
Apr 16 20:15:23.891952 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:15:23.891602 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tbxfb" event={"ID":"280734e1-0b9c-4cc5-9274-f8058780a728","Type":"ContainerDied","Data":"69581c307ba438089ff13af1d10e7ed24f7585743625fb82bdbc948bea45e9c4"}
Apr 16 20:15:23.891952 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:15:23.891935 2572 scope.go:117] "RemoveContainer" containerID="69581c307ba438089ff13af1d10e7ed24f7585743625fb82bdbc948bea45e9c4"
Apr 16 20:15:24.182049 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:15:24.182010 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-9gbgs_8358b0b1-4c45-4118-bd03-a851a409b99e/networking-console-plugin/0.log"
Apr 16 20:15:24.895483 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:15:24.895450 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tbxfb" event={"ID":"280734e1-0b9c-4cc5-9274-f8058780a728","Type":"ContainerStarted","Data":"97b723b34f839a3609a4183c7d0b4a844da886a04828be4cdfa255c5a747243c"}
Apr 16 20:15:27.755914 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:15:27.755861 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-6b8899fd8-cjcz2"
Apr 16 20:15:27.759795 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:15:27.759768 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-6b8899fd8-cjcz2"
Apr 16 20:15:33.926024 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:15:33.925994 2572 generic.go:358] "Generic (PLEG): container finished" podID="b6b38707-68a6-4045-bbe5-f614a88439b1" containerID="6d85974e2faa052248bd38e049bb96138ed9d00ff39a441c69a65c229a96316e" exitCode=0
Apr 16 20:15:33.926420 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:15:33.926064 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-n994t" event={"ID":"b6b38707-68a6-4045-bbe5-f614a88439b1","Type":"ContainerDied","Data":"6d85974e2faa052248bd38e049bb96138ed9d00ff39a441c69a65c229a96316e"}
Apr 16 20:15:33.926420 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:15:33.926327 2572 scope.go:117] "RemoveContainer" containerID="6d85974e2faa052248bd38e049bb96138ed9d00ff39a441c69a65c229a96316e"
Apr 16 20:15:34.930954 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:15:34.930916 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-n994t" event={"ID":"b6b38707-68a6-4045-bbe5-f614a88439b1","Type":"ContainerStarted","Data":"1e7f73ec16f66abb215b9a42f545451059656b5b27da1b60abd31562f02d712a"}
Apr 16 20:15:49.676090 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:15:49.676055 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:49.695239 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:15:49.695208 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:49.992645 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:15:49.992559 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:15:59.834820 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:15:59.834735 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8e680955-60f7-4aaf-9aeb-b5efc9759ed4-metrics-certs\") pod \"network-metrics-daemon-5bd8l\" (UID: \"8e680955-60f7-4aaf-9aeb-b5efc9759ed4\") " pod="openshift-multus/network-metrics-daemon-5bd8l"
Apr 16 20:15:59.837156 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:15:59.837130 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8e680955-60f7-4aaf-9aeb-b5efc9759ed4-metrics-certs\") pod \"network-metrics-daemon-5bd8l\" (UID: \"8e680955-60f7-4aaf-9aeb-b5efc9759ed4\") " pod="openshift-multus/network-metrics-daemon-5bd8l"
Apr 16 20:15:59.948183 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:15:59.948159 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-8662n\""
Apr 16 20:15:59.956216 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:15:59.956195 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bd8l"
Apr 16 20:16:00.094490 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:00.094413 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5bd8l"]
Apr 16 20:16:00.097681 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:16:00.097652 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e680955_60f7_4aaf_9aeb_b5efc9759ed4.slice/crio-b52d182abb88750346663b911056313ff022c34fa2f8ca361f779ad7f91be58d WatchSource:0}: Error finding container b52d182abb88750346663b911056313ff022c34fa2f8ca361f779ad7f91be58d: Status 404 returned error can't find the container with id b52d182abb88750346663b911056313ff022c34fa2f8ca361f779ad7f91be58d
Apr 16 20:16:01.007823 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:01.007784 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5bd8l" event={"ID":"8e680955-60f7-4aaf-9aeb-b5efc9759ed4","Type":"ContainerStarted","Data":"b52d182abb88750346663b911056313ff022c34fa2f8ca361f779ad7f91be58d"}
Apr 16 20:16:02.016294 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:02.016253 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5bd8l" event={"ID":"8e680955-60f7-4aaf-9aeb-b5efc9759ed4","Type":"ContainerStarted","Data":"e0ccf8027cbfd0ee6ef228ced5cb38e122d8b324c9d852f6aa11b7b05df4872c"}
Apr 16 20:16:02.016294 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:02.016292 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5bd8l" event={"ID":"8e680955-60f7-4aaf-9aeb-b5efc9759ed4","Type":"ContainerStarted","Data":"d49d43ab0214daf850cf8ba4d04ef2e40f63af02c3e251bba744ad95079ba12b"}
Apr 16 20:16:02.032566 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:02.032517 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-5bd8l" podStartSLOduration=252.159165338 podStartE2EDuration="4m13.03250349s" podCreationTimestamp="2026-04-16 20:11:49 +0000 UTC" firstStartedPulling="2026-04-16 20:16:00.099369146 +0000 UTC m=+251.536357632" lastFinishedPulling="2026-04-16 20:16:00.972707294 +0000 UTC m=+252.409695784" observedRunningTime="2026-04-16 20:16:02.031443045 +0000 UTC m=+253.468431558" watchObservedRunningTime="2026-04-16 20:16:02.03250349 +0000 UTC m=+253.469492031"
Apr 16 20:16:03.473987 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:03.473949 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 20:16:03.474587 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:03.474525 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="47a5398c-31b3-4a80-8e04-268cd527c1f4" containerName="alertmanager" containerID="cri-o://6fd5a488bce842167d02bd603ce91afa2d1b472acb25ce338d5c7324f9fe8c56" gracePeriod=120
Apr 16 20:16:03.474729 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:03.474602 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="47a5398c-31b3-4a80-8e04-268cd527c1f4" containerName="kube-rbac-proxy-metric" containerID="cri-o://a15219b0e50e63a3403208ae22e1319d180527d2d7a55290e3623e6c760eeab4" gracePeriod=120
Apr 16 20:16:03.474729 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:03.474616 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="47a5398c-31b3-4a80-8e04-268cd527c1f4" containerName="prom-label-proxy" containerID="cri-o://17c4cb71cb6905eddbbce5a9e869041738296bba9363402fb66afac806e4925b" gracePeriod=120
Apr 16 20:16:03.474729 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:03.474674 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="47a5398c-31b3-4a80-8e04-268cd527c1f4" containerName="config-reloader" containerID="cri-o://82f398d770176f2a1bd226d97de74c0bde0c0acd9148fca7b5aa9ae0abf88bdc" gracePeriod=120
Apr 16 20:16:03.474729 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:03.474672 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="47a5398c-31b3-4a80-8e04-268cd527c1f4" containerName="kube-rbac-proxy-web" containerID="cri-o://122ab79f926574451855730acd38c67f504f24a4d4c32a7351416e3a13253e9a" gracePeriod=120
Apr 16 20:16:03.474976 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:03.474740 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="47a5398c-31b3-4a80-8e04-268cd527c1f4" containerName="kube-rbac-proxy" containerID="cri-o://3b0c5439e835f8f4a73d71a32d586f3873d99bd13f8e9099a3a5fb89d8c4a476" gracePeriod=120
Apr 16 20:16:04.025171 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.025134 2572 generic.go:358] "Generic (PLEG): container finished" podID="47a5398c-31b3-4a80-8e04-268cd527c1f4" containerID="17c4cb71cb6905eddbbce5a9e869041738296bba9363402fb66afac806e4925b" exitCode=0
Apr 16 20:16:04.025171 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.025161 2572 generic.go:358] "Generic (PLEG): container finished" podID="47a5398c-31b3-4a80-8e04-268cd527c1f4" containerID="a15219b0e50e63a3403208ae22e1319d180527d2d7a55290e3623e6c760eeab4" exitCode=0
Apr 16 20:16:04.025171 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.025169 2572 generic.go:358] "Generic (PLEG): container finished" podID="47a5398c-31b3-4a80-8e04-268cd527c1f4" containerID="3b0c5439e835f8f4a73d71a32d586f3873d99bd13f8e9099a3a5fb89d8c4a476" exitCode=0
Apr 16 20:16:04.025171 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.025176 2572 generic.go:358] "Generic (PLEG): container finished" podID="47a5398c-31b3-4a80-8e04-268cd527c1f4" containerID="82f398d770176f2a1bd226d97de74c0bde0c0acd9148fca7b5aa9ae0abf88bdc" exitCode=0
Apr 16 20:16:04.025171 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.025181 2572 generic.go:358] "Generic (PLEG): container finished" podID="47a5398c-31b3-4a80-8e04-268cd527c1f4" containerID="6fd5a488bce842167d02bd603ce91afa2d1b472acb25ce338d5c7324f9fe8c56" exitCode=0
Apr 16 20:16:04.025456 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.025204 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"47a5398c-31b3-4a80-8e04-268cd527c1f4","Type":"ContainerDied","Data":"17c4cb71cb6905eddbbce5a9e869041738296bba9363402fb66afac806e4925b"}
Apr 16 20:16:04.025456 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.025237 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"47a5398c-31b3-4a80-8e04-268cd527c1f4","Type":"ContainerDied","Data":"a15219b0e50e63a3403208ae22e1319d180527d2d7a55290e3623e6c760eeab4"}
Apr 16 20:16:04.025456 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.025249 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"47a5398c-31b3-4a80-8e04-268cd527c1f4","Type":"ContainerDied","Data":"3b0c5439e835f8f4a73d71a32d586f3873d99bd13f8e9099a3a5fb89d8c4a476"}
Apr 16 20:16:04.025456 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.025258 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"47a5398c-31b3-4a80-8e04-268cd527c1f4","Type":"ContainerDied","Data":"82f398d770176f2a1bd226d97de74c0bde0c0acd9148fca7b5aa9ae0abf88bdc"}
Apr 16 20:16:04.025456 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.025266 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"47a5398c-31b3-4a80-8e04-268cd527c1f4","Type":"ContainerDied","Data":"6fd5a488bce842167d02bd603ce91afa2d1b472acb25ce338d5c7324f9fe8c56"}
Apr 16 20:16:04.718190 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.718168 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:04.778346 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.778314 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/47a5398c-31b3-4a80-8e04-268cd527c1f4-cluster-tls-config\") pod \"47a5398c-31b3-4a80-8e04-268cd527c1f4\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") "
Apr 16 20:16:04.778500 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.778353 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/47a5398c-31b3-4a80-8e04-268cd527c1f4-config-volume\") pod \"47a5398c-31b3-4a80-8e04-268cd527c1f4\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") "
Apr 16 20:16:04.778500 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.778389 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/47a5398c-31b3-4a80-8e04-268cd527c1f4-web-config\") pod \"47a5398c-31b3-4a80-8e04-268cd527c1f4\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") "
Apr 16 20:16:04.778500 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.778411 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/47a5398c-31b3-4a80-8e04-268cd527c1f4-metrics-client-ca\") pod \"47a5398c-31b3-4a80-8e04-268cd527c1f4\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") "
Apr 16 20:16:04.778500 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.778441 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/47a5398c-31b3-4a80-8e04-268cd527c1f4-secret-alertmanager-main-tls\") pod \"47a5398c-31b3-4a80-8e04-268cd527c1f4\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") "
Apr 16 20:16:04.778500 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.778470 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/47a5398c-31b3-4a80-8e04-268cd527c1f4-secret-alertmanager-kube-rbac-proxy\") pod \"47a5398c-31b3-4a80-8e04-268cd527c1f4\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") "
Apr 16 20:16:04.778777 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.778499 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/47a5398c-31b3-4a80-8e04-268cd527c1f4-tls-assets\") pod \"47a5398c-31b3-4a80-8e04-268cd527c1f4\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") "
Apr 16 20:16:04.778831 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.778816 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47a5398c-31b3-4a80-8e04-268cd527c1f4-alertmanager-trusted-ca-bundle\") pod \"47a5398c-31b3-4a80-8e04-268cd527c1f4\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") "
Apr 16 20:16:04.778901 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.778827 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47a5398c-31b3-4a80-8e04-268cd527c1f4-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "47a5398c-31b3-4a80-8e04-268cd527c1f4" (UID: "47a5398c-31b3-4a80-8e04-268cd527c1f4"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:16:04.778966 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.778898 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/47a5398c-31b3-4a80-8e04-268cd527c1f4-secret-alertmanager-kube-rbac-proxy-metric\") pod \"47a5398c-31b3-4a80-8e04-268cd527c1f4\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") "
Apr 16 20:16:04.778966 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.778933 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcwfm\" (UniqueName: \"kubernetes.io/projected/47a5398c-31b3-4a80-8e04-268cd527c1f4-kube-api-access-mcwfm\") pod \"47a5398c-31b3-4a80-8e04-268cd527c1f4\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") "
Apr 16 20:16:04.779065 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.778980 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/47a5398c-31b3-4a80-8e04-268cd527c1f4-config-out\") pod \"47a5398c-31b3-4a80-8e04-268cd527c1f4\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") "
Apr 16 20:16:04.779113 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.779070 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/47a5398c-31b3-4a80-8e04-268cd527c1f4-alertmanager-main-db\") pod \"47a5398c-31b3-4a80-8e04-268cd527c1f4\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") "
Apr 16 20:16:04.779164 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.779120 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/47a5398c-31b3-4a80-8e04-268cd527c1f4-secret-alertmanager-kube-rbac-proxy-web\") pod \"47a5398c-31b3-4a80-8e04-268cd527c1f4\" (UID: \"47a5398c-31b3-4a80-8e04-268cd527c1f4\") "
Apr 16 20:16:04.779465 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.779277 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47a5398c-31b3-4a80-8e04-268cd527c1f4-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "47a5398c-31b3-4a80-8e04-268cd527c1f4" (UID: "47a5398c-31b3-4a80-8e04-268cd527c1f4"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:16:04.779465 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.779410 2572 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/47a5398c-31b3-4a80-8e04-268cd527c1f4-metrics-client-ca\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:16:04.779465 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.779430 2572 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47a5398c-31b3-4a80-8e04-268cd527c1f4-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:16:04.779797 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.779775 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47a5398c-31b3-4a80-8e04-268cd527c1f4-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "47a5398c-31b3-4a80-8e04-268cd527c1f4" (UID: "47a5398c-31b3-4a80-8e04-268cd527c1f4"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:16:04.781699 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.781662 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47a5398c-31b3-4a80-8e04-268cd527c1f4-config-volume" (OuterVolumeSpecName: "config-volume") pod "47a5398c-31b3-4a80-8e04-268cd527c1f4" (UID: "47a5398c-31b3-4a80-8e04-268cd527c1f4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:16:04.782106 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.782076 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47a5398c-31b3-4a80-8e04-268cd527c1f4-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "47a5398c-31b3-4a80-8e04-268cd527c1f4" (UID: "47a5398c-31b3-4a80-8e04-268cd527c1f4"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:16:04.782193 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.782174 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47a5398c-31b3-4a80-8e04-268cd527c1f4-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "47a5398c-31b3-4a80-8e04-268cd527c1f4" (UID: "47a5398c-31b3-4a80-8e04-268cd527c1f4"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:16:04.783157 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.783118 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47a5398c-31b3-4a80-8e04-268cd527c1f4-config-out" (OuterVolumeSpecName: "config-out") pod "47a5398c-31b3-4a80-8e04-268cd527c1f4" (UID: "47a5398c-31b3-4a80-8e04-268cd527c1f4"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:16:04.783415 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.783376 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47a5398c-31b3-4a80-8e04-268cd527c1f4-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "47a5398c-31b3-4a80-8e04-268cd527c1f4" (UID: "47a5398c-31b3-4a80-8e04-268cd527c1f4"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:16:04.783504 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.783407 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47a5398c-31b3-4a80-8e04-268cd527c1f4-kube-api-access-mcwfm" (OuterVolumeSpecName: "kube-api-access-mcwfm") pod "47a5398c-31b3-4a80-8e04-268cd527c1f4" (UID: "47a5398c-31b3-4a80-8e04-268cd527c1f4"). InnerVolumeSpecName "kube-api-access-mcwfm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:16:04.783504 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.783480 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47a5398c-31b3-4a80-8e04-268cd527c1f4-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "47a5398c-31b3-4a80-8e04-268cd527c1f4" (UID: "47a5398c-31b3-4a80-8e04-268cd527c1f4"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:16:04.783890 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.783846 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47a5398c-31b3-4a80-8e04-268cd527c1f4-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "47a5398c-31b3-4a80-8e04-268cd527c1f4" (UID: "47a5398c-31b3-4a80-8e04-268cd527c1f4"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:16:04.786447 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.786421 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47a5398c-31b3-4a80-8e04-268cd527c1f4-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "47a5398c-31b3-4a80-8e04-268cd527c1f4" (UID: "47a5398c-31b3-4a80-8e04-268cd527c1f4"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:16:04.793209 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.793183 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47a5398c-31b3-4a80-8e04-268cd527c1f4-web-config" (OuterVolumeSpecName: "web-config") pod "47a5398c-31b3-4a80-8e04-268cd527c1f4" (UID: "47a5398c-31b3-4a80-8e04-268cd527c1f4"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:16:04.880465 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.880373 2572 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/47a5398c-31b3-4a80-8e04-268cd527c1f4-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:16:04.880465 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.880415 2572 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/47a5398c-31b3-4a80-8e04-268cd527c1f4-cluster-tls-config\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:16:04.880465 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.880429 2572 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/47a5398c-31b3-4a80-8e04-268cd527c1f4-config-volume\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:16:04.880465 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.880443 2572 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/47a5398c-31b3-4a80-8e04-268cd527c1f4-web-config\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:16:04.880465 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.880456 2572 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/47a5398c-31b3-4a80-8e04-268cd527c1f4-secret-alertmanager-main-tls\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:16:04.880465 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.880469 2572 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/47a5398c-31b3-4a80-8e04-268cd527c1f4-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:16:04.880752 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.880484 2572 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/47a5398c-31b3-4a80-8e04-268cd527c1f4-tls-assets\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:16:04.880752 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.880496 2572 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/47a5398c-31b3-4a80-8e04-268cd527c1f4-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:16:04.880752 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.880509 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mcwfm\" (UniqueName: \"kubernetes.io/projected/47a5398c-31b3-4a80-8e04-268cd527c1f4-kube-api-access-mcwfm\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:16:04.880752 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.880522 2572 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/47a5398c-31b3-4a80-8e04-268cd527c1f4-config-out\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:16:04.880752 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:04.880534 2572 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/47a5398c-31b3-4a80-8e04-268cd527c1f4-alertmanager-main-db\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:16:05.030433 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.030403 2572 generic.go:358] "Generic (PLEG): container finished" podID="47a5398c-31b3-4a80-8e04-268cd527c1f4" containerID="122ab79f926574451855730acd38c67f504f24a4d4c32a7351416e3a13253e9a" exitCode=0
Apr 16 20:16:05.030571 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.030481 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"47a5398c-31b3-4a80-8e04-268cd527c1f4","Type":"ContainerDied","Data":"122ab79f926574451855730acd38c67f504f24a4d4c32a7351416e3a13253e9a"}
Apr 16 20:16:05.030571 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.030508 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:05.030571 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.030524 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"47a5398c-31b3-4a80-8e04-268cd527c1f4","Type":"ContainerDied","Data":"8a46f573eda403fdc888acdd19ebd555e57a6c190cc5f6033d7da051b66c09fc"}
Apr 16 20:16:05.030571 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.030541 2572 scope.go:117] "RemoveContainer" containerID="17c4cb71cb6905eddbbce5a9e869041738296bba9363402fb66afac806e4925b"
Apr 16 20:16:05.037543 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.037518 2572 scope.go:117] "RemoveContainer" containerID="a15219b0e50e63a3403208ae22e1319d180527d2d7a55290e3623e6c760eeab4"
Apr 16 20:16:05.044010 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.043994 2572 scope.go:117] "RemoveContainer" containerID="3b0c5439e835f8f4a73d71a32d586f3873d99bd13f8e9099a3a5fb89d8c4a476"
Apr 16 20:16:05.050127 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.050108 2572 scope.go:117] "RemoveContainer" containerID="122ab79f926574451855730acd38c67f504f24a4d4c32a7351416e3a13253e9a"
Apr 16 20:16:05.054236 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.054214 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 20:16:05.057597 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.057554 2572 scope.go:117] "RemoveContainer" containerID="82f398d770176f2a1bd226d97de74c0bde0c0acd9148fca7b5aa9ae0abf88bdc"
Apr 16 20:16:05.059095 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.059075 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 20:16:05.065267 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.065251 2572 scope.go:117] "RemoveContainer" containerID="6fd5a488bce842167d02bd603ce91afa2d1b472acb25ce338d5c7324f9fe8c56"
Apr 16 20:16:05.071543 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.071523 2572 scope.go:117] "RemoveContainer" containerID="45e4215d6d76ebf40a947a7539a86cebba4896cef826298a3712e5bfab119657"
Apr 16 20:16:05.077995 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.077978 2572 scope.go:117] "RemoveContainer" containerID="17c4cb71cb6905eddbbce5a9e869041738296bba9363402fb66afac806e4925b"
Apr 16 20:16:05.078254 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:16:05.078234 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17c4cb71cb6905eddbbce5a9e869041738296bba9363402fb66afac806e4925b\": container with ID starting with 17c4cb71cb6905eddbbce5a9e869041738296bba9363402fb66afac806e4925b not found: ID does not exist" containerID="17c4cb71cb6905eddbbce5a9e869041738296bba9363402fb66afac806e4925b"
Apr 16 20:16:05.078301 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.078262 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17c4cb71cb6905eddbbce5a9e869041738296bba9363402fb66afac806e4925b"} err="failed to
get container status \"17c4cb71cb6905eddbbce5a9e869041738296bba9363402fb66afac806e4925b\": rpc error: code = NotFound desc = could not find container \"17c4cb71cb6905eddbbce5a9e869041738296bba9363402fb66afac806e4925b\": container with ID starting with 17c4cb71cb6905eddbbce5a9e869041738296bba9363402fb66afac806e4925b not found: ID does not exist" Apr 16 20:16:05.078301 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.078295 2572 scope.go:117] "RemoveContainer" containerID="a15219b0e50e63a3403208ae22e1319d180527d2d7a55290e3623e6c760eeab4" Apr 16 20:16:05.078517 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:16:05.078500 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a15219b0e50e63a3403208ae22e1319d180527d2d7a55290e3623e6c760eeab4\": container with ID starting with a15219b0e50e63a3403208ae22e1319d180527d2d7a55290e3623e6c760eeab4 not found: ID does not exist" containerID="a15219b0e50e63a3403208ae22e1319d180527d2d7a55290e3623e6c760eeab4" Apr 16 20:16:05.078571 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.078522 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a15219b0e50e63a3403208ae22e1319d180527d2d7a55290e3623e6c760eeab4"} err="failed to get container status \"a15219b0e50e63a3403208ae22e1319d180527d2d7a55290e3623e6c760eeab4\": rpc error: code = NotFound desc = could not find container \"a15219b0e50e63a3403208ae22e1319d180527d2d7a55290e3623e6c760eeab4\": container with ID starting with a15219b0e50e63a3403208ae22e1319d180527d2d7a55290e3623e6c760eeab4 not found: ID does not exist" Apr 16 20:16:05.078571 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.078541 2572 scope.go:117] "RemoveContainer" containerID="3b0c5439e835f8f4a73d71a32d586f3873d99bd13f8e9099a3a5fb89d8c4a476" Apr 16 20:16:05.078762 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:16:05.078746 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"3b0c5439e835f8f4a73d71a32d586f3873d99bd13f8e9099a3a5fb89d8c4a476\": container with ID starting with 3b0c5439e835f8f4a73d71a32d586f3873d99bd13f8e9099a3a5fb89d8c4a476 not found: ID does not exist" containerID="3b0c5439e835f8f4a73d71a32d586f3873d99bd13f8e9099a3a5fb89d8c4a476" Apr 16 20:16:05.078804 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.078766 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b0c5439e835f8f4a73d71a32d586f3873d99bd13f8e9099a3a5fb89d8c4a476"} err="failed to get container status \"3b0c5439e835f8f4a73d71a32d586f3873d99bd13f8e9099a3a5fb89d8c4a476\": rpc error: code = NotFound desc = could not find container \"3b0c5439e835f8f4a73d71a32d586f3873d99bd13f8e9099a3a5fb89d8c4a476\": container with ID starting with 3b0c5439e835f8f4a73d71a32d586f3873d99bd13f8e9099a3a5fb89d8c4a476 not found: ID does not exist" Apr 16 20:16:05.078804 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.078778 2572 scope.go:117] "RemoveContainer" containerID="122ab79f926574451855730acd38c67f504f24a4d4c32a7351416e3a13253e9a" Apr 16 20:16:05.078997 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:16:05.078978 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"122ab79f926574451855730acd38c67f504f24a4d4c32a7351416e3a13253e9a\": container with ID starting with 122ab79f926574451855730acd38c67f504f24a4d4c32a7351416e3a13253e9a not found: ID does not exist" containerID="122ab79f926574451855730acd38c67f504f24a4d4c32a7351416e3a13253e9a" Apr 16 20:16:05.079061 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.079002 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"122ab79f926574451855730acd38c67f504f24a4d4c32a7351416e3a13253e9a"} err="failed to get container status \"122ab79f926574451855730acd38c67f504f24a4d4c32a7351416e3a13253e9a\": rpc error: code = NotFound desc = 
could not find container \"122ab79f926574451855730acd38c67f504f24a4d4c32a7351416e3a13253e9a\": container with ID starting with 122ab79f926574451855730acd38c67f504f24a4d4c32a7351416e3a13253e9a not found: ID does not exist" Apr 16 20:16:05.079061 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.079016 2572 scope.go:117] "RemoveContainer" containerID="82f398d770176f2a1bd226d97de74c0bde0c0acd9148fca7b5aa9ae0abf88bdc" Apr 16 20:16:05.079202 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:16:05.079184 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82f398d770176f2a1bd226d97de74c0bde0c0acd9148fca7b5aa9ae0abf88bdc\": container with ID starting with 82f398d770176f2a1bd226d97de74c0bde0c0acd9148fca7b5aa9ae0abf88bdc not found: ID does not exist" containerID="82f398d770176f2a1bd226d97de74c0bde0c0acd9148fca7b5aa9ae0abf88bdc" Apr 16 20:16:05.079237 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.079205 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82f398d770176f2a1bd226d97de74c0bde0c0acd9148fca7b5aa9ae0abf88bdc"} err="failed to get container status \"82f398d770176f2a1bd226d97de74c0bde0c0acd9148fca7b5aa9ae0abf88bdc\": rpc error: code = NotFound desc = could not find container \"82f398d770176f2a1bd226d97de74c0bde0c0acd9148fca7b5aa9ae0abf88bdc\": container with ID starting with 82f398d770176f2a1bd226d97de74c0bde0c0acd9148fca7b5aa9ae0abf88bdc not found: ID does not exist" Apr 16 20:16:05.079237 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.079219 2572 scope.go:117] "RemoveContainer" containerID="6fd5a488bce842167d02bd603ce91afa2d1b472acb25ce338d5c7324f9fe8c56" Apr 16 20:16:05.079403 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:16:05.079383 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fd5a488bce842167d02bd603ce91afa2d1b472acb25ce338d5c7324f9fe8c56\": container 
with ID starting with 6fd5a488bce842167d02bd603ce91afa2d1b472acb25ce338d5c7324f9fe8c56 not found: ID does not exist" containerID="6fd5a488bce842167d02bd603ce91afa2d1b472acb25ce338d5c7324f9fe8c56" Apr 16 20:16:05.079439 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.079410 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fd5a488bce842167d02bd603ce91afa2d1b472acb25ce338d5c7324f9fe8c56"} err="failed to get container status \"6fd5a488bce842167d02bd603ce91afa2d1b472acb25ce338d5c7324f9fe8c56\": rpc error: code = NotFound desc = could not find container \"6fd5a488bce842167d02bd603ce91afa2d1b472acb25ce338d5c7324f9fe8c56\": container with ID starting with 6fd5a488bce842167d02bd603ce91afa2d1b472acb25ce338d5c7324f9fe8c56 not found: ID does not exist" Apr 16 20:16:05.079439 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.079424 2572 scope.go:117] "RemoveContainer" containerID="45e4215d6d76ebf40a947a7539a86cebba4896cef826298a3712e5bfab119657" Apr 16 20:16:05.079600 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:16:05.079584 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45e4215d6d76ebf40a947a7539a86cebba4896cef826298a3712e5bfab119657\": container with ID starting with 45e4215d6d76ebf40a947a7539a86cebba4896cef826298a3712e5bfab119657 not found: ID does not exist" containerID="45e4215d6d76ebf40a947a7539a86cebba4896cef826298a3712e5bfab119657" Apr 16 20:16:05.079634 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.079604 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45e4215d6d76ebf40a947a7539a86cebba4896cef826298a3712e5bfab119657"} err="failed to get container status \"45e4215d6d76ebf40a947a7539a86cebba4896cef826298a3712e5bfab119657\": rpc error: code = NotFound desc = could not find container \"45e4215d6d76ebf40a947a7539a86cebba4896cef826298a3712e5bfab119657\": container with ID starting 
with 45e4215d6d76ebf40a947a7539a86cebba4896cef826298a3712e5bfab119657 not found: ID does not exist" Apr 16 20:16:05.084705 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.084652 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 20:16:05.084974 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.084961 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="47a5398c-31b3-4a80-8e04-268cd527c1f4" containerName="kube-rbac-proxy" Apr 16 20:16:05.084974 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.084976 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="47a5398c-31b3-4a80-8e04-268cd527c1f4" containerName="kube-rbac-proxy" Apr 16 20:16:05.085069 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.084992 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="47a5398c-31b3-4a80-8e04-268cd527c1f4" containerName="prom-label-proxy" Apr 16 20:16:05.085069 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.085016 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="47a5398c-31b3-4a80-8e04-268cd527c1f4" containerName="prom-label-proxy" Apr 16 20:16:05.085069 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.085024 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da80589c-6efe-43a7-bd6f-c9394f610209" containerName="registry" Apr 16 20:16:05.085069 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.085030 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="da80589c-6efe-43a7-bd6f-c9394f610209" containerName="registry" Apr 16 20:16:05.085069 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.085038 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="47a5398c-31b3-4a80-8e04-268cd527c1f4" containerName="kube-rbac-proxy-web" Apr 16 20:16:05.085069 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.085043 2572 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="47a5398c-31b3-4a80-8e04-268cd527c1f4" containerName="kube-rbac-proxy-web" Apr 16 20:16:05.085069 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.085050 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="47a5398c-31b3-4a80-8e04-268cd527c1f4" containerName="config-reloader" Apr 16 20:16:05.085069 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.085056 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="47a5398c-31b3-4a80-8e04-268cd527c1f4" containerName="config-reloader" Apr 16 20:16:05.085069 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.085063 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="47a5398c-31b3-4a80-8e04-268cd527c1f4" containerName="init-config-reloader" Apr 16 20:16:05.085069 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.085068 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="47a5398c-31b3-4a80-8e04-268cd527c1f4" containerName="init-config-reloader" Apr 16 20:16:05.085069 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.085073 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="47a5398c-31b3-4a80-8e04-268cd527c1f4" containerName="alertmanager" Apr 16 20:16:05.085406 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.085078 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="47a5398c-31b3-4a80-8e04-268cd527c1f4" containerName="alertmanager" Apr 16 20:16:05.085406 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.085084 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="47a5398c-31b3-4a80-8e04-268cd527c1f4" containerName="kube-rbac-proxy-metric" Apr 16 20:16:05.085406 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.085089 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="47a5398c-31b3-4a80-8e04-268cd527c1f4" containerName="kube-rbac-proxy-metric" Apr 16 20:16:05.085406 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.085136 2572 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="47a5398c-31b3-4a80-8e04-268cd527c1f4" containerName="config-reloader" Apr 16 20:16:05.085406 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.085145 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="47a5398c-31b3-4a80-8e04-268cd527c1f4" containerName="alertmanager" Apr 16 20:16:05.085406 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.085151 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="47a5398c-31b3-4a80-8e04-268cd527c1f4" containerName="kube-rbac-proxy-web" Apr 16 20:16:05.085406 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.085158 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="da80589c-6efe-43a7-bd6f-c9394f610209" containerName="registry" Apr 16 20:16:05.085406 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.085164 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="47a5398c-31b3-4a80-8e04-268cd527c1f4" containerName="kube-rbac-proxy" Apr 16 20:16:05.085406 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.085171 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="47a5398c-31b3-4a80-8e04-268cd527c1f4" containerName="kube-rbac-proxy-metric" Apr 16 20:16:05.085406 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.085180 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="47a5398c-31b3-4a80-8e04-268cd527c1f4" containerName="prom-label-proxy" Apr 16 20:16:05.090059 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.090042 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:16:05.093499 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.093476 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 20:16:05.093597 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.093517 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 20:16:05.093597 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.093528 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 20:16:05.093597 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.093552 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-vqcll\"" Apr 16 20:16:05.093789 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.093774 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 20:16:05.093862 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.093820 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 20:16:05.093947 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.093901 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 20:16:05.094006 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.093976 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 20:16:05.094316 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.094299 2572 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 20:16:05.103555 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.102463 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 20:16:05.103555 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.102942 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 20:16:05.149321 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.149251 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47a5398c-31b3-4a80-8e04-268cd527c1f4" path="/var/lib/kubelet/pods/47a5398c-31b3-4a80-8e04-268cd527c1f4/volumes" Apr 16 20:16:05.184156 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.184132 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3dc3a448-0a69-45c2-9725-08e1794e02d3-config-out\") pod \"alertmanager-main-0\" (UID: \"3dc3a448-0a69-45c2-9725-08e1794e02d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:16:05.184279 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.184165 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/3dc3a448-0a69-45c2-9725-08e1794e02d3-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"3dc3a448-0a69-45c2-9725-08e1794e02d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:16:05.184279 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.184185 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3dc3a448-0a69-45c2-9725-08e1794e02d3-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"3dc3a448-0a69-45c2-9725-08e1794e02d3\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:16:05.184279 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.184238 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3dc3a448-0a69-45c2-9725-08e1794e02d3-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"3dc3a448-0a69-45c2-9725-08e1794e02d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:16:05.184279 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.184275 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dc3a448-0a69-45c2-9725-08e1794e02d3-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"3dc3a448-0a69-45c2-9725-08e1794e02d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:16:05.184421 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.184296 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4qx9\" (UniqueName: \"kubernetes.io/projected/3dc3a448-0a69-45c2-9725-08e1794e02d3-kube-api-access-w4qx9\") pod \"alertmanager-main-0\" (UID: \"3dc3a448-0a69-45c2-9725-08e1794e02d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:16:05.184421 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.184325 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3dc3a448-0a69-45c2-9725-08e1794e02d3-config-volume\") pod \"alertmanager-main-0\" (UID: \"3dc3a448-0a69-45c2-9725-08e1794e02d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:16:05.184421 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.184339 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/3dc3a448-0a69-45c2-9725-08e1794e02d3-tls-assets\") pod \"alertmanager-main-0\" (UID: \"3dc3a448-0a69-45c2-9725-08e1794e02d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:16:05.184421 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.184374 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3dc3a448-0a69-45c2-9725-08e1794e02d3-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"3dc3a448-0a69-45c2-9725-08e1794e02d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:16:05.184421 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.184395 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3dc3a448-0a69-45c2-9725-08e1794e02d3-web-config\") pod \"alertmanager-main-0\" (UID: \"3dc3a448-0a69-45c2-9725-08e1794e02d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:16:05.184421 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.184416 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3dc3a448-0a69-45c2-9725-08e1794e02d3-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"3dc3a448-0a69-45c2-9725-08e1794e02d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:16:05.184610 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.184433 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/3dc3a448-0a69-45c2-9725-08e1794e02d3-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"3dc3a448-0a69-45c2-9725-08e1794e02d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:16:05.184610 
ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.184456 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/3dc3a448-0a69-45c2-9725-08e1794e02d3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"3dc3a448-0a69-45c2-9725-08e1794e02d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:16:05.285026 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.284995 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3dc3a448-0a69-45c2-9725-08e1794e02d3-config-volume\") pod \"alertmanager-main-0\" (UID: \"3dc3a448-0a69-45c2-9725-08e1794e02d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:16:05.285026 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.285027 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3dc3a448-0a69-45c2-9725-08e1794e02d3-tls-assets\") pod \"alertmanager-main-0\" (UID: \"3dc3a448-0a69-45c2-9725-08e1794e02d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:16:05.285223 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.285049 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3dc3a448-0a69-45c2-9725-08e1794e02d3-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"3dc3a448-0a69-45c2-9725-08e1794e02d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:16:05.285223 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.285067 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3dc3a448-0a69-45c2-9725-08e1794e02d3-web-config\") pod \"alertmanager-main-0\" (UID: 
\"3dc3a448-0a69-45c2-9725-08e1794e02d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:16:05.285223 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.285090 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3dc3a448-0a69-45c2-9725-08e1794e02d3-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"3dc3a448-0a69-45c2-9725-08e1794e02d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:16:05.285374 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.285219 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/3dc3a448-0a69-45c2-9725-08e1794e02d3-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"3dc3a448-0a69-45c2-9725-08e1794e02d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:16:05.285374 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.285277 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/3dc3a448-0a69-45c2-9725-08e1794e02d3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"3dc3a448-0a69-45c2-9725-08e1794e02d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:16:05.285374 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.285323 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3dc3a448-0a69-45c2-9725-08e1794e02d3-config-out\") pod \"alertmanager-main-0\" (UID: \"3dc3a448-0a69-45c2-9725-08e1794e02d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:16:05.285374 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.285351 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" 
(UniqueName: \"kubernetes.io/secret/3dc3a448-0a69-45c2-9725-08e1794e02d3-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"3dc3a448-0a69-45c2-9725-08e1794e02d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:16:05.285574 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.285379 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3dc3a448-0a69-45c2-9725-08e1794e02d3-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"3dc3a448-0a69-45c2-9725-08e1794e02d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:16:05.285574 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.285433 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3dc3a448-0a69-45c2-9725-08e1794e02d3-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"3dc3a448-0a69-45c2-9725-08e1794e02d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:16:05.285574 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.285471 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dc3a448-0a69-45c2-9725-08e1794e02d3-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"3dc3a448-0a69-45c2-9725-08e1794e02d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:16:05.285574 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.285500 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w4qx9\" (UniqueName: \"kubernetes.io/projected/3dc3a448-0a69-45c2-9725-08e1794e02d3-kube-api-access-w4qx9\") pod \"alertmanager-main-0\" (UID: \"3dc3a448-0a69-45c2-9725-08e1794e02d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:16:05.285758 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.285639 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/3dc3a448-0a69-45c2-9725-08e1794e02d3-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"3dc3a448-0a69-45c2-9725-08e1794e02d3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:05.286389 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.286359 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3dc3a448-0a69-45c2-9725-08e1794e02d3-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"3dc3a448-0a69-45c2-9725-08e1794e02d3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:05.288452 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.288122 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3dc3a448-0a69-45c2-9725-08e1794e02d3-tls-assets\") pod \"alertmanager-main-0\" (UID: \"3dc3a448-0a69-45c2-9725-08e1794e02d3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:05.288452 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.288249 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3dc3a448-0a69-45c2-9725-08e1794e02d3-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"3dc3a448-0a69-45c2-9725-08e1794e02d3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:05.288452 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.288289 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3dc3a448-0a69-45c2-9725-08e1794e02d3-web-config\") pod \"alertmanager-main-0\" (UID: \"3dc3a448-0a69-45c2-9725-08e1794e02d3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:05.288452 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.288368 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3dc3a448-0a69-45c2-9725-08e1794e02d3-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"3dc3a448-0a69-45c2-9725-08e1794e02d3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:05.288452 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.288409 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3dc3a448-0a69-45c2-9725-08e1794e02d3-config-volume\") pod \"alertmanager-main-0\" (UID: \"3dc3a448-0a69-45c2-9725-08e1794e02d3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:05.288452 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.288463 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3dc3a448-0a69-45c2-9725-08e1794e02d3-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"3dc3a448-0a69-45c2-9725-08e1794e02d3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:05.288711 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.288458 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/3dc3a448-0a69-45c2-9725-08e1794e02d3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"3dc3a448-0a69-45c2-9725-08e1794e02d3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:05.288711 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.288671 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3dc3a448-0a69-45c2-9725-08e1794e02d3-config-out\") pod \"alertmanager-main-0\" (UID: \"3dc3a448-0a69-45c2-9725-08e1794e02d3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:05.288947 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.288929 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dc3a448-0a69-45c2-9725-08e1794e02d3-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"3dc3a448-0a69-45c2-9725-08e1794e02d3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:05.290066 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.290046 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/3dc3a448-0a69-45c2-9725-08e1794e02d3-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"3dc3a448-0a69-45c2-9725-08e1794e02d3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:05.293113 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.293096 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4qx9\" (UniqueName: \"kubernetes.io/projected/3dc3a448-0a69-45c2-9725-08e1794e02d3-kube-api-access-w4qx9\") pod \"alertmanager-main-0\" (UID: \"3dc3a448-0a69-45c2-9725-08e1794e02d3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:05.406905 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.406818 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:16:05.542245 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:05.529623 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 20:16:06.035488 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:06.035454 2572 generic.go:358] "Generic (PLEG): container finished" podID="3dc3a448-0a69-45c2-9725-08e1794e02d3" containerID="6b747475c2d6ad8fd1d682f45fa7c7ac1b765adcf148315876bd29c63c503577" exitCode=0
Apr 16 20:16:06.035899 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:06.035506 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3dc3a448-0a69-45c2-9725-08e1794e02d3","Type":"ContainerDied","Data":"6b747475c2d6ad8fd1d682f45fa7c7ac1b765adcf148315876bd29c63c503577"}
Apr 16 20:16:06.035899 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:06.035531 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3dc3a448-0a69-45c2-9725-08e1794e02d3","Type":"ContainerStarted","Data":"0e4aeddef018c2693c5db300de4dd9b67d97367a1c7743defab9a5f41dbccc21"}
Apr 16 20:16:07.040834 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:07.040797 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3dc3a448-0a69-45c2-9725-08e1794e02d3","Type":"ContainerStarted","Data":"86f537b61ce3ba17fea8d144fb133dfb01484c4ff617dcea832fc96d4ab8ef10"}
Apr 16 20:16:07.040834 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:07.040835 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3dc3a448-0a69-45c2-9725-08e1794e02d3","Type":"ContainerStarted","Data":"3b824159a2655a0830a4d3b7f36d723ce627c59b74cfbbefcd03b142bcd395c6"}
Apr 16 20:16:07.041267 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:07.040848 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3dc3a448-0a69-45c2-9725-08e1794e02d3","Type":"ContainerStarted","Data":"0d05760da25ffd5ac915859b9a628f1b5bb8ab0d8ea2dc6f5fdb4087d0ec6e83"}
Apr 16 20:16:07.041267 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:07.040860 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3dc3a448-0a69-45c2-9725-08e1794e02d3","Type":"ContainerStarted","Data":"3d22447952a2541fc38a60f484313e1480714632b5ce40a78cee8811e2a3da30"}
Apr 16 20:16:07.041267 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:07.040886 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3dc3a448-0a69-45c2-9725-08e1794e02d3","Type":"ContainerStarted","Data":"29e294cd76eb8c38cab848998ecef35d4557695fcf6c9d4b6028c70ed67634d6"}
Apr 16 20:16:07.041267 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:07.040903 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3dc3a448-0a69-45c2-9725-08e1794e02d3","Type":"ContainerStarted","Data":"d16a21e91ba5ed96d155a4f13a7eba3fe7734cc343d6c38877bca45ddb2733b7"}
Apr 16 20:16:07.065495 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:07.065439 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.06542198 podStartE2EDuration="2.06542198s" podCreationTimestamp="2026-04-16 20:16:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:16:07.064956358 +0000 UTC m=+258.501944866" watchObservedRunningTime="2026-04-16 20:16:07.06542198 +0000 UTC m=+258.502410489"
Apr 16 20:16:07.704473 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:07.704435 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 20:16:07.704943 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:07.704861 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" containerName="prometheus" containerID="cri-o://b04dbdc36a4d9c8bfc1dd84ff14054966613e78f4361ff750c64ab5f702d212a" gracePeriod=600
Apr 16 20:16:07.704943 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:07.704908 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" containerName="kube-rbac-proxy" containerID="cri-o://d4ebe45ae76dd749b487a49f3da3975492a995dacf80fb95710c866aaeb76ea4" gracePeriod=600
Apr 16 20:16:07.705175 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:07.704936 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" containerName="kube-rbac-proxy-thanos" containerID="cri-o://c02375bb669ac5c088aeedd4a36bfed892009be2b049bfa67f6b439c2385bb67" gracePeriod=600
Apr 16 20:16:07.705175 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:07.704957 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" containerName="thanos-sidecar" containerID="cri-o://0d5f5b226e087df52a35f3f5486025bf68c782d469d540dc86a9a91ff0fa6f44" gracePeriod=600
Apr 16 20:16:07.705175 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:07.705024 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" containerName="config-reloader" containerID="cri-o://81845c62ce865d3fc388a94ba9a651cffb0c8e4b451dc1f682628c88d0e5f5ca" gracePeriod=600
Apr 16 20:16:07.705175 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:07.705023 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" containerName="kube-rbac-proxy-web" containerID="cri-o://cb7215972689cc6c71521f2d70f82e751e1ee7e79a5f129eb2e1caf96fea7175" gracePeriod=600
Apr 16 20:16:07.946501 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:07.946477 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.010310 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.010267 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-secret-kube-rbac-proxy\") pod \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") "
Apr 16 20:16:08.010310 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.010316 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-configmap-serving-certs-ca-bundle\") pod \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") "
Apr 16 20:16:08.010594 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.010369 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") "
Apr 16 20:16:08.013451 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.010798 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") "
Apr 16 20:16:08.013451 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.010863 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-secret-metrics-client-certs\") pod \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") "
Apr 16 20:16:08.013451 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.010905 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-config\") pod \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") "
Apr 16 20:16:08.013451 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.010905 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" (UID: "7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:16:08.013451 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.010936 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-web-config\") pod \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") "
Apr 16 20:16:08.013451 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.010959 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-secret-grpc-tls\") pod \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") "
Apr 16 20:16:08.013451 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.010989 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-config-out\") pod \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") "
Apr 16 20:16:08.013451 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.011022 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-thanos-prometheus-http-client-file\") pod \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") "
Apr 16 20:16:08.013451 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.011072 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-secret-prometheus-k8s-tls\") pod \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") "
Apr 16 20:16:08.013451 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.011097 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-prometheus-k8s-rulefiles-0\") pod \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") "
Apr 16 20:16:08.013451 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.011122 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-tls-assets\") pod \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") "
Apr 16 20:16:08.013451 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.011147 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-prometheus-k8s-db\") pod \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") "
Apr 16 20:16:08.013451 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.011178 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzmft\" (UniqueName: \"kubernetes.io/projected/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-kube-api-access-xzmft\") pod \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") "
Apr 16 20:16:08.013451 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.011206 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-prometheus-trusted-ca-bundle\") pod \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") "
Apr 16 20:16:08.013451 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.011253 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-configmap-metrics-client-ca\") pod \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") "
Apr 16 20:16:08.013451 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.011276 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-configmap-kubelet-serving-ca-bundle\") pod \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\" (UID: \"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70\") "
Apr 16 20:16:08.013451 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.011539 2572 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:16:08.014396 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.011857 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" (UID: "7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:16:08.014396 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.012954 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" (UID: "7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:16:08.014396 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.013863 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" (UID: "7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:16:08.014396 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.013974 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" (UID: "7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:16:08.015041 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.014544 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" (UID: "7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:16:08.015041 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.014584 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" (UID: "7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:16:08.015241 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.015202 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" (UID: "7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:16:08.015363 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.015334 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" (UID: "7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:16:08.017278 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.017252 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" (UID: "7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:16:08.017750 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.017709 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" (UID: "7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:16:08.017750 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.017734 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-config" (OuterVolumeSpecName: "config") pod "7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" (UID: "7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:16:08.017916 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.017807 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-config-out" (OuterVolumeSpecName: "config-out") pod "7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" (UID: "7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:16:08.017916 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.017836 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" (UID: "7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:16:08.018035 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.017918 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" (UID: "7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:16:08.018440 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.018414 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" (UID: "7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:16:08.019397 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.019366 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-kube-api-access-xzmft" (OuterVolumeSpecName: "kube-api-access-xzmft") pod "7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" (UID: "7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70"). InnerVolumeSpecName "kube-api-access-xzmft". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:16:08.029747 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.029719 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-web-config" (OuterVolumeSpecName: "web-config") pod "7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" (UID: "7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:16:08.047553 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.047524 2572 generic.go:358] "Generic (PLEG): container finished" podID="7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" containerID="c02375bb669ac5c088aeedd4a36bfed892009be2b049bfa67f6b439c2385bb67" exitCode=0
Apr 16 20:16:08.047553 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.047549 2572 generic.go:358] "Generic (PLEG): container finished" podID="7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" containerID="d4ebe45ae76dd749b487a49f3da3975492a995dacf80fb95710c866aaeb76ea4" exitCode=0
Apr 16 20:16:08.047553 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.047555 2572 generic.go:358] "Generic (PLEG): container finished" podID="7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" containerID="cb7215972689cc6c71521f2d70f82e751e1ee7e79a5f129eb2e1caf96fea7175" exitCode=0
Apr 16 20:16:08.048008 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.047562 2572 generic.go:358] "Generic (PLEG): container finished" podID="7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" containerID="0d5f5b226e087df52a35f3f5486025bf68c782d469d540dc86a9a91ff0fa6f44" exitCode=0
Apr 16 20:16:08.048008 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.047567 2572 generic.go:358] "Generic (PLEG): container finished" podID="7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" containerID="81845c62ce865d3fc388a94ba9a651cffb0c8e4b451dc1f682628c88d0e5f5ca" exitCode=0
Apr 16 20:16:08.048008 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.047572 2572 generic.go:358] "Generic (PLEG): container finished" podID="7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" containerID="b04dbdc36a4d9c8bfc1dd84ff14054966613e78f4361ff750c64ab5f702d212a" exitCode=0
Apr 16 20:16:08.048008 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.047599 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70","Type":"ContainerDied","Data":"c02375bb669ac5c088aeedd4a36bfed892009be2b049bfa67f6b439c2385bb67"}
Apr 16 20:16:08.048008 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.047621 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.048008 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.047635 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70","Type":"ContainerDied","Data":"d4ebe45ae76dd749b487a49f3da3975492a995dacf80fb95710c866aaeb76ea4"}
Apr 16 20:16:08.048008 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.047648 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70","Type":"ContainerDied","Data":"cb7215972689cc6c71521f2d70f82e751e1ee7e79a5f129eb2e1caf96fea7175"}
Apr 16 20:16:08.048008 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.047657 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70","Type":"ContainerDied","Data":"0d5f5b226e087df52a35f3f5486025bf68c782d469d540dc86a9a91ff0fa6f44"}
Apr 16 20:16:08.048008 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.047667 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70","Type":"ContainerDied","Data":"81845c62ce865d3fc388a94ba9a651cffb0c8e4b451dc1f682628c88d0e5f5ca"}
Apr 16 20:16:08.048008 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.047676 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70","Type":"ContainerDied","Data":"b04dbdc36a4d9c8bfc1dd84ff14054966613e78f4361ff750c64ab5f702d212a"}
Apr 16 20:16:08.048008 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.047685 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70","Type":"ContainerDied","Data":"0a944920e243fc1f95b02f5ccf7e84f161ea3c9b46b3d4311e05c547f4f6709a"}
Apr 16 20:16:08.048008 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.047707 2572 scope.go:117] "RemoveContainer" containerID="c02375bb669ac5c088aeedd4a36bfed892009be2b049bfa67f6b439c2385bb67"
Apr 16 20:16:08.055103 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.055081 2572 scope.go:117] "RemoveContainer" containerID="d4ebe45ae76dd749b487a49f3da3975492a995dacf80fb95710c866aaeb76ea4"
Apr 16 20:16:08.061826 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.061809 2572 scope.go:117] "RemoveContainer" containerID="cb7215972689cc6c71521f2d70f82e751e1ee7e79a5f129eb2e1caf96fea7175"
Apr 16 20:16:08.069919 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.069902 2572 scope.go:117] "RemoveContainer" containerID="0d5f5b226e087df52a35f3f5486025bf68c782d469d540dc86a9a91ff0fa6f44"
Apr 16 20:16:08.071948 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.071862 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 20:16:08.073934 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.073910 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 20:16:08.077963 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.077945 2572 scope.go:117] "RemoveContainer" containerID="81845c62ce865d3fc388a94ba9a651cffb0c8e4b451dc1f682628c88d0e5f5ca"
Apr 16 20:16:08.084343 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.084323 2572 scope.go:117] "RemoveContainer" containerID="b04dbdc36a4d9c8bfc1dd84ff14054966613e78f4361ff750c64ab5f702d212a"
Apr 16 20:16:08.092512 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.092493 2572 scope.go:117] "RemoveContainer" containerID="08aeb652e17b8ac770418cdeb0dcdbb0bd030687398be025423024f871986e7c"
Apr 16 20:16:08.098466 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.098444 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 20:16:08.098724 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.098712 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" containerName="kube-rbac-proxy-thanos"
Apr 16 20:16:08.098766 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.098726 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" containerName="kube-rbac-proxy-thanos"
Apr 16 20:16:08.098766 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.098739 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" containerName="config-reloader"
Apr 16 20:16:08.098766 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.098745 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" containerName="config-reloader"
Apr 16 20:16:08.098766 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.098756 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" containerName="init-config-reloader"
Apr 16 20:16:08.098766 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.098762 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" containerName="init-config-reloader"
Apr 16 20:16:08.098926 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.098770 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" containerName="prometheus"
Apr 16 20:16:08.098926 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.098775 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" containerName="prometheus"
Apr 16 20:16:08.098926 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.098781 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" containerName="kube-rbac-proxy-web"
Apr 16 20:16:08.098926 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.098786 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" containerName="kube-rbac-proxy-web"
Apr 16 20:16:08.098926 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.098796 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" containerName="thanos-sidecar"
Apr 16 20:16:08.098926 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.098800 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" containerName="thanos-sidecar"
Apr 16 20:16:08.098926 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.098806 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" containerName="kube-rbac-proxy"
Apr 16 20:16:08.098926 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.098811 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" containerName="kube-rbac-proxy"
Apr 16 20:16:08.098926 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.098862 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" containerName="kube-rbac-proxy-web"
Apr 16 20:16:08.098926 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.098894 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" containerName="kube-rbac-proxy"
Apr 16 20:16:08.098926 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.098903 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" containerName="config-reloader"
Apr 16 20:16:08.098926 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.098908 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" containerName="thanos-sidecar"
Apr 16 20:16:08.098926 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.098915 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" containerName="kube-rbac-proxy-thanos"
Apr 16 20:16:08.098926 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.098920 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" containerName="prometheus"
Apr 16 20:16:08.099379 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.099310 2572 scope.go:117] "RemoveContainer" containerID="c02375bb669ac5c088aeedd4a36bfed892009be2b049bfa67f6b439c2385bb67"
Apr 16 20:16:08.099593 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:16:08.099574 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c02375bb669ac5c088aeedd4a36bfed892009be2b049bfa67f6b439c2385bb67\": container with ID starting with c02375bb669ac5c088aeedd4a36bfed892009be2b049bfa67f6b439c2385bb67 not found: ID does not exist" containerID="c02375bb669ac5c088aeedd4a36bfed892009be2b049bfa67f6b439c2385bb67"
Apr 16 20:16:08.099651 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.099606 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c02375bb669ac5c088aeedd4a36bfed892009be2b049bfa67f6b439c2385bb67"} err="failed to get container status \"c02375bb669ac5c088aeedd4a36bfed892009be2b049bfa67f6b439c2385bb67\": rpc error: code = NotFound desc = could not find container \"c02375bb669ac5c088aeedd4a36bfed892009be2b049bfa67f6b439c2385bb67\": container with ID starting with c02375bb669ac5c088aeedd4a36bfed892009be2b049bfa67f6b439c2385bb67 not found: ID does not exist"
Apr 16 20:16:08.099651 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.099634 2572 scope.go:117] "RemoveContainer" containerID="d4ebe45ae76dd749b487a49f3da3975492a995dacf80fb95710c866aaeb76ea4"
Apr 16 20:16:08.099920 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:16:08.099892 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4ebe45ae76dd749b487a49f3da3975492a995dacf80fb95710c866aaeb76ea4\": container with ID starting with d4ebe45ae76dd749b487a49f3da3975492a995dacf80fb95710c866aaeb76ea4 not found: ID does not exist" containerID="d4ebe45ae76dd749b487a49f3da3975492a995dacf80fb95710c866aaeb76ea4"
Apr 16 20:16:08.099998 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.099927 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4ebe45ae76dd749b487a49f3da3975492a995dacf80fb95710c866aaeb76ea4"} err="failed to get container status \"d4ebe45ae76dd749b487a49f3da3975492a995dacf80fb95710c866aaeb76ea4\": rpc error: code = NotFound desc = could not find container \"d4ebe45ae76dd749b487a49f3da3975492a995dacf80fb95710c866aaeb76ea4\": container with ID starting with d4ebe45ae76dd749b487a49f3da3975492a995dacf80fb95710c866aaeb76ea4 not found: ID does not exist"
Apr 16 20:16:08.099998 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.099949 2572 scope.go:117] "RemoveContainer" containerID="cb7215972689cc6c71521f2d70f82e751e1ee7e79a5f129eb2e1caf96fea7175"
Apr 16 20:16:08.100187 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:16:08.100169 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container
\"cb7215972689cc6c71521f2d70f82e751e1ee7e79a5f129eb2e1caf96fea7175\": container with ID starting with cb7215972689cc6c71521f2d70f82e751e1ee7e79a5f129eb2e1caf96fea7175 not found: ID does not exist" containerID="cb7215972689cc6c71521f2d70f82e751e1ee7e79a5f129eb2e1caf96fea7175" Apr 16 20:16:08.100225 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.100193 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb7215972689cc6c71521f2d70f82e751e1ee7e79a5f129eb2e1caf96fea7175"} err="failed to get container status \"cb7215972689cc6c71521f2d70f82e751e1ee7e79a5f129eb2e1caf96fea7175\": rpc error: code = NotFound desc = could not find container \"cb7215972689cc6c71521f2d70f82e751e1ee7e79a5f129eb2e1caf96fea7175\": container with ID starting with cb7215972689cc6c71521f2d70f82e751e1ee7e79a5f129eb2e1caf96fea7175 not found: ID does not exist" Apr 16 20:16:08.100225 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.100207 2572 scope.go:117] "RemoveContainer" containerID="0d5f5b226e087df52a35f3f5486025bf68c782d469d540dc86a9a91ff0fa6f44" Apr 16 20:16:08.100419 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:16:08.100401 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d5f5b226e087df52a35f3f5486025bf68c782d469d540dc86a9a91ff0fa6f44\": container with ID starting with 0d5f5b226e087df52a35f3f5486025bf68c782d469d540dc86a9a91ff0fa6f44 not found: ID does not exist" containerID="0d5f5b226e087df52a35f3f5486025bf68c782d469d540dc86a9a91ff0fa6f44" Apr 16 20:16:08.100479 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.100428 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d5f5b226e087df52a35f3f5486025bf68c782d469d540dc86a9a91ff0fa6f44"} err="failed to get container status \"0d5f5b226e087df52a35f3f5486025bf68c782d469d540dc86a9a91ff0fa6f44\": rpc error: code = NotFound desc = could not find container 
\"0d5f5b226e087df52a35f3f5486025bf68c782d469d540dc86a9a91ff0fa6f44\": container with ID starting with 0d5f5b226e087df52a35f3f5486025bf68c782d469d540dc86a9a91ff0fa6f44 not found: ID does not exist" Apr 16 20:16:08.100479 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.100447 2572 scope.go:117] "RemoveContainer" containerID="81845c62ce865d3fc388a94ba9a651cffb0c8e4b451dc1f682628c88d0e5f5ca" Apr 16 20:16:08.100674 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:16:08.100657 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81845c62ce865d3fc388a94ba9a651cffb0c8e4b451dc1f682628c88d0e5f5ca\": container with ID starting with 81845c62ce865d3fc388a94ba9a651cffb0c8e4b451dc1f682628c88d0e5f5ca not found: ID does not exist" containerID="81845c62ce865d3fc388a94ba9a651cffb0c8e4b451dc1f682628c88d0e5f5ca" Apr 16 20:16:08.100708 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.100679 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81845c62ce865d3fc388a94ba9a651cffb0c8e4b451dc1f682628c88d0e5f5ca"} err="failed to get container status \"81845c62ce865d3fc388a94ba9a651cffb0c8e4b451dc1f682628c88d0e5f5ca\": rpc error: code = NotFound desc = could not find container \"81845c62ce865d3fc388a94ba9a651cffb0c8e4b451dc1f682628c88d0e5f5ca\": container with ID starting with 81845c62ce865d3fc388a94ba9a651cffb0c8e4b451dc1f682628c88d0e5f5ca not found: ID does not exist" Apr 16 20:16:08.100708 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.100693 2572 scope.go:117] "RemoveContainer" containerID="b04dbdc36a4d9c8bfc1dd84ff14054966613e78f4361ff750c64ab5f702d212a" Apr 16 20:16:08.100948 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:16:08.100930 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b04dbdc36a4d9c8bfc1dd84ff14054966613e78f4361ff750c64ab5f702d212a\": container with ID starting with 
b04dbdc36a4d9c8bfc1dd84ff14054966613e78f4361ff750c64ab5f702d212a not found: ID does not exist" containerID="b04dbdc36a4d9c8bfc1dd84ff14054966613e78f4361ff750c64ab5f702d212a" Apr 16 20:16:08.101012 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.100951 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b04dbdc36a4d9c8bfc1dd84ff14054966613e78f4361ff750c64ab5f702d212a"} err="failed to get container status \"b04dbdc36a4d9c8bfc1dd84ff14054966613e78f4361ff750c64ab5f702d212a\": rpc error: code = NotFound desc = could not find container \"b04dbdc36a4d9c8bfc1dd84ff14054966613e78f4361ff750c64ab5f702d212a\": container with ID starting with b04dbdc36a4d9c8bfc1dd84ff14054966613e78f4361ff750c64ab5f702d212a not found: ID does not exist" Apr 16 20:16:08.101012 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.100964 2572 scope.go:117] "RemoveContainer" containerID="08aeb652e17b8ac770418cdeb0dcdbb0bd030687398be025423024f871986e7c" Apr 16 20:16:08.101232 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:16:08.101206 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08aeb652e17b8ac770418cdeb0dcdbb0bd030687398be025423024f871986e7c\": container with ID starting with 08aeb652e17b8ac770418cdeb0dcdbb0bd030687398be025423024f871986e7c not found: ID does not exist" containerID="08aeb652e17b8ac770418cdeb0dcdbb0bd030687398be025423024f871986e7c" Apr 16 20:16:08.101325 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.101237 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08aeb652e17b8ac770418cdeb0dcdbb0bd030687398be025423024f871986e7c"} err="failed to get container status \"08aeb652e17b8ac770418cdeb0dcdbb0bd030687398be025423024f871986e7c\": rpc error: code = NotFound desc = could not find container \"08aeb652e17b8ac770418cdeb0dcdbb0bd030687398be025423024f871986e7c\": container with ID starting with 
08aeb652e17b8ac770418cdeb0dcdbb0bd030687398be025423024f871986e7c not found: ID does not exist" Apr 16 20:16:08.101325 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.101256 2572 scope.go:117] "RemoveContainer" containerID="c02375bb669ac5c088aeedd4a36bfed892009be2b049bfa67f6b439c2385bb67" Apr 16 20:16:08.101616 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.101580 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c02375bb669ac5c088aeedd4a36bfed892009be2b049bfa67f6b439c2385bb67"} err="failed to get container status \"c02375bb669ac5c088aeedd4a36bfed892009be2b049bfa67f6b439c2385bb67\": rpc error: code = NotFound desc = could not find container \"c02375bb669ac5c088aeedd4a36bfed892009be2b049bfa67f6b439c2385bb67\": container with ID starting with c02375bb669ac5c088aeedd4a36bfed892009be2b049bfa67f6b439c2385bb67 not found: ID does not exist" Apr 16 20:16:08.101616 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.101606 2572 scope.go:117] "RemoveContainer" containerID="d4ebe45ae76dd749b487a49f3da3975492a995dacf80fb95710c866aaeb76ea4" Apr 16 20:16:08.101884 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.101846 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4ebe45ae76dd749b487a49f3da3975492a995dacf80fb95710c866aaeb76ea4"} err="failed to get container status \"d4ebe45ae76dd749b487a49f3da3975492a995dacf80fb95710c866aaeb76ea4\": rpc error: code = NotFound desc = could not find container \"d4ebe45ae76dd749b487a49f3da3975492a995dacf80fb95710c866aaeb76ea4\": container with ID starting with d4ebe45ae76dd749b487a49f3da3975492a995dacf80fb95710c866aaeb76ea4 not found: ID does not exist" Apr 16 20:16:08.101953 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.101883 2572 scope.go:117] "RemoveContainer" containerID="cb7215972689cc6c71521f2d70f82e751e1ee7e79a5f129eb2e1caf96fea7175" Apr 16 20:16:08.102128 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.102109 2572 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb7215972689cc6c71521f2d70f82e751e1ee7e79a5f129eb2e1caf96fea7175"} err="failed to get container status \"cb7215972689cc6c71521f2d70f82e751e1ee7e79a5f129eb2e1caf96fea7175\": rpc error: code = NotFound desc = could not find container \"cb7215972689cc6c71521f2d70f82e751e1ee7e79a5f129eb2e1caf96fea7175\": container with ID starting with cb7215972689cc6c71521f2d70f82e751e1ee7e79a5f129eb2e1caf96fea7175 not found: ID does not exist" Apr 16 20:16:08.102128 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.102127 2572 scope.go:117] "RemoveContainer" containerID="0d5f5b226e087df52a35f3f5486025bf68c782d469d540dc86a9a91ff0fa6f44" Apr 16 20:16:08.102384 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.102364 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d5f5b226e087df52a35f3f5486025bf68c782d469d540dc86a9a91ff0fa6f44"} err="failed to get container status \"0d5f5b226e087df52a35f3f5486025bf68c782d469d540dc86a9a91ff0fa6f44\": rpc error: code = NotFound desc = could not find container \"0d5f5b226e087df52a35f3f5486025bf68c782d469d540dc86a9a91ff0fa6f44\": container with ID starting with 0d5f5b226e087df52a35f3f5486025bf68c782d469d540dc86a9a91ff0fa6f44 not found: ID does not exist" Apr 16 20:16:08.102463 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.102384 2572 scope.go:117] "RemoveContainer" containerID="81845c62ce865d3fc388a94ba9a651cffb0c8e4b451dc1f682628c88d0e5f5ca" Apr 16 20:16:08.102638 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.102613 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81845c62ce865d3fc388a94ba9a651cffb0c8e4b451dc1f682628c88d0e5f5ca"} err="failed to get container status \"81845c62ce865d3fc388a94ba9a651cffb0c8e4b451dc1f682628c88d0e5f5ca\": rpc error: code = NotFound desc = could not find container 
\"81845c62ce865d3fc388a94ba9a651cffb0c8e4b451dc1f682628c88d0e5f5ca\": container with ID starting with 81845c62ce865d3fc388a94ba9a651cffb0c8e4b451dc1f682628c88d0e5f5ca not found: ID does not exist" Apr 16 20:16:08.102705 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.102639 2572 scope.go:117] "RemoveContainer" containerID="b04dbdc36a4d9c8bfc1dd84ff14054966613e78f4361ff750c64ab5f702d212a" Apr 16 20:16:08.102936 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.102909 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b04dbdc36a4d9c8bfc1dd84ff14054966613e78f4361ff750c64ab5f702d212a"} err="failed to get container status \"b04dbdc36a4d9c8bfc1dd84ff14054966613e78f4361ff750c64ab5f702d212a\": rpc error: code = NotFound desc = could not find container \"b04dbdc36a4d9c8bfc1dd84ff14054966613e78f4361ff750c64ab5f702d212a\": container with ID starting with b04dbdc36a4d9c8bfc1dd84ff14054966613e78f4361ff750c64ab5f702d212a not found: ID does not exist" Apr 16 20:16:08.103002 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.102938 2572 scope.go:117] "RemoveContainer" containerID="08aeb652e17b8ac770418cdeb0dcdbb0bd030687398be025423024f871986e7c" Apr 16 20:16:08.103157 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.103141 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08aeb652e17b8ac770418cdeb0dcdbb0bd030687398be025423024f871986e7c"} err="failed to get container status \"08aeb652e17b8ac770418cdeb0dcdbb0bd030687398be025423024f871986e7c\": rpc error: code = NotFound desc = could not find container \"08aeb652e17b8ac770418cdeb0dcdbb0bd030687398be025423024f871986e7c\": container with ID starting with 08aeb652e17b8ac770418cdeb0dcdbb0bd030687398be025423024f871986e7c not found: ID does not exist" Apr 16 20:16:08.103204 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.103158 2572 scope.go:117] "RemoveContainer" 
containerID="c02375bb669ac5c088aeedd4a36bfed892009be2b049bfa67f6b439c2385bb67" Apr 16 20:16:08.103353 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.103338 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c02375bb669ac5c088aeedd4a36bfed892009be2b049bfa67f6b439c2385bb67"} err="failed to get container status \"c02375bb669ac5c088aeedd4a36bfed892009be2b049bfa67f6b439c2385bb67\": rpc error: code = NotFound desc = could not find container \"c02375bb669ac5c088aeedd4a36bfed892009be2b049bfa67f6b439c2385bb67\": container with ID starting with c02375bb669ac5c088aeedd4a36bfed892009be2b049bfa67f6b439c2385bb67 not found: ID does not exist" Apr 16 20:16:08.103396 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.103354 2572 scope.go:117] "RemoveContainer" containerID="d4ebe45ae76dd749b487a49f3da3975492a995dacf80fb95710c866aaeb76ea4" Apr 16 20:16:08.103576 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.103556 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4ebe45ae76dd749b487a49f3da3975492a995dacf80fb95710c866aaeb76ea4"} err="failed to get container status \"d4ebe45ae76dd749b487a49f3da3975492a995dacf80fb95710c866aaeb76ea4\": rpc error: code = NotFound desc = could not find container \"d4ebe45ae76dd749b487a49f3da3975492a995dacf80fb95710c866aaeb76ea4\": container with ID starting with d4ebe45ae76dd749b487a49f3da3975492a995dacf80fb95710c866aaeb76ea4 not found: ID does not exist" Apr 16 20:16:08.103646 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.103577 2572 scope.go:117] "RemoveContainer" containerID="cb7215972689cc6c71521f2d70f82e751e1ee7e79a5f129eb2e1caf96fea7175" Apr 16 20:16:08.103823 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.103802 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb7215972689cc6c71521f2d70f82e751e1ee7e79a5f129eb2e1caf96fea7175"} err="failed to get container status 
\"cb7215972689cc6c71521f2d70f82e751e1ee7e79a5f129eb2e1caf96fea7175\": rpc error: code = NotFound desc = could not find container \"cb7215972689cc6c71521f2d70f82e751e1ee7e79a5f129eb2e1caf96fea7175\": container with ID starting with cb7215972689cc6c71521f2d70f82e751e1ee7e79a5f129eb2e1caf96fea7175 not found: ID does not exist" Apr 16 20:16:08.103898 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.103824 2572 scope.go:117] "RemoveContainer" containerID="0d5f5b226e087df52a35f3f5486025bf68c782d469d540dc86a9a91ff0fa6f44" Apr 16 20:16:08.104068 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.104051 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d5f5b226e087df52a35f3f5486025bf68c782d469d540dc86a9a91ff0fa6f44"} err="failed to get container status \"0d5f5b226e087df52a35f3f5486025bf68c782d469d540dc86a9a91ff0fa6f44\": rpc error: code = NotFound desc = could not find container \"0d5f5b226e087df52a35f3f5486025bf68c782d469d540dc86a9a91ff0fa6f44\": container with ID starting with 0d5f5b226e087df52a35f3f5486025bf68c782d469d540dc86a9a91ff0fa6f44 not found: ID does not exist" Apr 16 20:16:08.104124 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.104068 2572 scope.go:117] "RemoveContainer" containerID="81845c62ce865d3fc388a94ba9a651cffb0c8e4b451dc1f682628c88d0e5f5ca" Apr 16 20:16:08.104238 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.104220 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:08.104282 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.104245 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81845c62ce865d3fc388a94ba9a651cffb0c8e4b451dc1f682628c88d0e5f5ca"} err="failed to get container status \"81845c62ce865d3fc388a94ba9a651cffb0c8e4b451dc1f682628c88d0e5f5ca\": rpc error: code = NotFound desc = could not find container \"81845c62ce865d3fc388a94ba9a651cffb0c8e4b451dc1f682628c88d0e5f5ca\": container with ID starting with 81845c62ce865d3fc388a94ba9a651cffb0c8e4b451dc1f682628c88d0e5f5ca not found: ID does not exist" Apr 16 20:16:08.104282 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.104256 2572 scope.go:117] "RemoveContainer" containerID="b04dbdc36a4d9c8bfc1dd84ff14054966613e78f4361ff750c64ab5f702d212a" Apr 16 20:16:08.104571 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.104477 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b04dbdc36a4d9c8bfc1dd84ff14054966613e78f4361ff750c64ab5f702d212a"} err="failed to get container status \"b04dbdc36a4d9c8bfc1dd84ff14054966613e78f4361ff750c64ab5f702d212a\": rpc error: code = NotFound desc = could not find container \"b04dbdc36a4d9c8bfc1dd84ff14054966613e78f4361ff750c64ab5f702d212a\": container with ID starting with b04dbdc36a4d9c8bfc1dd84ff14054966613e78f4361ff750c64ab5f702d212a not found: ID does not exist" Apr 16 20:16:08.104571 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.104505 2572 scope.go:117] "RemoveContainer" containerID="08aeb652e17b8ac770418cdeb0dcdbb0bd030687398be025423024f871986e7c" Apr 16 20:16:08.104825 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.104799 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08aeb652e17b8ac770418cdeb0dcdbb0bd030687398be025423024f871986e7c"} err="failed to get container status 
\"08aeb652e17b8ac770418cdeb0dcdbb0bd030687398be025423024f871986e7c\": rpc error: code = NotFound desc = could not find container \"08aeb652e17b8ac770418cdeb0dcdbb0bd030687398be025423024f871986e7c\": container with ID starting with 08aeb652e17b8ac770418cdeb0dcdbb0bd030687398be025423024f871986e7c not found: ID does not exist" Apr 16 20:16:08.104825 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.104825 2572 scope.go:117] "RemoveContainer" containerID="c02375bb669ac5c088aeedd4a36bfed892009be2b049bfa67f6b439c2385bb67" Apr 16 20:16:08.105094 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.105078 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c02375bb669ac5c088aeedd4a36bfed892009be2b049bfa67f6b439c2385bb67"} err="failed to get container status \"c02375bb669ac5c088aeedd4a36bfed892009be2b049bfa67f6b439c2385bb67\": rpc error: code = NotFound desc = could not find container \"c02375bb669ac5c088aeedd4a36bfed892009be2b049bfa67f6b439c2385bb67\": container with ID starting with c02375bb669ac5c088aeedd4a36bfed892009be2b049bfa67f6b439c2385bb67 not found: ID does not exist" Apr 16 20:16:08.105140 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.105096 2572 scope.go:117] "RemoveContainer" containerID="d4ebe45ae76dd749b487a49f3da3975492a995dacf80fb95710c866aaeb76ea4" Apr 16 20:16:08.105296 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.105277 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4ebe45ae76dd749b487a49f3da3975492a995dacf80fb95710c866aaeb76ea4"} err="failed to get container status \"d4ebe45ae76dd749b487a49f3da3975492a995dacf80fb95710c866aaeb76ea4\": rpc error: code = NotFound desc = could not find container \"d4ebe45ae76dd749b487a49f3da3975492a995dacf80fb95710c866aaeb76ea4\": container with ID starting with d4ebe45ae76dd749b487a49f3da3975492a995dacf80fb95710c866aaeb76ea4 not found: ID does not exist" Apr 16 20:16:08.105344 ip-10-0-138-62 
kubenswrapper[2572]: I0416 20:16:08.105297 2572 scope.go:117] "RemoveContainer" containerID="cb7215972689cc6c71521f2d70f82e751e1ee7e79a5f129eb2e1caf96fea7175" Apr 16 20:16:08.105547 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.105524 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb7215972689cc6c71521f2d70f82e751e1ee7e79a5f129eb2e1caf96fea7175"} err="failed to get container status \"cb7215972689cc6c71521f2d70f82e751e1ee7e79a5f129eb2e1caf96fea7175\": rpc error: code = NotFound desc = could not find container \"cb7215972689cc6c71521f2d70f82e751e1ee7e79a5f129eb2e1caf96fea7175\": container with ID starting with cb7215972689cc6c71521f2d70f82e751e1ee7e79a5f129eb2e1caf96fea7175 not found: ID does not exist" Apr 16 20:16:08.105623 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.105549 2572 scope.go:117] "RemoveContainer" containerID="0d5f5b226e087df52a35f3f5486025bf68c782d469d540dc86a9a91ff0fa6f44" Apr 16 20:16:08.105786 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.105766 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d5f5b226e087df52a35f3f5486025bf68c782d469d540dc86a9a91ff0fa6f44"} err="failed to get container status \"0d5f5b226e087df52a35f3f5486025bf68c782d469d540dc86a9a91ff0fa6f44\": rpc error: code = NotFound desc = could not find container \"0d5f5b226e087df52a35f3f5486025bf68c782d469d540dc86a9a91ff0fa6f44\": container with ID starting with 0d5f5b226e087df52a35f3f5486025bf68c782d469d540dc86a9a91ff0fa6f44 not found: ID does not exist" Apr 16 20:16:08.105786 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.105785 2572 scope.go:117] "RemoveContainer" containerID="81845c62ce865d3fc388a94ba9a651cffb0c8e4b451dc1f682628c88d0e5f5ca" Apr 16 20:16:08.106071 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.106046 2572 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"81845c62ce865d3fc388a94ba9a651cffb0c8e4b451dc1f682628c88d0e5f5ca"} err="failed to get container status \"81845c62ce865d3fc388a94ba9a651cffb0c8e4b451dc1f682628c88d0e5f5ca\": rpc error: code = NotFound desc = could not find container \"81845c62ce865d3fc388a94ba9a651cffb0c8e4b451dc1f682628c88d0e5f5ca\": container with ID starting with 81845c62ce865d3fc388a94ba9a651cffb0c8e4b451dc1f682628c88d0e5f5ca not found: ID does not exist" Apr 16 20:16:08.106123 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.106075 2572 scope.go:117] "RemoveContainer" containerID="b04dbdc36a4d9c8bfc1dd84ff14054966613e78f4361ff750c64ab5f702d212a" Apr 16 20:16:08.106467 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.106442 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b04dbdc36a4d9c8bfc1dd84ff14054966613e78f4361ff750c64ab5f702d212a"} err="failed to get container status \"b04dbdc36a4d9c8bfc1dd84ff14054966613e78f4361ff750c64ab5f702d212a\": rpc error: code = NotFound desc = could not find container \"b04dbdc36a4d9c8bfc1dd84ff14054966613e78f4361ff750c64ab5f702d212a\": container with ID starting with b04dbdc36a4d9c8bfc1dd84ff14054966613e78f4361ff750c64ab5f702d212a not found: ID does not exist" Apr 16 20:16:08.106553 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.106468 2572 scope.go:117] "RemoveContainer" containerID="08aeb652e17b8ac770418cdeb0dcdbb0bd030687398be025423024f871986e7c" Apr 16 20:16:08.106553 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.106501 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 20:16:08.106553 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.106522 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 20:16:08.106553 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.106502 2572 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 20:16:08.106553 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.106538 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 20:16:08.106775 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.106647 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 20:16:08.106855 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.106828 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08aeb652e17b8ac770418cdeb0dcdbb0bd030687398be025423024f871986e7c"} err="failed to get container status \"08aeb652e17b8ac770418cdeb0dcdbb0bd030687398be025423024f871986e7c\": rpc error: code = NotFound desc = could not find container \"08aeb652e17b8ac770418cdeb0dcdbb0bd030687398be025423024f871986e7c\": container with ID starting with 08aeb652e17b8ac770418cdeb0dcdbb0bd030687398be025423024f871986e7c not found: ID does not exist" Apr 16 20:16:08.106855 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.106855 2572 scope.go:117] "RemoveContainer" containerID="c02375bb669ac5c088aeedd4a36bfed892009be2b049bfa67f6b439c2385bb67" Apr 16 20:16:08.107025 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.106857 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-gcm88\"" Apr 16 20:16:08.107108 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.107091 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-4h7pohcsqshto\"" Apr 16 20:16:08.107246 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.107134 2572 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 16 20:16:08.107336 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.107265 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c02375bb669ac5c088aeedd4a36bfed892009be2b049bfa67f6b439c2385bb67"} err="failed to get container status \"c02375bb669ac5c088aeedd4a36bfed892009be2b049bfa67f6b439c2385bb67\": rpc error: code = NotFound desc = could not find container \"c02375bb669ac5c088aeedd4a36bfed892009be2b049bfa67f6b439c2385bb67\": container with ID starting with c02375bb669ac5c088aeedd4a36bfed892009be2b049bfa67f6b439c2385bb67 not found: ID does not exist"
Apr 16 20:16:08.107336 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.107290 2572 scope.go:117] "RemoveContainer" containerID="d4ebe45ae76dd749b487a49f3da3975492a995dacf80fb95710c866aaeb76ea4"
Apr 16 20:16:08.107336 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.107313 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 16 20:16:08.107484 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.107412 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 16 20:16:08.107484 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.107461 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 16 20:16:08.107596 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.107566 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4ebe45ae76dd749b487a49f3da3975492a995dacf80fb95710c866aaeb76ea4"} err="failed to get container status \"d4ebe45ae76dd749b487a49f3da3975492a995dacf80fb95710c866aaeb76ea4\": rpc error: code = NotFound desc = could not find container \"d4ebe45ae76dd749b487a49f3da3975492a995dacf80fb95710c866aaeb76ea4\": container with ID starting with d4ebe45ae76dd749b487a49f3da3975492a995dacf80fb95710c866aaeb76ea4 not found: ID does not exist"
Apr 16 20:16:08.107596 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.107593 2572 scope.go:117] "RemoveContainer" containerID="cb7215972689cc6c71521f2d70f82e751e1ee7e79a5f129eb2e1caf96fea7175"
Apr 16 20:16:08.107716 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.107699 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 16 20:16:08.108005 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.107975 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb7215972689cc6c71521f2d70f82e751e1ee7e79a5f129eb2e1caf96fea7175"} err="failed to get container status \"cb7215972689cc6c71521f2d70f82e751e1ee7e79a5f129eb2e1caf96fea7175\": rpc error: code = NotFound desc = could not find container \"cb7215972689cc6c71521f2d70f82e751e1ee7e79a5f129eb2e1caf96fea7175\": container with ID starting with cb7215972689cc6c71521f2d70f82e751e1ee7e79a5f129eb2e1caf96fea7175 not found: ID does not exist"
Apr 16 20:16:08.108085 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.108005 2572 scope.go:117] "RemoveContainer" containerID="0d5f5b226e087df52a35f3f5486025bf68c782d469d540dc86a9a91ff0fa6f44"
Apr 16 20:16:08.108340 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.108314 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d5f5b226e087df52a35f3f5486025bf68c782d469d540dc86a9a91ff0fa6f44"} err="failed to get container status \"0d5f5b226e087df52a35f3f5486025bf68c782d469d540dc86a9a91ff0fa6f44\": rpc error: code = NotFound desc = could not find container \"0d5f5b226e087df52a35f3f5486025bf68c782d469d540dc86a9a91ff0fa6f44\": container with ID starting with 0d5f5b226e087df52a35f3f5486025bf68c782d469d540dc86a9a91ff0fa6f44 not found: ID does not exist"
Apr 16 20:16:08.108420 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.108343 2572 scope.go:117] "RemoveContainer" containerID="81845c62ce865d3fc388a94ba9a651cffb0c8e4b451dc1f682628c88d0e5f5ca"
Apr 16 20:16:08.108607 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.108578 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81845c62ce865d3fc388a94ba9a651cffb0c8e4b451dc1f682628c88d0e5f5ca"} err="failed to get container status \"81845c62ce865d3fc388a94ba9a651cffb0c8e4b451dc1f682628c88d0e5f5ca\": rpc error: code = NotFound desc = could not find container \"81845c62ce865d3fc388a94ba9a651cffb0c8e4b451dc1f682628c88d0e5f5ca\": container with ID starting with 81845c62ce865d3fc388a94ba9a651cffb0c8e4b451dc1f682628c88d0e5f5ca not found: ID does not exist"
Apr 16 20:16:08.108607 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.108605 2572 scope.go:117] "RemoveContainer" containerID="b04dbdc36a4d9c8bfc1dd84ff14054966613e78f4361ff750c64ab5f702d212a"
Apr 16 20:16:08.108925 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.108898 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b04dbdc36a4d9c8bfc1dd84ff14054966613e78f4361ff750c64ab5f702d212a"} err="failed to get container status \"b04dbdc36a4d9c8bfc1dd84ff14054966613e78f4361ff750c64ab5f702d212a\": rpc error: code = NotFound desc = could not find container \"b04dbdc36a4d9c8bfc1dd84ff14054966613e78f4361ff750c64ab5f702d212a\": container with ID starting with b04dbdc36a4d9c8bfc1dd84ff14054966613e78f4361ff750c64ab5f702d212a not found: ID does not exist"
Apr 16 20:16:08.108925 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.108925 2572 scope.go:117] "RemoveContainer" containerID="08aeb652e17b8ac770418cdeb0dcdbb0bd030687398be025423024f871986e7c"
Apr 16 20:16:08.109185 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.109164 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08aeb652e17b8ac770418cdeb0dcdbb0bd030687398be025423024f871986e7c"} err="failed to get container status \"08aeb652e17b8ac770418cdeb0dcdbb0bd030687398be025423024f871986e7c\": rpc error: code = NotFound desc = could not find container \"08aeb652e17b8ac770418cdeb0dcdbb0bd030687398be025423024f871986e7c\": container with ID starting with 08aeb652e17b8ac770418cdeb0dcdbb0bd030687398be025423024f871986e7c not found: ID does not exist"
Apr 16 20:16:08.109251 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.109187 2572 scope.go:117] "RemoveContainer" containerID="c02375bb669ac5c088aeedd4a36bfed892009be2b049bfa67f6b439c2385bb67"
Apr 16 20:16:08.109405 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.109385 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c02375bb669ac5c088aeedd4a36bfed892009be2b049bfa67f6b439c2385bb67"} err="failed to get container status \"c02375bb669ac5c088aeedd4a36bfed892009be2b049bfa67f6b439c2385bb67\": rpc error: code = NotFound desc = could not find container \"c02375bb669ac5c088aeedd4a36bfed892009be2b049bfa67f6b439c2385bb67\": container with ID starting with c02375bb669ac5c088aeedd4a36bfed892009be2b049bfa67f6b439c2385bb67 not found: ID does not exist"
Apr 16 20:16:08.109481 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.109408 2572 scope.go:117] "RemoveContainer" containerID="d4ebe45ae76dd749b487a49f3da3975492a995dacf80fb95710c866aaeb76ea4"
Apr 16 20:16:08.109646 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.109628 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 16 20:16:08.109719 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.109652 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4ebe45ae76dd749b487a49f3da3975492a995dacf80fb95710c866aaeb76ea4"} err="failed to get container status \"d4ebe45ae76dd749b487a49f3da3975492a995dacf80fb95710c866aaeb76ea4\": rpc error: code = NotFound desc = could not find container \"d4ebe45ae76dd749b487a49f3da3975492a995dacf80fb95710c866aaeb76ea4\": container with ID starting with d4ebe45ae76dd749b487a49f3da3975492a995dacf80fb95710c866aaeb76ea4 not found: ID does not exist"
Apr 16 20:16:08.109719 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.109673 2572 scope.go:117] "RemoveContainer" containerID="cb7215972689cc6c71521f2d70f82e751e1ee7e79a5f129eb2e1caf96fea7175"
Apr 16 20:16:08.109935 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.109916 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb7215972689cc6c71521f2d70f82e751e1ee7e79a5f129eb2e1caf96fea7175"} err="failed to get container status \"cb7215972689cc6c71521f2d70f82e751e1ee7e79a5f129eb2e1caf96fea7175\": rpc error: code = NotFound desc = could not find container \"cb7215972689cc6c71521f2d70f82e751e1ee7e79a5f129eb2e1caf96fea7175\": container with ID starting with cb7215972689cc6c71521f2d70f82e751e1ee7e79a5f129eb2e1caf96fea7175 not found: ID does not exist"
Apr 16 20:16:08.109986 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.109940 2572 scope.go:117] "RemoveContainer" containerID="0d5f5b226e087df52a35f3f5486025bf68c782d469d540dc86a9a91ff0fa6f44"
Apr 16 20:16:08.110186 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.110165 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d5f5b226e087df52a35f3f5486025bf68c782d469d540dc86a9a91ff0fa6f44"} err="failed to get container status \"0d5f5b226e087df52a35f3f5486025bf68c782d469d540dc86a9a91ff0fa6f44\": rpc error: code = NotFound desc = could not find container \"0d5f5b226e087df52a35f3f5486025bf68c782d469d540dc86a9a91ff0fa6f44\": container with ID starting with 0d5f5b226e087df52a35f3f5486025bf68c782d469d540dc86a9a91ff0fa6f44 not found: ID does not exist"
Apr 16 20:16:08.110186 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.110185 2572 scope.go:117] "RemoveContainer" containerID="81845c62ce865d3fc388a94ba9a651cffb0c8e4b451dc1f682628c88d0e5f5ca"
Apr 16 20:16:08.110438 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.110418 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81845c62ce865d3fc388a94ba9a651cffb0c8e4b451dc1f682628c88d0e5f5ca"} err="failed to get container status \"81845c62ce865d3fc388a94ba9a651cffb0c8e4b451dc1f682628c88d0e5f5ca\": rpc error: code = NotFound desc = could not find container \"81845c62ce865d3fc388a94ba9a651cffb0c8e4b451dc1f682628c88d0e5f5ca\": container with ID starting with 81845c62ce865d3fc388a94ba9a651cffb0c8e4b451dc1f682628c88d0e5f5ca not found: ID does not exist"
Apr 16 20:16:08.110438 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.110436 2572 scope.go:117] "RemoveContainer" containerID="b04dbdc36a4d9c8bfc1dd84ff14054966613e78f4361ff750c64ab5f702d212a"
Apr 16 20:16:08.110681 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.110660 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b04dbdc36a4d9c8bfc1dd84ff14054966613e78f4361ff750c64ab5f702d212a"} err="failed to get container status \"b04dbdc36a4d9c8bfc1dd84ff14054966613e78f4361ff750c64ab5f702d212a\": rpc error: code = NotFound desc = could not find container \"b04dbdc36a4d9c8bfc1dd84ff14054966613e78f4361ff750c64ab5f702d212a\": container with ID starting with b04dbdc36a4d9c8bfc1dd84ff14054966613e78f4361ff750c64ab5f702d212a not found: ID does not exist"
Apr 16 20:16:08.110681 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.110681 2572 scope.go:117] "RemoveContainer" containerID="08aeb652e17b8ac770418cdeb0dcdbb0bd030687398be025423024f871986e7c"
Apr 16 20:16:08.110939 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.110919 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08aeb652e17b8ac770418cdeb0dcdbb0bd030687398be025423024f871986e7c"} err="failed to get container status \"08aeb652e17b8ac770418cdeb0dcdbb0bd030687398be025423024f871986e7c\": rpc error: code = NotFound desc = could not find container \"08aeb652e17b8ac770418cdeb0dcdbb0bd030687398be025423024f871986e7c\": container with ID starting with 08aeb652e17b8ac770418cdeb0dcdbb0bd030687398be025423024f871986e7c not found: ID does not exist"
Apr 16 20:16:08.112748 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.112419 2572 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:16:08.112748 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.112440 2572 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:16:08.112748 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.112459 2572 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-secret-metrics-client-certs\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:16:08.112748 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.112473 2572 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-config\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:16:08.112748 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.112486 2572 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-web-config\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:16:08.112748 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.112501 2572 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-secret-grpc-tls\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:16:08.112748 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.112514 2572 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-config-out\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:16:08.112748 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.112527 2572 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-thanos-prometheus-http-client-file\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:16:08.112748 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.112541 2572 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-secret-prometheus-k8s-tls\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:16:08.112748 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.112556 2572 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:16:08.112748 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.112570 2572 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-tls-assets\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:16:08.112748 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.112584 2572 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-prometheus-k8s-db\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:16:08.112748 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.112599 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xzmft\" (UniqueName: \"kubernetes.io/projected/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-kube-api-access-xzmft\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:16:08.112748 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.112613 2572 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-prometheus-trusted-ca-bundle\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:16:08.112748 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.112627 2572 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-configmap-metrics-client-ca\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:16:08.112748 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.112645 2572 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:16:08.112748 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.112678 2572 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70-secret-kube-rbac-proxy\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:16:08.113505 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.113310 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 16 20:16:08.121441 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.121420 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 20:16:08.213829 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.213788 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9422c34e-0f4b-499a-bf52-61c9b32c315a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.214012 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.213854 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/9422c34e-0f4b-499a-bf52-61c9b32c315a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.214012 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.213911 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9422c34e-0f4b-499a-bf52-61c9b32c315a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.214012 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.213936 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9422c34e-0f4b-499a-bf52-61c9b32c315a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.214012 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.213963 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9422c34e-0f4b-499a-bf52-61c9b32c315a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.214198 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.214033 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9422c34e-0f4b-499a-bf52-61c9b32c315a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.214198 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.214090 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7kdc\" (UniqueName: \"kubernetes.io/projected/9422c34e-0f4b-499a-bf52-61c9b32c315a-kube-api-access-z7kdc\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.214198 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.214112 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/9422c34e-0f4b-499a-bf52-61c9b32c315a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.214198 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.214129 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/9422c34e-0f4b-499a-bf52-61c9b32c315a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.214198 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.214157 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9422c34e-0f4b-499a-bf52-61c9b32c315a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.214198 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.214174 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9422c34e-0f4b-499a-bf52-61c9b32c315a-config\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.214198 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.214191 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9422c34e-0f4b-499a-bf52-61c9b32c315a-config-out\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.214424 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.214218 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9422c34e-0f4b-499a-bf52-61c9b32c315a-web-config\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.214424 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.214286 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9422c34e-0f4b-499a-bf52-61c9b32c315a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.214424 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.214349 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9422c34e-0f4b-499a-bf52-61c9b32c315a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.214424 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.214376 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9422c34e-0f4b-499a-bf52-61c9b32c315a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.214424 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.214404 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9422c34e-0f4b-499a-bf52-61c9b32c315a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.214424 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.214421 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9422c34e-0f4b-499a-bf52-61c9b32c315a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.315228 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.315128 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9422c34e-0f4b-499a-bf52-61c9b32c315a-web-config\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.315228 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.315192 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9422c34e-0f4b-499a-bf52-61c9b32c315a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.315427 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.315233 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9422c34e-0f4b-499a-bf52-61c9b32c315a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.315427 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.315266 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9422c34e-0f4b-499a-bf52-61c9b32c315a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.315427 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.315296 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9422c34e-0f4b-499a-bf52-61c9b32c315a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.315427 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.315318 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9422c34e-0f4b-499a-bf52-61c9b32c315a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.315427 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.315352 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9422c34e-0f4b-499a-bf52-61c9b32c315a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.315427 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.315376 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/9422c34e-0f4b-499a-bf52-61c9b32c315a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.315427 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.315402 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9422c34e-0f4b-499a-bf52-61c9b32c315a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.315840 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.315430 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9422c34e-0f4b-499a-bf52-61c9b32c315a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.315840 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.315460 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9422c34e-0f4b-499a-bf52-61c9b32c315a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.315840 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.315488 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9422c34e-0f4b-499a-bf52-61c9b32c315a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.315840 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.315525 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z7kdc\" (UniqueName: \"kubernetes.io/projected/9422c34e-0f4b-499a-bf52-61c9b32c315a-kube-api-access-z7kdc\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.315840 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.315554 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/9422c34e-0f4b-499a-bf52-61c9b32c315a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.315840 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.315580 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/9422c34e-0f4b-499a-bf52-61c9b32c315a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.315840 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.315610 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9422c34e-0f4b-499a-bf52-61c9b32c315a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.315840 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.315669 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9422c34e-0f4b-499a-bf52-61c9b32c315a-config\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.315840 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.315696 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9422c34e-0f4b-499a-bf52-61c9b32c315a-config-out\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.316308 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.316211 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9422c34e-0f4b-499a-bf52-61c9b32c315a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.316308 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.316233 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9422c34e-0f4b-499a-bf52-61c9b32c315a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.318243 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.318210 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9422c34e-0f4b-499a-bf52-61c9b32c315a-config-out\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.318414 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.318389 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9422c34e-0f4b-499a-bf52-61c9b32c315a-web-config\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.319144 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.318477 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9422c34e-0f4b-499a-bf52-61c9b32c315a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.319144 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.318507 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9422c34e-0f4b-499a-bf52-61c9b32c315a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.319144 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.319054 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9422c34e-0f4b-499a-bf52-61c9b32c315a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.319325 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.319201 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9422c34e-0f4b-499a-bf52-61c9b32c315a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.319325 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.319252 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9422c34e-0f4b-499a-bf52-61c9b32c315a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.319486 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.319467 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/9422c34e-0f4b-499a-bf52-61c9b32c315a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.319619 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.319583 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/9422c34e-0f4b-499a-bf52-61c9b32c315a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.319619 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.319609 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9422c34e-0f4b-499a-bf52-61c9b32c315a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.320080 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.320061 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9422c34e-0f4b-499a-bf52-61c9b32c315a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.320470 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.320449 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/9422c34e-0f4b-499a-bf52-61c9b32c315a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.320999 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.320980 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9422c34e-0f4b-499a-bf52-61c9b32c315a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.321273 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.321253 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9422c34e-0f4b-499a-bf52-61c9b32c315a-config\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.321343 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.321321 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9422c34e-0f4b-499a-bf52-61c9b32c315a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.323586 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.323566 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7kdc\" (UniqueName: \"kubernetes.io/projected/9422c34e-0f4b-499a-bf52-61c9b32c315a-kube-api-access-z7kdc\") pod \"prometheus-k8s-0\" (UID: \"9422c34e-0f4b-499a-bf52-61c9b32c315a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:16:08.416232 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.416202 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:08.542711 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:08.542682 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 20:16:08.544710 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:16:08.544681 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9422c34e_0f4b_499a_bf52_61c9b32c315a.slice/crio-5930ddbf28859474134487ce1ec93f2eb9fa349a3b110cb52b42e6a0509d9354 WatchSource:0}: Error finding container 5930ddbf28859474134487ce1ec93f2eb9fa349a3b110cb52b42e6a0509d9354: Status 404 returned error can't find the container with id 5930ddbf28859474134487ce1ec93f2eb9fa349a3b110cb52b42e6a0509d9354 Apr 16 20:16:09.052552 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:09.052516 2572 generic.go:358] "Generic (PLEG): container finished" podID="9422c34e-0f4b-499a-bf52-61c9b32c315a" containerID="a780b735dca6ff07ebfd33b693bd2c553e7ca3365738b47c0378caa31eed938d" exitCode=0 Apr 16 20:16:09.052988 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:09.052574 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9422c34e-0f4b-499a-bf52-61c9b32c315a","Type":"ContainerDied","Data":"a780b735dca6ff07ebfd33b693bd2c553e7ca3365738b47c0378caa31eed938d"} Apr 16 20:16:09.052988 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:09.052599 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9422c34e-0f4b-499a-bf52-61c9b32c315a","Type":"ContainerStarted","Data":"5930ddbf28859474134487ce1ec93f2eb9fa349a3b110cb52b42e6a0509d9354"} Apr 16 20:16:09.148956 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:09.148926 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70" 
path="/var/lib/kubelet/pods/7fc0a85a-5ea2-42de-a9f2-8b1399fc3e70/volumes" Apr 16 20:16:10.058705 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:10.058669 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9422c34e-0f4b-499a-bf52-61c9b32c315a","Type":"ContainerStarted","Data":"9a590d9a8d266c64947e8982ad1ad9aab77c8441c80b3542ff8acd4b4850baf9"} Apr 16 20:16:10.058705 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:10.058710 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9422c34e-0f4b-499a-bf52-61c9b32c315a","Type":"ContainerStarted","Data":"e6616a4d235e2b6124ee080a56d7ee809f69397281538beebc5ca480d00c9e33"} Apr 16 20:16:10.059118 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:10.058724 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9422c34e-0f4b-499a-bf52-61c9b32c315a","Type":"ContainerStarted","Data":"0b405774960fb1fae46d6eafd8f206b36784827378b76632510fc22d3706b036"} Apr 16 20:16:10.059118 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:10.058738 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9422c34e-0f4b-499a-bf52-61c9b32c315a","Type":"ContainerStarted","Data":"92d08be27fbc64af7e18a190db468aba0e0c2d66243367f24dcfb3bb15768168"} Apr 16 20:16:10.059118 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:10.058752 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9422c34e-0f4b-499a-bf52-61c9b32c315a","Type":"ContainerStarted","Data":"d0e6f7f9491f046ffbd7b704f98b4e1cd14c33ec48d1ebdd664767274efeb4c2"} Apr 16 20:16:10.059118 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:10.058763 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"9422c34e-0f4b-499a-bf52-61c9b32c315a","Type":"ContainerStarted","Data":"53f96229b909227bb9ed2aa7c7f163de2c6aeb8c97bc06e67112d5ad258030a5"} Apr 16 20:16:10.085807 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:10.085153 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.085135357 podStartE2EDuration="2.085135357s" podCreationTimestamp="2026-04-16 20:16:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:16:10.083108487 +0000 UTC m=+261.520096997" watchObservedRunningTime="2026-04-16 20:16:10.085135357 +0000 UTC m=+261.522123866" Apr 16 20:16:13.417277 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:13.417242 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:16:44.423647 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:44.423614 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-h5zvt"] Apr 16 20:16:44.427573 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:44.427557 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-h5zvt" Apr 16 20:16:44.429845 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:44.429827 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 20:16:44.435087 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:44.435064 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-h5zvt"] Apr 16 20:16:44.614979 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:44.614936 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4ee8f3a2-dd63-4782-a67d-759803bc1b0d-kubelet-config\") pod \"global-pull-secret-syncer-h5zvt\" (UID: \"4ee8f3a2-dd63-4782-a67d-759803bc1b0d\") " pod="kube-system/global-pull-secret-syncer-h5zvt" Apr 16 20:16:44.614979 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:44.614982 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4ee8f3a2-dd63-4782-a67d-759803bc1b0d-original-pull-secret\") pod \"global-pull-secret-syncer-h5zvt\" (UID: \"4ee8f3a2-dd63-4782-a67d-759803bc1b0d\") " pod="kube-system/global-pull-secret-syncer-h5zvt" Apr 16 20:16:44.615207 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:44.615079 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4ee8f3a2-dd63-4782-a67d-759803bc1b0d-dbus\") pod \"global-pull-secret-syncer-h5zvt\" (UID: \"4ee8f3a2-dd63-4782-a67d-759803bc1b0d\") " pod="kube-system/global-pull-secret-syncer-h5zvt" Apr 16 20:16:44.715566 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:44.715494 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/4ee8f3a2-dd63-4782-a67d-759803bc1b0d-kubelet-config\") pod \"global-pull-secret-syncer-h5zvt\" (UID: \"4ee8f3a2-dd63-4782-a67d-759803bc1b0d\") " pod="kube-system/global-pull-secret-syncer-h5zvt" Apr 16 20:16:44.715566 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:44.715528 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4ee8f3a2-dd63-4782-a67d-759803bc1b0d-original-pull-secret\") pod \"global-pull-secret-syncer-h5zvt\" (UID: \"4ee8f3a2-dd63-4782-a67d-759803bc1b0d\") " pod="kube-system/global-pull-secret-syncer-h5zvt" Apr 16 20:16:44.715566 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:44.715565 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4ee8f3a2-dd63-4782-a67d-759803bc1b0d-dbus\") pod \"global-pull-secret-syncer-h5zvt\" (UID: \"4ee8f3a2-dd63-4782-a67d-759803bc1b0d\") " pod="kube-system/global-pull-secret-syncer-h5zvt" Apr 16 20:16:44.715842 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:44.715613 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4ee8f3a2-dd63-4782-a67d-759803bc1b0d-kubelet-config\") pod \"global-pull-secret-syncer-h5zvt\" (UID: \"4ee8f3a2-dd63-4782-a67d-759803bc1b0d\") " pod="kube-system/global-pull-secret-syncer-h5zvt" Apr 16 20:16:44.715842 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:44.715710 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4ee8f3a2-dd63-4782-a67d-759803bc1b0d-dbus\") pod \"global-pull-secret-syncer-h5zvt\" (UID: \"4ee8f3a2-dd63-4782-a67d-759803bc1b0d\") " pod="kube-system/global-pull-secret-syncer-h5zvt" Apr 16 20:16:44.717795 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:44.717774 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4ee8f3a2-dd63-4782-a67d-759803bc1b0d-original-pull-secret\") pod \"global-pull-secret-syncer-h5zvt\" (UID: \"4ee8f3a2-dd63-4782-a67d-759803bc1b0d\") " pod="kube-system/global-pull-secret-syncer-h5zvt" Apr 16 20:16:44.736886 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:44.736859 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h5zvt" Apr 16 20:16:44.852744 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:44.852676 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-h5zvt"] Apr 16 20:16:44.855406 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:16:44.855372 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ee8f3a2_dd63_4782_a67d_759803bc1b0d.slice/crio-5746e00571be44104252291bc08d12984125a3d89e1968179658ff7228fa6474 WatchSource:0}: Error finding container 5746e00571be44104252291bc08d12984125a3d89e1968179658ff7228fa6474: Status 404 returned error can't find the container with id 5746e00571be44104252291bc08d12984125a3d89e1968179658ff7228fa6474 Apr 16 20:16:45.163236 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:45.163202 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-h5zvt" event={"ID":"4ee8f3a2-dd63-4782-a67d-759803bc1b0d","Type":"ContainerStarted","Data":"5746e00571be44104252291bc08d12984125a3d89e1968179658ff7228fa6474"} Apr 16 20:16:49.077031 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:49.077005 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9gf47_10232a70-9b5f-414a-8efd-b5cff05a4f12/ovn-acl-logging/0.log" Apr 16 20:16:49.077455 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:49.077438 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9gf47_10232a70-9b5f-414a-8efd-b5cff05a4f12/ovn-acl-logging/0.log" Apr 16 20:16:49.082547 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:49.082529 2572 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 20:16:50.181677 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:50.181638 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-h5zvt" event={"ID":"4ee8f3a2-dd63-4782-a67d-759803bc1b0d","Type":"ContainerStarted","Data":"ab0eb4bc235c30a9df8fdd458e00989ea4a01709a0561a70d7dfa8b13f647b95"} Apr 16 20:16:50.198778 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:16:50.198734 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-h5zvt" podStartSLOduration=1.939398528 podStartE2EDuration="6.198721527s" podCreationTimestamp="2026-04-16 20:16:44 +0000 UTC" firstStartedPulling="2026-04-16 20:16:44.857075426 +0000 UTC m=+296.294063912" lastFinishedPulling="2026-04-16 20:16:49.116398422 +0000 UTC m=+300.553386911" observedRunningTime="2026-04-16 20:16:50.197283838 +0000 UTC m=+301.634272357" watchObservedRunningTime="2026-04-16 20:16:50.198721527 +0000 UTC m=+301.635710035" Apr 16 20:17:08.416779 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:17:08.416749 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:17:08.432279 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:17:08.432255 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:17:09.250320 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:17:09.250292 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:18:51.118736 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:18:51.118699 2572 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["cert-manager/cert-manager-webhook-597b96b99b-pphr6"] Apr 16 20:18:51.122036 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:18:51.122019 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-pphr6" Apr 16 20:18:51.124392 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:18:51.124371 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-dszb8\"" Apr 16 20:18:51.124639 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:18:51.124622 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 16 20:18:51.125130 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:18:51.125116 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 16 20:18:51.130495 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:18:51.130471 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-pphr6"] Apr 16 20:18:51.148784 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:18:51.148763 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f74v8\" (UniqueName: \"kubernetes.io/projected/e4a192af-5f2a-4074-9896-e1f02c5cc5d0-kube-api-access-f74v8\") pod \"cert-manager-webhook-597b96b99b-pphr6\" (UID: \"e4a192af-5f2a-4074-9896-e1f02c5cc5d0\") " pod="cert-manager/cert-manager-webhook-597b96b99b-pphr6" Apr 16 20:18:51.148895 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:18:51.148800 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e4a192af-5f2a-4074-9896-e1f02c5cc5d0-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-pphr6\" (UID: \"e4a192af-5f2a-4074-9896-e1f02c5cc5d0\") " 
pod="cert-manager/cert-manager-webhook-597b96b99b-pphr6" Apr 16 20:18:51.249934 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:18:51.249904 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f74v8\" (UniqueName: \"kubernetes.io/projected/e4a192af-5f2a-4074-9896-e1f02c5cc5d0-kube-api-access-f74v8\") pod \"cert-manager-webhook-597b96b99b-pphr6\" (UID: \"e4a192af-5f2a-4074-9896-e1f02c5cc5d0\") " pod="cert-manager/cert-manager-webhook-597b96b99b-pphr6" Apr 16 20:18:51.250039 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:18:51.249970 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e4a192af-5f2a-4074-9896-e1f02c5cc5d0-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-pphr6\" (UID: \"e4a192af-5f2a-4074-9896-e1f02c5cc5d0\") " pod="cert-manager/cert-manager-webhook-597b96b99b-pphr6" Apr 16 20:18:51.257744 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:18:51.257715 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e4a192af-5f2a-4074-9896-e1f02c5cc5d0-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-pphr6\" (UID: \"e4a192af-5f2a-4074-9896-e1f02c5cc5d0\") " pod="cert-manager/cert-manager-webhook-597b96b99b-pphr6" Apr 16 20:18:51.257916 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:18:51.257861 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f74v8\" (UniqueName: \"kubernetes.io/projected/e4a192af-5f2a-4074-9896-e1f02c5cc5d0-kube-api-access-f74v8\") pod \"cert-manager-webhook-597b96b99b-pphr6\" (UID: \"e4a192af-5f2a-4074-9896-e1f02c5cc5d0\") " pod="cert-manager/cert-manager-webhook-597b96b99b-pphr6" Apr 16 20:18:51.443056 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:18:51.442972 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-pphr6" Apr 16 20:18:51.561244 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:18:51.561183 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-pphr6"] Apr 16 20:18:51.563631 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:18:51.563607 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4a192af_5f2a_4074_9896_e1f02c5cc5d0.slice/crio-277d12728dec766902c4d99d7273611ea220875a1375ff315363c6904aa681bd WatchSource:0}: Error finding container 277d12728dec766902c4d99d7273611ea220875a1375ff315363c6904aa681bd: Status 404 returned error can't find the container with id 277d12728dec766902c4d99d7273611ea220875a1375ff315363c6904aa681bd Apr 16 20:18:51.565348 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:18:51.565332 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:18:51.666237 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:18:51.666211 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-fp8m4"] Apr 16 20:18:51.670493 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:18:51.670478 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-fp8m4" Apr 16 20:18:51.672599 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:18:51.672583 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-rbnqx\"" Apr 16 20:18:51.677889 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:18:51.677846 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-fp8m4"] Apr 16 20:18:51.754215 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:18:51.754188 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr6q6\" (UniqueName: \"kubernetes.io/projected/df8aac33-1091-48d7-a01c-304f08b3014c-kube-api-access-hr6q6\") pod \"cert-manager-759f64656b-fp8m4\" (UID: \"df8aac33-1091-48d7-a01c-304f08b3014c\") " pod="cert-manager/cert-manager-759f64656b-fp8m4" Apr 16 20:18:51.754326 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:18:51.754226 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/df8aac33-1091-48d7-a01c-304f08b3014c-bound-sa-token\") pod \"cert-manager-759f64656b-fp8m4\" (UID: \"df8aac33-1091-48d7-a01c-304f08b3014c\") " pod="cert-manager/cert-manager-759f64656b-fp8m4" Apr 16 20:18:51.855594 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:18:51.855562 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hr6q6\" (UniqueName: \"kubernetes.io/projected/df8aac33-1091-48d7-a01c-304f08b3014c-kube-api-access-hr6q6\") pod \"cert-manager-759f64656b-fp8m4\" (UID: \"df8aac33-1091-48d7-a01c-304f08b3014c\") " pod="cert-manager/cert-manager-759f64656b-fp8m4" Apr 16 20:18:51.855743 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:18:51.855640 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/df8aac33-1091-48d7-a01c-304f08b3014c-bound-sa-token\") pod \"cert-manager-759f64656b-fp8m4\" (UID: \"df8aac33-1091-48d7-a01c-304f08b3014c\") " pod="cert-manager/cert-manager-759f64656b-fp8m4" Apr 16 20:18:51.864165 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:18:51.864133 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/df8aac33-1091-48d7-a01c-304f08b3014c-bound-sa-token\") pod \"cert-manager-759f64656b-fp8m4\" (UID: \"df8aac33-1091-48d7-a01c-304f08b3014c\") " pod="cert-manager/cert-manager-759f64656b-fp8m4" Apr 16 20:18:51.864283 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:18:51.864266 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr6q6\" (UniqueName: \"kubernetes.io/projected/df8aac33-1091-48d7-a01c-304f08b3014c-kube-api-access-hr6q6\") pod \"cert-manager-759f64656b-fp8m4\" (UID: \"df8aac33-1091-48d7-a01c-304f08b3014c\") " pod="cert-manager/cert-manager-759f64656b-fp8m4" Apr 16 20:18:51.980645 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:18:51.980614 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-fp8m4" Apr 16 20:18:52.096721 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:18:52.096697 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-fp8m4"] Apr 16 20:18:52.098767 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:18:52.098725 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf8aac33_1091_48d7_a01c_304f08b3014c.slice/crio-aabe9ddec1204e0bcc156f751a88c0f5b2f8f7c1543dd5eedab8156aefc50d24 WatchSource:0}: Error finding container aabe9ddec1204e0bcc156f751a88c0f5b2f8f7c1543dd5eedab8156aefc50d24: Status 404 returned error can't find the container with id aabe9ddec1204e0bcc156f751a88c0f5b2f8f7c1543dd5eedab8156aefc50d24 Apr 16 20:18:52.533134 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:18:52.533095 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-fp8m4" event={"ID":"df8aac33-1091-48d7-a01c-304f08b3014c","Type":"ContainerStarted","Data":"aabe9ddec1204e0bcc156f751a88c0f5b2f8f7c1543dd5eedab8156aefc50d24"} Apr 16 20:18:52.534406 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:18:52.534377 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-pphr6" event={"ID":"e4a192af-5f2a-4074-9896-e1f02c5cc5d0","Type":"ContainerStarted","Data":"277d12728dec766902c4d99d7273611ea220875a1375ff315363c6904aa681bd"} Apr 16 20:18:55.544250 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:18:55.544211 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-pphr6" event={"ID":"e4a192af-5f2a-4074-9896-e1f02c5cc5d0","Type":"ContainerStarted","Data":"2d49a0552054d2da14ba16876a5f84fd63758aa48b63ed06ba644233e54918f0"} Apr 16 20:18:55.544692 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:18:55.544285 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not 
ready" pod="cert-manager/cert-manager-webhook-597b96b99b-pphr6" Apr 16 20:18:55.545648 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:18:55.545622 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-fp8m4" event={"ID":"df8aac33-1091-48d7-a01c-304f08b3014c","Type":"ContainerStarted","Data":"fad0360eaeedfd525056ab7f6dbb008d7bd36c0d941ddde7b322ab577d2314ef"} Apr 16 20:18:55.566377 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:18:55.566331 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-pphr6" podStartSLOduration=0.92373624 podStartE2EDuration="4.566316313s" podCreationTimestamp="2026-04-16 20:18:51 +0000 UTC" firstStartedPulling="2026-04-16 20:18:51.565459922 +0000 UTC m=+423.002448409" lastFinishedPulling="2026-04-16 20:18:55.208039996 +0000 UTC m=+426.645028482" observedRunningTime="2026-04-16 20:18:55.564480295 +0000 UTC m=+427.001468802" watchObservedRunningTime="2026-04-16 20:18:55.566316313 +0000 UTC m=+427.003304820" Apr 16 20:18:55.580828 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:18:55.580772 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-fp8m4" podStartSLOduration=1.463108887 podStartE2EDuration="4.580757198s" podCreationTimestamp="2026-04-16 20:18:51 +0000 UTC" firstStartedPulling="2026-04-16 20:18:52.100542759 +0000 UTC m=+423.537531248" lastFinishedPulling="2026-04-16 20:18:55.218191066 +0000 UTC m=+426.655179559" observedRunningTime="2026-04-16 20:18:55.57998975 +0000 UTC m=+427.016978258" watchObservedRunningTime="2026-04-16 20:18:55.580757198 +0000 UTC m=+427.017745705" Apr 16 20:19:01.550530 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:19:01.550457 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-pphr6" Apr 16 20:19:31.742562 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:19:31.742523 2572 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-986f4df7-wvgks"]
Apr 16 20:19:31.749252 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:19:31.749234 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-986f4df7-wvgks"
Apr 16 20:19:31.752862 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:19:31.752838 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 16 20:19:31.753622 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:19:31.753606 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 16 20:19:31.753721 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:19:31.753635 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-qc9df\""
Apr 16 20:19:31.753721 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:19:31.753647 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 16 20:19:31.753721 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:19:31.753646 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 16 20:19:31.753721 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:19:31.753641 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 16 20:19:31.757001 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:19:31.756770 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-986f4df7-wvgks"]
Apr 16 20:19:31.888779 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:19:31.888743 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/e68d2697-1a8a-450e-8bbd-a1d3bd0decdd-metrics-cert\") pod \"lws-controller-manager-986f4df7-wvgks\" (UID: \"e68d2697-1a8a-450e-8bbd-a1d3bd0decdd\") " pod="openshift-lws-operator/lws-controller-manager-986f4df7-wvgks"
Apr 16 20:19:31.888985 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:19:31.888793 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx6wl\" (UniqueName: \"kubernetes.io/projected/e68d2697-1a8a-450e-8bbd-a1d3bd0decdd-kube-api-access-bx6wl\") pod \"lws-controller-manager-986f4df7-wvgks\" (UID: \"e68d2697-1a8a-450e-8bbd-a1d3bd0decdd\") " pod="openshift-lws-operator/lws-controller-manager-986f4df7-wvgks"
Apr 16 20:19:31.888985 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:19:31.888859 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/e68d2697-1a8a-450e-8bbd-a1d3bd0decdd-manager-config\") pod \"lws-controller-manager-986f4df7-wvgks\" (UID: \"e68d2697-1a8a-450e-8bbd-a1d3bd0decdd\") " pod="openshift-lws-operator/lws-controller-manager-986f4df7-wvgks"
Apr 16 20:19:31.888985 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:19:31.888971 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e68d2697-1a8a-450e-8bbd-a1d3bd0decdd-cert\") pod \"lws-controller-manager-986f4df7-wvgks\" (UID: \"e68d2697-1a8a-450e-8bbd-a1d3bd0decdd\") " pod="openshift-lws-operator/lws-controller-manager-986f4df7-wvgks"
Apr 16 20:19:31.989788 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:19:31.989753 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/e68d2697-1a8a-450e-8bbd-a1d3bd0decdd-metrics-cert\") pod \"lws-controller-manager-986f4df7-wvgks\" (UID: \"e68d2697-1a8a-450e-8bbd-a1d3bd0decdd\") " pod="openshift-lws-operator/lws-controller-manager-986f4df7-wvgks"
Apr 16 20:19:31.990004 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:19:31.989797 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bx6wl\" (UniqueName: \"kubernetes.io/projected/e68d2697-1a8a-450e-8bbd-a1d3bd0decdd-kube-api-access-bx6wl\") pod \"lws-controller-manager-986f4df7-wvgks\" (UID: \"e68d2697-1a8a-450e-8bbd-a1d3bd0decdd\") " pod="openshift-lws-operator/lws-controller-manager-986f4df7-wvgks"
Apr 16 20:19:31.990004 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:19:31.989976 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/e68d2697-1a8a-450e-8bbd-a1d3bd0decdd-manager-config\") pod \"lws-controller-manager-986f4df7-wvgks\" (UID: \"e68d2697-1a8a-450e-8bbd-a1d3bd0decdd\") " pod="openshift-lws-operator/lws-controller-manager-986f4df7-wvgks"
Apr 16 20:19:31.990119 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:19:31.990083 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e68d2697-1a8a-450e-8bbd-a1d3bd0decdd-cert\") pod \"lws-controller-manager-986f4df7-wvgks\" (UID: \"e68d2697-1a8a-450e-8bbd-a1d3bd0decdd\") " pod="openshift-lws-operator/lws-controller-manager-986f4df7-wvgks"
Apr 16 20:19:31.990596 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:19:31.990574 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/e68d2697-1a8a-450e-8bbd-a1d3bd0decdd-manager-config\") pod \"lws-controller-manager-986f4df7-wvgks\" (UID: \"e68d2697-1a8a-450e-8bbd-a1d3bd0decdd\") " pod="openshift-lws-operator/lws-controller-manager-986f4df7-wvgks"
Apr 16 20:19:31.992306 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:19:31.992278 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/e68d2697-1a8a-450e-8bbd-a1d3bd0decdd-metrics-cert\") pod \"lws-controller-manager-986f4df7-wvgks\" (UID: \"e68d2697-1a8a-450e-8bbd-a1d3bd0decdd\") " pod="openshift-lws-operator/lws-controller-manager-986f4df7-wvgks"
Apr 16 20:19:31.992413 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:19:31.992346 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e68d2697-1a8a-450e-8bbd-a1d3bd0decdd-cert\") pod \"lws-controller-manager-986f4df7-wvgks\" (UID: \"e68d2697-1a8a-450e-8bbd-a1d3bd0decdd\") " pod="openshift-lws-operator/lws-controller-manager-986f4df7-wvgks"
Apr 16 20:19:31.999054 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:19:31.999025 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx6wl\" (UniqueName: \"kubernetes.io/projected/e68d2697-1a8a-450e-8bbd-a1d3bd0decdd-kube-api-access-bx6wl\") pod \"lws-controller-manager-986f4df7-wvgks\" (UID: \"e68d2697-1a8a-450e-8bbd-a1d3bd0decdd\") " pod="openshift-lws-operator/lws-controller-manager-986f4df7-wvgks"
Apr 16 20:19:32.060030 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:19:32.059986 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-986f4df7-wvgks"
Apr 16 20:19:32.185582 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:19:32.185557 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-986f4df7-wvgks"]
Apr 16 20:19:32.187628 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:19:32.187591 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode68d2697_1a8a_450e_8bbd_a1d3bd0decdd.slice/crio-feee1ea2d5ef5d27ce03e295f30f039e6f2a3c0b7133378f782039c40cf1d307 WatchSource:0}: Error finding container feee1ea2d5ef5d27ce03e295f30f039e6f2a3c0b7133378f782039c40cf1d307: Status 404 returned error can't find the container with id feee1ea2d5ef5d27ce03e295f30f039e6f2a3c0b7133378f782039c40cf1d307
Apr 16 20:19:32.659723 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:19:32.659687 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-986f4df7-wvgks" event={"ID":"e68d2697-1a8a-450e-8bbd-a1d3bd0decdd","Type":"ContainerStarted","Data":"feee1ea2d5ef5d27ce03e295f30f039e6f2a3c0b7133378f782039c40cf1d307"}
Apr 16 20:19:34.667404 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:19:34.667373 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-986f4df7-wvgks" event={"ID":"e68d2697-1a8a-450e-8bbd-a1d3bd0decdd","Type":"ContainerStarted","Data":"495684ae66f48410de17680d5ec82e1b9a8dce1d16015d090e45c2445ceda0e1"}
Apr 16 20:19:34.667758 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:19:34.667419 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-986f4df7-wvgks"
Apr 16 20:19:34.683784 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:19:34.683739 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-986f4df7-wvgks" podStartSLOduration=1.366471106 podStartE2EDuration="3.683725316s" podCreationTimestamp="2026-04-16 20:19:31 +0000 UTC" firstStartedPulling="2026-04-16 20:19:32.189349027 +0000 UTC m=+463.626337513" lastFinishedPulling="2026-04-16 20:19:34.506603234 +0000 UTC m=+465.943591723" observedRunningTime="2026-04-16 20:19:34.681738456 +0000 UTC m=+466.118726975" watchObservedRunningTime="2026-04-16 20:19:34.683725316 +0000 UTC m=+466.120713824"
Apr 16 20:19:45.673388 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:19:45.673354 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-986f4df7-wvgks"
Apr 16 20:20:16.907081 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:20:16.907045 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-xjsmb"]
Apr 16 20:20:16.911682 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:20:16.911660 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-xjsmb"
Apr 16 20:20:16.917170 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:20:16.917148 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 16 20:20:16.917526 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:20:16.917511 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 16 20:20:16.917677 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:20:16.917661 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-4ktsw\""
Apr 16 20:20:16.923393 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:20:16.923369 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-xjsmb"]
Apr 16 20:20:17.058864 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:20:17.058836 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtw45\" (UniqueName: \"kubernetes.io/projected/080624af-c2d1-4ce7-9bd4-ed7d3c4d61bf-kube-api-access-vtw45\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-xjsmb\" (UID: \"080624af-c2d1-4ce7-9bd4-ed7d3c4d61bf\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-xjsmb"
Apr 16 20:20:17.059036 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:20:17.058890 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/080624af-c2d1-4ce7-9bd4-ed7d3c4d61bf-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-xjsmb\" (UID: \"080624af-c2d1-4ce7-9bd4-ed7d3c4d61bf\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-xjsmb"
Apr 16 20:20:17.160250 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:20:17.160163 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vtw45\" (UniqueName: \"kubernetes.io/projected/080624af-c2d1-4ce7-9bd4-ed7d3c4d61bf-kube-api-access-vtw45\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-xjsmb\" (UID: \"080624af-c2d1-4ce7-9bd4-ed7d3c4d61bf\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-xjsmb"
Apr 16 20:20:17.160250 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:20:17.160212 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/080624af-c2d1-4ce7-9bd4-ed7d3c4d61bf-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-xjsmb\" (UID: \"080624af-c2d1-4ce7-9bd4-ed7d3c4d61bf\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-xjsmb"
Apr 16 20:20:17.160589 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:20:17.160567 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/080624af-c2d1-4ce7-9bd4-ed7d3c4d61bf-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-xjsmb\" (UID: \"080624af-c2d1-4ce7-9bd4-ed7d3c4d61bf\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-xjsmb"
Apr 16 20:20:17.171311 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:20:17.171283 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtw45\" (UniqueName: \"kubernetes.io/projected/080624af-c2d1-4ce7-9bd4-ed7d3c4d61bf-kube-api-access-vtw45\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-xjsmb\" (UID: \"080624af-c2d1-4ce7-9bd4-ed7d3c4d61bf\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-xjsmb"
Apr 16 20:20:17.222222 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:20:17.222196 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-xjsmb"
Apr 16 20:20:17.372431 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:20:17.372402 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-xjsmb"]
Apr 16 20:20:17.374169 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:20:17.374132 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod080624af_c2d1_4ce7_9bd4_ed7d3c4d61bf.slice/crio-cb50011a4303c0a6e3195525f8c24ed846e0b0f0497fba1fe8bd4b463159b4c2 WatchSource:0}: Error finding container cb50011a4303c0a6e3195525f8c24ed846e0b0f0497fba1fe8bd4b463159b4c2: Status 404 returned error can't find the container with id cb50011a4303c0a6e3195525f8c24ed846e0b0f0497fba1fe8bd4b463159b4c2
Apr 16 20:20:17.795337 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:20:17.795301 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-xjsmb" event={"ID":"080624af-c2d1-4ce7-9bd4-ed7d3c4d61bf","Type":"ContainerStarted","Data":"cb50011a4303c0a6e3195525f8c24ed846e0b0f0497fba1fe8bd4b463159b4c2"}
Apr 16 20:20:22.814438 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:20:22.814404 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-xjsmb" event={"ID":"080624af-c2d1-4ce7-9bd4-ed7d3c4d61bf","Type":"ContainerStarted","Data":"9b96e15e405a4ded96a6184c49ff2d11117a7b73857760704a1ee0d9bda89b64"}
Apr 16 20:20:22.814803 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:20:22.814510 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-xjsmb"
Apr 16 20:20:22.832284 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:20:22.832238 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-xjsmb" podStartSLOduration=2.241630109 podStartE2EDuration="6.832225099s" podCreationTimestamp="2026-04-16 20:20:16 +0000 UTC" firstStartedPulling="2026-04-16 20:20:17.376664558 +0000 UTC m=+508.813653044" lastFinishedPulling="2026-04-16 20:20:21.967259548 +0000 UTC m=+513.404248034" observedRunningTime="2026-04-16 20:20:22.830687246 +0000 UTC m=+514.267675765" watchObservedRunningTime="2026-04-16 20:20:22.832225099 +0000 UTC m=+514.269213606"
Apr 16 20:20:33.820020 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:20:33.819935 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-xjsmb"
Apr 16 20:21:04.968322 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:04.968286 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-2j79x"]
Apr 16 20:21:04.971607 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:04.971586 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-2j79x"
Apr 16 20:21:04.973661 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:04.973633 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 16 20:21:04.973772 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:04.973674 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-2bn4s\""
Apr 16 20:21:04.980829 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:04.980805 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-2j79x"]
Apr 16 20:21:05.057558 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:05.057516 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqrzf\" (UniqueName: \"kubernetes.io/projected/8d24da8d-e364-4036-b1eb-d369585645e3-kube-api-access-fqrzf\") pod \"limitador-limitador-64c8f475fb-2j79x\" (UID: \"8d24da8d-e364-4036-b1eb-d369585645e3\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-2j79x"
Apr 16 20:21:05.057721 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:05.057613 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/8d24da8d-e364-4036-b1eb-d369585645e3-config-file\") pod \"limitador-limitador-64c8f475fb-2j79x\" (UID: \"8d24da8d-e364-4036-b1eb-d369585645e3\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-2j79x"
Apr 16 20:21:05.069642 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:05.069608 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-2j79x"]
Apr 16 20:21:05.158814 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:05.158786 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fqrzf\" (UniqueName: \"kubernetes.io/projected/8d24da8d-e364-4036-b1eb-d369585645e3-kube-api-access-fqrzf\") pod \"limitador-limitador-64c8f475fb-2j79x\" (UID: \"8d24da8d-e364-4036-b1eb-d369585645e3\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-2j79x"
Apr 16 20:21:05.159009 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:05.158905 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/8d24da8d-e364-4036-b1eb-d369585645e3-config-file\") pod \"limitador-limitador-64c8f475fb-2j79x\" (UID: \"8d24da8d-e364-4036-b1eb-d369585645e3\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-2j79x"
Apr 16 20:21:05.159486 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:05.159466 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/8d24da8d-e364-4036-b1eb-d369585645e3-config-file\") pod \"limitador-limitador-64c8f475fb-2j79x\" (UID: \"8d24da8d-e364-4036-b1eb-d369585645e3\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-2j79x"
Apr 16 20:21:05.166206 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:05.166183 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqrzf\" (UniqueName: \"kubernetes.io/projected/8d24da8d-e364-4036-b1eb-d369585645e3-kube-api-access-fqrzf\") pod \"limitador-limitador-64c8f475fb-2j79x\" (UID: \"8d24da8d-e364-4036-b1eb-d369585645e3\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-2j79x"
Apr 16 20:21:05.282962 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:05.282937 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-2j79x"
Apr 16 20:21:05.410507 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:05.410475 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-2j79x"]
Apr 16 20:21:05.413457 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:21:05.413425 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d24da8d_e364_4036_b1eb_d369585645e3.slice/crio-a819487602f97511bc44379639fc72327145c5b93e0af7b541aff705175b9789 WatchSource:0}: Error finding container a819487602f97511bc44379639fc72327145c5b93e0af7b541aff705175b9789: Status 404 returned error can't find the container with id a819487602f97511bc44379639fc72327145c5b93e0af7b541aff705175b9789
Apr 16 20:21:05.958568 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:05.958534 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-2j79x" event={"ID":"8d24da8d-e364-4036-b1eb-d369585645e3","Type":"ContainerStarted","Data":"a819487602f97511bc44379639fc72327145c5b93e0af7b541aff705175b9789"}
Apr 16 20:21:09.974929 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:09.974842 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-2j79x" event={"ID":"8d24da8d-e364-4036-b1eb-d369585645e3","Type":"ContainerStarted","Data":"b7008bd563779e40d35b41216e9f859ecd70ed5015c9476cbe4f7cc6a8dbee39"}
Apr 16 20:21:09.975249 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:09.974981 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-2j79x"
Apr 16 20:21:09.992599 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:09.992544 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-64c8f475fb-2j79x" podStartSLOduration=1.695712291 podStartE2EDuration="5.992527014s" podCreationTimestamp="2026-04-16 20:21:04 +0000 UTC" firstStartedPulling="2026-04-16 20:21:05.41525871 +0000 UTC m=+556.852247210" lastFinishedPulling="2026-04-16 20:21:09.712073444 +0000 UTC m=+561.149061933" observedRunningTime="2026-04-16 20:21:09.991418613 +0000 UTC m=+561.428407121" watchObservedRunningTime="2026-04-16 20:21:09.992527014 +0000 UTC m=+561.429515522"
Apr 16 20:21:20.938723 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:20.938687 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-2j79x"]
Apr 16 20:21:20.939147 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:20.938938 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-64c8f475fb-2j79x" podUID="8d24da8d-e364-4036-b1eb-d369585645e3" containerName="limitador" containerID="cri-o://b7008bd563779e40d35b41216e9f859ecd70ed5015c9476cbe4f7cc6a8dbee39" gracePeriod=30
Apr 16 20:21:20.939653 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:20.939590 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-2j79x"
Apr 16 20:21:21.486640 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:21.486618 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-2j79x"
Apr 16 20:21:21.609945 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:21.609846 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/8d24da8d-e364-4036-b1eb-d369585645e3-config-file\") pod \"8d24da8d-e364-4036-b1eb-d369585645e3\" (UID: \"8d24da8d-e364-4036-b1eb-d369585645e3\") "
Apr 16 20:21:21.610093 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:21.610043 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqrzf\" (UniqueName: \"kubernetes.io/projected/8d24da8d-e364-4036-b1eb-d369585645e3-kube-api-access-fqrzf\") pod \"8d24da8d-e364-4036-b1eb-d369585645e3\" (UID: \"8d24da8d-e364-4036-b1eb-d369585645e3\") "
Apr 16 20:21:21.610262 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:21.610239 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d24da8d-e364-4036-b1eb-d369585645e3-config-file" (OuterVolumeSpecName: "config-file") pod "8d24da8d-e364-4036-b1eb-d369585645e3" (UID: "8d24da8d-e364-4036-b1eb-d369585645e3"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:21:21.610324 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:21.610312 2572 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/8d24da8d-e364-4036-b1eb-d369585645e3-config-file\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:21:21.612127 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:21.612104 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d24da8d-e364-4036-b1eb-d369585645e3-kube-api-access-fqrzf" (OuterVolumeSpecName: "kube-api-access-fqrzf") pod "8d24da8d-e364-4036-b1eb-d369585645e3" (UID: "8d24da8d-e364-4036-b1eb-d369585645e3"). InnerVolumeSpecName "kube-api-access-fqrzf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:21:21.711561 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:21.711530 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fqrzf\" (UniqueName: \"kubernetes.io/projected/8d24da8d-e364-4036-b1eb-d369585645e3-kube-api-access-fqrzf\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:21:22.013971 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:22.013934 2572 generic.go:358] "Generic (PLEG): container finished" podID="8d24da8d-e364-4036-b1eb-d369585645e3" containerID="b7008bd563779e40d35b41216e9f859ecd70ed5015c9476cbe4f7cc6a8dbee39" exitCode=0
Apr 16 20:21:22.014413 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:22.013997 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-2j79x"
Apr 16 20:21:22.014413 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:22.014008 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-2j79x" event={"ID":"8d24da8d-e364-4036-b1eb-d369585645e3","Type":"ContainerDied","Data":"b7008bd563779e40d35b41216e9f859ecd70ed5015c9476cbe4f7cc6a8dbee39"}
Apr 16 20:21:22.014413 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:22.014054 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-2j79x" event={"ID":"8d24da8d-e364-4036-b1eb-d369585645e3","Type":"ContainerDied","Data":"a819487602f97511bc44379639fc72327145c5b93e0af7b541aff705175b9789"}
Apr 16 20:21:22.014413 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:22.014080 2572 scope.go:117] "RemoveContainer" containerID="b7008bd563779e40d35b41216e9f859ecd70ed5015c9476cbe4f7cc6a8dbee39"
Apr 16 20:21:22.022896 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:22.022854 2572 scope.go:117] "RemoveContainer" containerID="b7008bd563779e40d35b41216e9f859ecd70ed5015c9476cbe4f7cc6a8dbee39"
Apr 16 20:21:22.023159 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:21:22.023137 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7008bd563779e40d35b41216e9f859ecd70ed5015c9476cbe4f7cc6a8dbee39\": container with ID starting with b7008bd563779e40d35b41216e9f859ecd70ed5015c9476cbe4f7cc6a8dbee39 not found: ID does not exist" containerID="b7008bd563779e40d35b41216e9f859ecd70ed5015c9476cbe4f7cc6a8dbee39"
Apr 16 20:21:22.023221 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:22.023168 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7008bd563779e40d35b41216e9f859ecd70ed5015c9476cbe4f7cc6a8dbee39"} err="failed to get container status \"b7008bd563779e40d35b41216e9f859ecd70ed5015c9476cbe4f7cc6a8dbee39\": rpc error: code = NotFound desc = could not find container \"b7008bd563779e40d35b41216e9f859ecd70ed5015c9476cbe4f7cc6a8dbee39\": container with ID starting with b7008bd563779e40d35b41216e9f859ecd70ed5015c9476cbe4f7cc6a8dbee39 not found: ID does not exist"
Apr 16 20:21:22.034854 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:22.034832 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-2j79x"]
Apr 16 20:21:22.037221 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:22.037200 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-2j79x"]
Apr 16 20:21:23.149023 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:23.148989 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d24da8d-e364-4036-b1eb-d369585645e3" path="/var/lib/kubelet/pods/8d24da8d-e364-4036-b1eb-d369585645e3/volumes"
Apr 16 20:21:49.101496 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:49.100989 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9gf47_10232a70-9b5f-414a-8efd-b5cff05a4f12/ovn-acl-logging/0.log"
Apr 16 20:21:49.106149 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:49.106128 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9gf47_10232a70-9b5f-414a-8efd-b5cff05a4f12/ovn-acl-logging/0.log"
Apr 16 20:21:49.656091 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:49.656057 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-6f8c758999-z8t8q"]
Apr 16 20:21:49.656531 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:49.656517 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d24da8d-e364-4036-b1eb-d369585645e3" containerName="limitador"
Apr 16 20:21:49.656578 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:49.656534 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d24da8d-e364-4036-b1eb-d369585645e3" containerName="limitador"
Apr 16 20:21:49.656639 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:49.656628 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="8d24da8d-e364-4036-b1eb-d369585645e3" containerName="limitador"
Apr 16 20:21:49.661429 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:49.661407 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-6f8c758999-z8t8q"
Apr 16 20:21:49.663741 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:49.663719 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\""
Apr 16 20:21:49.664019 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:49.664002 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-jlfzw\""
Apr 16 20:21:49.664894 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:49.664768 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 16 20:21:49.665031 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:49.665011 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 16 20:21:49.667187 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:49.667162 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-6f8c758999-z8t8q"]
Apr 16 20:21:49.682935 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:49.682909 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-s6jzs"]
Apr 16 20:21:49.686437 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:49.686418 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-s6jzs"
Apr 16 20:21:49.688483 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:49.688463 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 16 20:21:49.688665 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:49.688651 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-kgz9p\""
Apr 16 20:21:49.696670 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:49.696646 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-s6jzs"]
Apr 16 20:21:49.732880 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:49.732847 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1f254ec8-2398-4afe-a00a-67591c7a1251-data\") pod \"seaweedfs-86cc847c5c-s6jzs\" (UID: \"1f254ec8-2398-4afe-a00a-67591c7a1251\") " pod="kserve/seaweedfs-86cc847c5c-s6jzs"
Apr 16 20:21:49.733034 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:49.732915 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w52q\" (UniqueName: \"kubernetes.io/projected/b9555665-89e1-4156-be4c-187e0712741e-kube-api-access-5w52q\") pod \"llmisvc-controller-manager-6f8c758999-z8t8q\" (UID: \"b9555665-89e1-4156-be4c-187e0712741e\") " pod="kserve/llmisvc-controller-manager-6f8c758999-z8t8q"
Apr 16 20:21:49.733034 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:49.732958 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9555665-89e1-4156-be4c-187e0712741e-cert\") pod \"llmisvc-controller-manager-6f8c758999-z8t8q\" (UID: \"b9555665-89e1-4156-be4c-187e0712741e\") " pod="kserve/llmisvc-controller-manager-6f8c758999-z8t8q"
Apr 16 20:21:49.733034 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:49.732995 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85tx4\" (UniqueName: \"kubernetes.io/projected/1f254ec8-2398-4afe-a00a-67591c7a1251-kube-api-access-85tx4\") pod \"seaweedfs-86cc847c5c-s6jzs\" (UID: \"1f254ec8-2398-4afe-a00a-67591c7a1251\") " pod="kserve/seaweedfs-86cc847c5c-s6jzs"
Apr 16 20:21:49.834309 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:49.834276 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1f254ec8-2398-4afe-a00a-67591c7a1251-data\") pod \"seaweedfs-86cc847c5c-s6jzs\" (UID: \"1f254ec8-2398-4afe-a00a-67591c7a1251\") " pod="kserve/seaweedfs-86cc847c5c-s6jzs"
Apr 16 20:21:49.834309 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:49.834308 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5w52q\" (UniqueName: \"kubernetes.io/projected/b9555665-89e1-4156-be4c-187e0712741e-kube-api-access-5w52q\") pod \"llmisvc-controller-manager-6f8c758999-z8t8q\" (UID: \"b9555665-89e1-4156-be4c-187e0712741e\") " pod="kserve/llmisvc-controller-manager-6f8c758999-z8t8q"
Apr 16 20:21:49.834530 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:49.834334 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9555665-89e1-4156-be4c-187e0712741e-cert\") pod \"llmisvc-controller-manager-6f8c758999-z8t8q\" (UID: \"b9555665-89e1-4156-be4c-187e0712741e\") " pod="kserve/llmisvc-controller-manager-6f8c758999-z8t8q"
Apr 16 20:21:49.834530 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:49.834364 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-85tx4\" (UniqueName: \"kubernetes.io/projected/1f254ec8-2398-4afe-a00a-67591c7a1251-kube-api-access-85tx4\") pod \"seaweedfs-86cc847c5c-s6jzs\" (UID: \"1f254ec8-2398-4afe-a00a-67591c7a1251\") " pod="kserve/seaweedfs-86cc847c5c-s6jzs"
Apr 16 20:21:49.834695 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:49.834672 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1f254ec8-2398-4afe-a00a-67591c7a1251-data\") pod \"seaweedfs-86cc847c5c-s6jzs\" (UID: \"1f254ec8-2398-4afe-a00a-67591c7a1251\") " pod="kserve/seaweedfs-86cc847c5c-s6jzs"
Apr 16 20:21:49.836707 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:49.836688 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9555665-89e1-4156-be4c-187e0712741e-cert\") pod \"llmisvc-controller-manager-6f8c758999-z8t8q\" (UID: \"b9555665-89e1-4156-be4c-187e0712741e\") " pod="kserve/llmisvc-controller-manager-6f8c758999-z8t8q"
Apr 16 20:21:49.842744 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:49.842723 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w52q\" (UniqueName: \"kubernetes.io/projected/b9555665-89e1-4156-be4c-187e0712741e-kube-api-access-5w52q\") pod \"llmisvc-controller-manager-6f8c758999-z8t8q\" (UID: \"b9555665-89e1-4156-be4c-187e0712741e\") " pod="kserve/llmisvc-controller-manager-6f8c758999-z8t8q"
Apr 16 20:21:49.842894 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:49.842795 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-85tx4\" (UniqueName: \"kubernetes.io/projected/1f254ec8-2398-4afe-a00a-67591c7a1251-kube-api-access-85tx4\") pod \"seaweedfs-86cc847c5c-s6jzs\" (UID: \"1f254ec8-2398-4afe-a00a-67591c7a1251\") " pod="kserve/seaweedfs-86cc847c5c-s6jzs"
Apr 16 20:21:49.973336 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:49.973260 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-6f8c758999-z8t8q"
Apr 16 20:21:49.997051 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:49.997025 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-s6jzs"
Apr 16 20:21:50.108748 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:50.108723 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-6f8c758999-z8t8q"]
Apr 16 20:21:50.111072 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:21:50.111022 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb9555665_89e1_4156_be4c_187e0712741e.slice/crio-201044aeaf9c8b66f88a4940817b77657d9bee52dd1eae5fbd320f09755d718e WatchSource:0}: Error finding container 201044aeaf9c8b66f88a4940817b77657d9bee52dd1eae5fbd320f09755d718e: Status 404 returned error can't find the container with id 201044aeaf9c8b66f88a4940817b77657d9bee52dd1eae5fbd320f09755d718e
Apr 16 20:21:50.129926 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:50.129904 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-s6jzs"]
Apr 16 20:21:50.131808 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:21:50.131778 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f254ec8_2398_4afe_a00a_67591c7a1251.slice/crio-f9650e01102c02cbffc27d0722f096bf587552ba28e226bf6b9e906436ebf519 WatchSource:0}: Error finding container f9650e01102c02cbffc27d0722f096bf587552ba28e226bf6b9e906436ebf519: Status 404 returned error can't find the container with id f9650e01102c02cbffc27d0722f096bf587552ba28e226bf6b9e906436ebf519
Apr 16 20:21:51.111146 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:51.111104 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-6f8c758999-z8t8q"
event={"ID":"b9555665-89e1-4156-be4c-187e0712741e","Type":"ContainerStarted","Data":"201044aeaf9c8b66f88a4940817b77657d9bee52dd1eae5fbd320f09755d718e"} Apr 16 20:21:51.112493 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:51.112462 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-s6jzs" event={"ID":"1f254ec8-2398-4afe-a00a-67591c7a1251","Type":"ContainerStarted","Data":"f9650e01102c02cbffc27d0722f096bf587552ba28e226bf6b9e906436ebf519"} Apr 16 20:21:53.121583 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:53.121550 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-s6jzs" event={"ID":"1f254ec8-2398-4afe-a00a-67591c7a1251","Type":"ContainerStarted","Data":"8bcb1c14bf77e5019707e43f185712fe70161c604db3548c81c76d21c9af3eb1"} Apr 16 20:21:53.122040 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:53.121682 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-s6jzs" Apr 16 20:21:53.137003 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:53.136950 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-s6jzs" podStartSLOduration=1.5369131280000001 podStartE2EDuration="4.136930597s" podCreationTimestamp="2026-04-16 20:21:49 +0000 UTC" firstStartedPulling="2026-04-16 20:21:50.133120636 +0000 UTC m=+601.570109123" lastFinishedPulling="2026-04-16 20:21:52.733138103 +0000 UTC m=+604.170126592" observedRunningTime="2026-04-16 20:21:53.135741497 +0000 UTC m=+604.572730004" watchObservedRunningTime="2026-04-16 20:21:53.136930597 +0000 UTC m=+604.573919105" Apr 16 20:21:54.126638 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:54.126600 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-6f8c758999-z8t8q" event={"ID":"b9555665-89e1-4156-be4c-187e0712741e","Type":"ContainerStarted","Data":"1a51da3ecf8a5c9097f993af75c6b4eaa230e07cf7df93713d68f525d12987d3"} Apr 
16 20:21:54.141706 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:54.141650 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-6f8c758999-z8t8q" podStartSLOduration=1.616671079 podStartE2EDuration="5.141633182s" podCreationTimestamp="2026-04-16 20:21:49 +0000 UTC" firstStartedPulling="2026-04-16 20:21:50.112390504 +0000 UTC m=+601.549378990" lastFinishedPulling="2026-04-16 20:21:53.637352597 +0000 UTC m=+605.074341093" observedRunningTime="2026-04-16 20:21:54.140838128 +0000 UTC m=+605.577826630" watchObservedRunningTime="2026-04-16 20:21:54.141633182 +0000 UTC m=+605.578621691" Apr 16 20:21:55.130269 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:55.130232 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-6f8c758999-z8t8q" Apr 16 20:21:59.128699 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:21:59.128620 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-s6jzs" Apr 16 20:22:26.135189 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:22:26.135155 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-6f8c758999-z8t8q" Apr 16 20:23:00.776260 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:00.776220 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-p4pqc"] Apr 16 20:23:00.779447 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:00.779427 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-p4pqc" Apr 16 20:23:00.781750 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:00.781725 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-knpcs\"" Apr 16 20:23:00.781863 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:00.781770 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 16 20:23:00.789778 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:00.789755 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-p4pqc"] Apr 16 20:23:00.829102 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:00.829075 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/12141ef2-0d8f-4ea5-aa93-d6464fa4059f-tls-certs\") pod \"model-serving-api-86f7b4b499-p4pqc\" (UID: \"12141ef2-0d8f-4ea5-aa93-d6464fa4059f\") " pod="kserve/model-serving-api-86f7b4b499-p4pqc" Apr 16 20:23:00.829230 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:00.829108 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98srk\" (UniqueName: \"kubernetes.io/projected/12141ef2-0d8f-4ea5-aa93-d6464fa4059f-kube-api-access-98srk\") pod \"model-serving-api-86f7b4b499-p4pqc\" (UID: \"12141ef2-0d8f-4ea5-aa93-d6464fa4059f\") " pod="kserve/model-serving-api-86f7b4b499-p4pqc" Apr 16 20:23:00.930397 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:00.930368 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/12141ef2-0d8f-4ea5-aa93-d6464fa4059f-tls-certs\") pod \"model-serving-api-86f7b4b499-p4pqc\" (UID: \"12141ef2-0d8f-4ea5-aa93-d6464fa4059f\") " pod="kserve/model-serving-api-86f7b4b499-p4pqc" Apr 16 20:23:00.930397 ip-10-0-138-62 kubenswrapper[2572]: 
I0416 20:23:00.930405 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-98srk\" (UniqueName: \"kubernetes.io/projected/12141ef2-0d8f-4ea5-aa93-d6464fa4059f-kube-api-access-98srk\") pod \"model-serving-api-86f7b4b499-p4pqc\" (UID: \"12141ef2-0d8f-4ea5-aa93-d6464fa4059f\") " pod="kserve/model-serving-api-86f7b4b499-p4pqc" Apr 16 20:23:00.932676 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:00.932653 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/12141ef2-0d8f-4ea5-aa93-d6464fa4059f-tls-certs\") pod \"model-serving-api-86f7b4b499-p4pqc\" (UID: \"12141ef2-0d8f-4ea5-aa93-d6464fa4059f\") " pod="kserve/model-serving-api-86f7b4b499-p4pqc" Apr 16 20:23:00.938431 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:00.938404 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-98srk\" (UniqueName: \"kubernetes.io/projected/12141ef2-0d8f-4ea5-aa93-d6464fa4059f-kube-api-access-98srk\") pod \"model-serving-api-86f7b4b499-p4pqc\" (UID: \"12141ef2-0d8f-4ea5-aa93-d6464fa4059f\") " pod="kserve/model-serving-api-86f7b4b499-p4pqc" Apr 16 20:23:01.093005 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:01.092918 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-p4pqc" Apr 16 20:23:01.212118 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:01.212086 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-p4pqc"] Apr 16 20:23:01.213477 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:23:01.213441 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12141ef2_0d8f_4ea5_aa93_d6464fa4059f.slice/crio-87553320e8fde9c207b69b0555178d0384371c3bdbcac57f71f3d4c594cc8a8b WatchSource:0}: Error finding container 87553320e8fde9c207b69b0555178d0384371c3bdbcac57f71f3d4c594cc8a8b: Status 404 returned error can't find the container with id 87553320e8fde9c207b69b0555178d0384371c3bdbcac57f71f3d4c594cc8a8b Apr 16 20:23:01.339650 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:01.339613 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-p4pqc" event={"ID":"12141ef2-0d8f-4ea5-aa93-d6464fa4059f","Type":"ContainerStarted","Data":"87553320e8fde9c207b69b0555178d0384371c3bdbcac57f71f3d4c594cc8a8b"} Apr 16 20:23:04.351656 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:04.351617 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-p4pqc" event={"ID":"12141ef2-0d8f-4ea5-aa93-d6464fa4059f","Type":"ContainerStarted","Data":"4e033b28a3dcf50f0e0ac02826505be53d16ccddd88efa7cf78c84953e1fe550"} Apr 16 20:23:04.352084 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:04.351737 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-p4pqc" Apr 16 20:23:04.368291 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:04.368242 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-p4pqc" podStartSLOduration=2.11036893 podStartE2EDuration="4.368230914s" podCreationTimestamp="2026-04-16 
20:23:00 +0000 UTC" firstStartedPulling="2026-04-16 20:23:01.221321379 +0000 UTC m=+672.658309882" lastFinishedPulling="2026-04-16 20:23:03.479183376 +0000 UTC m=+674.916171866" observedRunningTime="2026-04-16 20:23:04.367124023 +0000 UTC m=+675.804112531" watchObservedRunningTime="2026-04-16 20:23:04.368230914 +0000 UTC m=+675.805219443" Apr 16 20:23:15.358798 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:15.358770 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-p4pqc" Apr 16 20:23:15.821084 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:15.821049 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-mfsr7"] Apr 16 20:23:15.824453 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:15.824436 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-mfsr7" Apr 16 20:23:15.831527 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:15.831189 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-mfsr7"] Apr 16 20:23:15.968312 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:15.968274 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29mhr\" (UniqueName: \"kubernetes.io/projected/59ac456f-4f01-491a-8b3a-4a12c727c4df-kube-api-access-29mhr\") pod \"s3-init-mfsr7\" (UID: \"59ac456f-4f01-491a-8b3a-4a12c727c4df\") " pod="kserve/s3-init-mfsr7" Apr 16 20:23:16.069479 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:16.069440 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-29mhr\" (UniqueName: \"kubernetes.io/projected/59ac456f-4f01-491a-8b3a-4a12c727c4df-kube-api-access-29mhr\") pod \"s3-init-mfsr7\" (UID: \"59ac456f-4f01-491a-8b3a-4a12c727c4df\") " pod="kserve/s3-init-mfsr7" Apr 16 20:23:16.077165 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:16.077104 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-29mhr\" (UniqueName: \"kubernetes.io/projected/59ac456f-4f01-491a-8b3a-4a12c727c4df-kube-api-access-29mhr\") pod \"s3-init-mfsr7\" (UID: \"59ac456f-4f01-491a-8b3a-4a12c727c4df\") " pod="kserve/s3-init-mfsr7" Apr 16 20:23:16.134666 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:16.134644 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-mfsr7" Apr 16 20:23:16.248370 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:16.248340 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-mfsr7"] Apr 16 20:23:16.252086 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:23:16.252054 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59ac456f_4f01_491a_8b3a_4a12c727c4df.slice/crio-16de7219deb5642e5695beddbad01afeda03d093f3ab040d1a65fbf57038e713 WatchSource:0}: Error finding container 16de7219deb5642e5695beddbad01afeda03d093f3ab040d1a65fbf57038e713: Status 404 returned error can't find the container with id 16de7219deb5642e5695beddbad01afeda03d093f3ab040d1a65fbf57038e713 Apr 16 20:23:16.393011 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:16.392935 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-mfsr7" event={"ID":"59ac456f-4f01-491a-8b3a-4a12c727c4df","Type":"ContainerStarted","Data":"16de7219deb5642e5695beddbad01afeda03d093f3ab040d1a65fbf57038e713"} Apr 16 20:23:21.413089 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:21.413050 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-mfsr7" event={"ID":"59ac456f-4f01-491a-8b3a-4a12c727c4df","Type":"ContainerStarted","Data":"2c64d9426b6271601e5d842535152f697ecf2c01c69a0a56e7f9bdb8efbf8aba"} Apr 16 20:23:21.428257 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:21.428186 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-mfsr7" podStartSLOduration=1.969335972 
podStartE2EDuration="6.42816396s" podCreationTimestamp="2026-04-16 20:23:15 +0000 UTC" firstStartedPulling="2026-04-16 20:23:16.25392724 +0000 UTC m=+687.690915726" lastFinishedPulling="2026-04-16 20:23:20.712755228 +0000 UTC m=+692.149743714" observedRunningTime="2026-04-16 20:23:21.428041253 +0000 UTC m=+692.865029760" watchObservedRunningTime="2026-04-16 20:23:21.42816396 +0000 UTC m=+692.865152469" Apr 16 20:23:24.425096 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:24.425057 2572 generic.go:358] "Generic (PLEG): container finished" podID="59ac456f-4f01-491a-8b3a-4a12c727c4df" containerID="2c64d9426b6271601e5d842535152f697ecf2c01c69a0a56e7f9bdb8efbf8aba" exitCode=0 Apr 16 20:23:24.425423 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:24.425133 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-mfsr7" event={"ID":"59ac456f-4f01-491a-8b3a-4a12c727c4df","Type":"ContainerDied","Data":"2c64d9426b6271601e5d842535152f697ecf2c01c69a0a56e7f9bdb8efbf8aba"} Apr 16 20:23:25.562052 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:25.562024 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-mfsr7" Apr 16 20:23:25.660375 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:25.660341 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29mhr\" (UniqueName: \"kubernetes.io/projected/59ac456f-4f01-491a-8b3a-4a12c727c4df-kube-api-access-29mhr\") pod \"59ac456f-4f01-491a-8b3a-4a12c727c4df\" (UID: \"59ac456f-4f01-491a-8b3a-4a12c727c4df\") " Apr 16 20:23:25.662475 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:25.662439 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59ac456f-4f01-491a-8b3a-4a12c727c4df-kube-api-access-29mhr" (OuterVolumeSpecName: "kube-api-access-29mhr") pod "59ac456f-4f01-491a-8b3a-4a12c727c4df" (UID: "59ac456f-4f01-491a-8b3a-4a12c727c4df"). InnerVolumeSpecName "kube-api-access-29mhr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:23:25.761822 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:25.761791 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-29mhr\" (UniqueName: \"kubernetes.io/projected/59ac456f-4f01-491a-8b3a-4a12c727c4df-kube-api-access-29mhr\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:23:26.432527 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:26.432501 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-mfsr7" Apr 16 20:23:26.432686 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:26.432530 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-mfsr7" event={"ID":"59ac456f-4f01-491a-8b3a-4a12c727c4df","Type":"ContainerDied","Data":"16de7219deb5642e5695beddbad01afeda03d093f3ab040d1a65fbf57038e713"} Apr 16 20:23:26.432686 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:26.432558 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16de7219deb5642e5695beddbad01afeda03d093f3ab040d1a65fbf57038e713" Apr 16 20:23:39.612248 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:39.612214 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb"] Apr 16 20:23:39.612598 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:39.612587 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59ac456f-4f01-491a-8b3a-4a12c727c4df" containerName="s3-init" Apr 16 20:23:39.612635 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:39.612599 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="59ac456f-4f01-491a-8b3a-4a12c727c4df" containerName="s3-init" Apr 16 20:23:39.612669 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:39.612663 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="59ac456f-4f01-491a-8b3a-4a12c727c4df" containerName="s3-init" Apr 16 20:23:39.620355 
ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:39.620332 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb" Apr 16 20:23:39.622766 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:39.622737 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 20:23:39.622919 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:39.622746 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 20:23:39.623455 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:39.623438 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-s7skk\"" Apr 16 20:23:39.623535 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:39.623461 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\"" Apr 16 20:23:39.626657 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:39.626634 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb"] Apr 16 20:23:39.677205 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:39.677178 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65z5v\" (UniqueName: \"kubernetes.io/projected/66414c6b-03de-458f-8fad-eabef52fc0e2-kube-api-access-65z5v\") pod \"scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb\" (UID: \"66414c6b-03de-458f-8fad-eabef52fc0e2\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb" Apr 16 20:23:39.677325 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:39.677215 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" 
(UniqueName: \"kubernetes.io/empty-dir/66414c6b-03de-458f-8fad-eabef52fc0e2-model-cache\") pod \"scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb\" (UID: \"66414c6b-03de-458f-8fad-eabef52fc0e2\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb" Apr 16 20:23:39.677325 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:39.677243 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/66414c6b-03de-458f-8fad-eabef52fc0e2-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb\" (UID: \"66414c6b-03de-458f-8fad-eabef52fc0e2\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb" Apr 16 20:23:39.677325 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:39.677318 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/66414c6b-03de-458f-8fad-eabef52fc0e2-dshm\") pod \"scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb\" (UID: \"66414c6b-03de-458f-8fad-eabef52fc0e2\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb" Apr 16 20:23:39.677464 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:39.677344 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/66414c6b-03de-458f-8fad-eabef52fc0e2-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb\" (UID: \"66414c6b-03de-458f-8fad-eabef52fc0e2\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb" Apr 16 20:23:39.677464 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:39.677370 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/66414c6b-03de-458f-8fad-eabef52fc0e2-home\") pod 
\"scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb\" (UID: \"66414c6b-03de-458f-8fad-eabef52fc0e2\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb" Apr 16 20:23:39.778395 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:39.778363 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/66414c6b-03de-458f-8fad-eabef52fc0e2-model-cache\") pod \"scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb\" (UID: \"66414c6b-03de-458f-8fad-eabef52fc0e2\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb" Apr 16 20:23:39.778521 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:39.778402 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/66414c6b-03de-458f-8fad-eabef52fc0e2-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb\" (UID: \"66414c6b-03de-458f-8fad-eabef52fc0e2\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb" Apr 16 20:23:39.778521 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:39.778434 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/66414c6b-03de-458f-8fad-eabef52fc0e2-dshm\") pod \"scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb\" (UID: \"66414c6b-03de-458f-8fad-eabef52fc0e2\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb" Apr 16 20:23:39.778521 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:39.778470 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/66414c6b-03de-458f-8fad-eabef52fc0e2-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb\" (UID: \"66414c6b-03de-458f-8fad-eabef52fc0e2\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb" Apr 16 20:23:39.778521 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:39.778501 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/66414c6b-03de-458f-8fad-eabef52fc0e2-home\") pod \"scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb\" (UID: \"66414c6b-03de-458f-8fad-eabef52fc0e2\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb" Apr 16 20:23:39.778696 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:39.778549 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-65z5v\" (UniqueName: \"kubernetes.io/projected/66414c6b-03de-458f-8fad-eabef52fc0e2-kube-api-access-65z5v\") pod \"scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb\" (UID: \"66414c6b-03de-458f-8fad-eabef52fc0e2\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb" Apr 16 20:23:39.778927 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:39.778906 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/66414c6b-03de-458f-8fad-eabef52fc0e2-model-cache\") pod \"scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb\" (UID: \"66414c6b-03de-458f-8fad-eabef52fc0e2\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb" Apr 16 20:23:39.779020 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:39.778944 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/66414c6b-03de-458f-8fad-eabef52fc0e2-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb\" (UID: \"66414c6b-03de-458f-8fad-eabef52fc0e2\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb" Apr 16 20:23:39.779020 ip-10-0-138-62 kubenswrapper[2572]: I0416 
20:23:39.778977 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/66414c6b-03de-458f-8fad-eabef52fc0e2-home\") pod \"scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb\" (UID: \"66414c6b-03de-458f-8fad-eabef52fc0e2\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb"
Apr 16 20:23:39.780700 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:39.780678 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/66414c6b-03de-458f-8fad-eabef52fc0e2-dshm\") pod \"scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb\" (UID: \"66414c6b-03de-458f-8fad-eabef52fc0e2\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb"
Apr 16 20:23:39.780985 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:39.780966 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/66414c6b-03de-458f-8fad-eabef52fc0e2-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb\" (UID: \"66414c6b-03de-458f-8fad-eabef52fc0e2\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb"
Apr 16 20:23:39.786912 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:39.786887 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-65z5v\" (UniqueName: \"kubernetes.io/projected/66414c6b-03de-458f-8fad-eabef52fc0e2-kube-api-access-65z5v\") pod \"scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb\" (UID: \"66414c6b-03de-458f-8fad-eabef52fc0e2\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb"
Apr 16 20:23:39.931959 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:39.931919 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb"
Apr 16 20:23:40.068478 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:40.068434 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb"]
Apr 16 20:23:40.487978 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:40.487936 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb" event={"ID":"66414c6b-03de-458f-8fad-eabef52fc0e2","Type":"ContainerStarted","Data":"419893e4ff9b2086b16787a1264112a25948b2d9bcbccdc908c57953c170dcb0"}
Apr 16 20:23:43.499107 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:43.499071 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb" event={"ID":"66414c6b-03de-458f-8fad-eabef52fc0e2","Type":"ContainerStarted","Data":"e85f16eb7f75f13e76fbc4efc09072db3e6bf40984fc0520bbfbdb5c7a0dab37"}
Apr 16 20:23:48.515778 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:48.515740 2572 generic.go:358] "Generic (PLEG): container finished" podID="66414c6b-03de-458f-8fad-eabef52fc0e2" containerID="e85f16eb7f75f13e76fbc4efc09072db3e6bf40984fc0520bbfbdb5c7a0dab37" exitCode=0
Apr 16 20:23:48.516176 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:48.515817 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb" event={"ID":"66414c6b-03de-458f-8fad-eabef52fc0e2","Type":"ContainerDied","Data":"e85f16eb7f75f13e76fbc4efc09072db3e6bf40984fc0520bbfbdb5c7a0dab37"}
Apr 16 20:23:50.524861 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:50.524823 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb" event={"ID":"66414c6b-03de-458f-8fad-eabef52fc0e2","Type":"ContainerStarted","Data":"016f097fcd2b1caf21f3ac972eb199b12f056e7da83fb94f4c42190360df1d17"}
Apr 16 20:23:50.543285 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:50.543240 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb" podStartSLOduration=1.915074609 podStartE2EDuration="11.543227034s" podCreationTimestamp="2026-04-16 20:23:39 +0000 UTC" firstStartedPulling="2026-04-16 20:23:40.081742895 +0000 UTC m=+711.518731381" lastFinishedPulling="2026-04-16 20:23:49.709895316 +0000 UTC m=+721.146883806" observedRunningTime="2026-04-16 20:23:50.540904904 +0000 UTC m=+721.977893407" watchObservedRunningTime="2026-04-16 20:23:50.543227034 +0000 UTC m=+721.980215542"
Apr 16 20:23:59.932502 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:59.932468 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb"
Apr 16 20:23:59.932915 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:59.932512 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb"
Apr 16 20:23:59.945192 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:23:59.945165 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb"
Apr 16 20:24:00.567608 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:00.567574 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb"
Apr 16 20:24:31.805640 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:31.805605 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb"]
Apr 16 20:24:31.806087 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:31.806022 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb" podUID="66414c6b-03de-458f-8fad-eabef52fc0e2" containerName="main" containerID="cri-o://016f097fcd2b1caf21f3ac972eb199b12f056e7da83fb94f4c42190360df1d17" gracePeriod=30
Apr 16 20:24:32.063674 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:32.063613 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb"
Apr 16 20:24:32.144712 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:32.144678 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/66414c6b-03de-458f-8fad-eabef52fc0e2-dshm\") pod \"66414c6b-03de-458f-8fad-eabef52fc0e2\" (UID: \"66414c6b-03de-458f-8fad-eabef52fc0e2\") "
Apr 16 20:24:32.144916 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:32.144749 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/66414c6b-03de-458f-8fad-eabef52fc0e2-kserve-provision-location\") pod \"66414c6b-03de-458f-8fad-eabef52fc0e2\" (UID: \"66414c6b-03de-458f-8fad-eabef52fc0e2\") "
Apr 16 20:24:32.144916 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:32.144781 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/66414c6b-03de-458f-8fad-eabef52fc0e2-tls-certs\") pod \"66414c6b-03de-458f-8fad-eabef52fc0e2\" (UID: \"66414c6b-03de-458f-8fad-eabef52fc0e2\") "
Apr 16 20:24:32.144916 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:32.144816 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65z5v\" (UniqueName: \"kubernetes.io/projected/66414c6b-03de-458f-8fad-eabef52fc0e2-kube-api-access-65z5v\") pod \"66414c6b-03de-458f-8fad-eabef52fc0e2\" (UID: \"66414c6b-03de-458f-8fad-eabef52fc0e2\") "
Apr 16 20:24:32.144916 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:32.144833 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/66414c6b-03de-458f-8fad-eabef52fc0e2-home\") pod \"66414c6b-03de-458f-8fad-eabef52fc0e2\" (UID: \"66414c6b-03de-458f-8fad-eabef52fc0e2\") "
Apr 16 20:24:32.144916 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:32.144899 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/66414c6b-03de-458f-8fad-eabef52fc0e2-model-cache\") pod \"66414c6b-03de-458f-8fad-eabef52fc0e2\" (UID: \"66414c6b-03de-458f-8fad-eabef52fc0e2\") "
Apr 16 20:24:32.145205 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:32.145095 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66414c6b-03de-458f-8fad-eabef52fc0e2-home" (OuterVolumeSpecName: "home") pod "66414c6b-03de-458f-8fad-eabef52fc0e2" (UID: "66414c6b-03de-458f-8fad-eabef52fc0e2"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:24:32.145343 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:32.145315 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66414c6b-03de-458f-8fad-eabef52fc0e2-model-cache" (OuterVolumeSpecName: "model-cache") pod "66414c6b-03de-458f-8fad-eabef52fc0e2" (UID: "66414c6b-03de-458f-8fad-eabef52fc0e2"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:24:32.147149 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:32.147092 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66414c6b-03de-458f-8fad-eabef52fc0e2-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "66414c6b-03de-458f-8fad-eabef52fc0e2" (UID: "66414c6b-03de-458f-8fad-eabef52fc0e2"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:24:32.147149 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:32.147127 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66414c6b-03de-458f-8fad-eabef52fc0e2-dshm" (OuterVolumeSpecName: "dshm") pod "66414c6b-03de-458f-8fad-eabef52fc0e2" (UID: "66414c6b-03de-458f-8fad-eabef52fc0e2"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:24:32.147315 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:32.147211 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66414c6b-03de-458f-8fad-eabef52fc0e2-kube-api-access-65z5v" (OuterVolumeSpecName: "kube-api-access-65z5v") pod "66414c6b-03de-458f-8fad-eabef52fc0e2" (UID: "66414c6b-03de-458f-8fad-eabef52fc0e2"). InnerVolumeSpecName "kube-api-access-65z5v". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:24:32.204365 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:32.204319 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66414c6b-03de-458f-8fad-eabef52fc0e2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "66414c6b-03de-458f-8fad-eabef52fc0e2" (UID: "66414c6b-03de-458f-8fad-eabef52fc0e2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:24:32.246108 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:32.246074 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/66414c6b-03de-458f-8fad-eabef52fc0e2-kserve-provision-location\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:24:32.246108 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:32.246108 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/66414c6b-03de-458f-8fad-eabef52fc0e2-tls-certs\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:24:32.246290 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:32.246119 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-65z5v\" (UniqueName: \"kubernetes.io/projected/66414c6b-03de-458f-8fad-eabef52fc0e2-kube-api-access-65z5v\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:24:32.246290 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:32.246128 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/66414c6b-03de-458f-8fad-eabef52fc0e2-home\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:24:32.246290 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:32.246137 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/66414c6b-03de-458f-8fad-eabef52fc0e2-model-cache\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:24:32.246290 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:32.246145 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/66414c6b-03de-458f-8fad-eabef52fc0e2-dshm\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:24:32.660505 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:32.660472 2572 generic.go:358] "Generic (PLEG): container finished" podID="66414c6b-03de-458f-8fad-eabef52fc0e2" containerID="016f097fcd2b1caf21f3ac972eb199b12f056e7da83fb94f4c42190360df1d17" exitCode=0
Apr 16 20:24:32.660675 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:32.660537 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb"
Apr 16 20:24:32.660675 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:32.660533 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb" event={"ID":"66414c6b-03de-458f-8fad-eabef52fc0e2","Type":"ContainerDied","Data":"016f097fcd2b1caf21f3ac972eb199b12f056e7da83fb94f4c42190360df1d17"}
Apr 16 20:24:32.660675 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:32.660644 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb" event={"ID":"66414c6b-03de-458f-8fad-eabef52fc0e2","Type":"ContainerDied","Data":"419893e4ff9b2086b16787a1264112a25948b2d9bcbccdc908c57953c170dcb0"}
Apr 16 20:24:32.660675 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:32.660661 2572 scope.go:117] "RemoveContainer" containerID="016f097fcd2b1caf21f3ac972eb199b12f056e7da83fb94f4c42190360df1d17"
Apr 16 20:24:32.669485 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:32.669468 2572 scope.go:117] "RemoveContainer" containerID="e85f16eb7f75f13e76fbc4efc09072db3e6bf40984fc0520bbfbdb5c7a0dab37"
Apr 16 20:24:32.679356 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:32.679337 2572 scope.go:117] "RemoveContainer" containerID="016f097fcd2b1caf21f3ac972eb199b12f056e7da83fb94f4c42190360df1d17"
Apr 16 20:24:32.679693 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:24:32.679664 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"016f097fcd2b1caf21f3ac972eb199b12f056e7da83fb94f4c42190360df1d17\": container with ID starting with 016f097fcd2b1caf21f3ac972eb199b12f056e7da83fb94f4c42190360df1d17 not found: ID does not exist" containerID="016f097fcd2b1caf21f3ac972eb199b12f056e7da83fb94f4c42190360df1d17"
Apr 16 20:24:32.679777 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:32.679705 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"016f097fcd2b1caf21f3ac972eb199b12f056e7da83fb94f4c42190360df1d17"} err="failed to get container status \"016f097fcd2b1caf21f3ac972eb199b12f056e7da83fb94f4c42190360df1d17\": rpc error: code = NotFound desc = could not find container \"016f097fcd2b1caf21f3ac972eb199b12f056e7da83fb94f4c42190360df1d17\": container with ID starting with 016f097fcd2b1caf21f3ac972eb199b12f056e7da83fb94f4c42190360df1d17 not found: ID does not exist"
Apr 16 20:24:32.679777 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:32.679730 2572 scope.go:117] "RemoveContainer" containerID="e85f16eb7f75f13e76fbc4efc09072db3e6bf40984fc0520bbfbdb5c7a0dab37"
Apr 16 20:24:32.680054 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:24:32.680035 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e85f16eb7f75f13e76fbc4efc09072db3e6bf40984fc0520bbfbdb5c7a0dab37\": container with ID starting with e85f16eb7f75f13e76fbc4efc09072db3e6bf40984fc0520bbfbdb5c7a0dab37 not found: ID does not exist" containerID="e85f16eb7f75f13e76fbc4efc09072db3e6bf40984fc0520bbfbdb5c7a0dab37"
Apr 16 20:24:32.680104 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:32.680061 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e85f16eb7f75f13e76fbc4efc09072db3e6bf40984fc0520bbfbdb5c7a0dab37"} err="failed to get container status \"e85f16eb7f75f13e76fbc4efc09072db3e6bf40984fc0520bbfbdb5c7a0dab37\": rpc error: code = NotFound desc = could not find container \"e85f16eb7f75f13e76fbc4efc09072db3e6bf40984fc0520bbfbdb5c7a0dab37\": container with ID starting with e85f16eb7f75f13e76fbc4efc09072db3e6bf40984fc0520bbfbdb5c7a0dab37 not found: ID does not exist"
Apr 16 20:24:32.681103 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:32.681084 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb"]
Apr 16 20:24:32.685083 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:32.685061 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6497998b7d-dxrhb"]
Apr 16 20:24:33.149621 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:33.149584 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66414c6b-03de-458f-8fad-eabef52fc0e2" path="/var/lib/kubelet/pods/66414c6b-03de-458f-8fad-eabef52fc0e2/volumes"
Apr 16 20:24:38.020726 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.020686 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz"]
Apr 16 20:24:38.021200 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.021054 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="66414c6b-03de-458f-8fad-eabef52fc0e2" containerName="main"
Apr 16 20:24:38.021200 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.021065 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="66414c6b-03de-458f-8fad-eabef52fc0e2" containerName="main"
Apr 16 20:24:38.021200 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.021074 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="66414c6b-03de-458f-8fad-eabef52fc0e2" containerName="storage-initializer"
Apr 16 20:24:38.021200 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.021080 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="66414c6b-03de-458f-8fad-eabef52fc0e2" containerName="storage-initializer"
Apr 16 20:24:38.021200 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.021142 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="66414c6b-03de-458f-8fad-eabef52fc0e2" containerName="main"
Apr 16 20:24:38.026222 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.026200 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz"
Apr 16 20:24:38.028393 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.028365 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 20:24:38.028517 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.028438 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-s7skk\""
Apr 16 20:24:38.029191 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.029176 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\""
Apr 16 20:24:38.029261 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.029176 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 16 20:24:38.032266 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.032243 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz"]
Apr 16 20:24:38.098426 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.098393 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b39614b3-442e-422a-8819-a35b537cdcb8-home\") pod \"scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz\" (UID: \"b39614b3-442e-422a-8819-a35b537cdcb8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz"
Apr 16 20:24:38.098426 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.098435 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b39614b3-442e-422a-8819-a35b537cdcb8-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz\" (UID: \"b39614b3-442e-422a-8819-a35b537cdcb8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz"
Apr 16 20:24:38.098681 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.098474 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b39614b3-442e-422a-8819-a35b537cdcb8-dshm\") pod \"scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz\" (UID: \"b39614b3-442e-422a-8819-a35b537cdcb8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz"
Apr 16 20:24:38.098681 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.098514 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b39614b3-442e-422a-8819-a35b537cdcb8-model-cache\") pod \"scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz\" (UID: \"b39614b3-442e-422a-8819-a35b537cdcb8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz"
Apr 16 20:24:38.098681 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.098536 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kcqx\" (UniqueName: \"kubernetes.io/projected/b39614b3-442e-422a-8819-a35b537cdcb8-kube-api-access-7kcqx\") pod \"scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz\" (UID: \"b39614b3-442e-422a-8819-a35b537cdcb8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz"
Apr 16 20:24:38.098681 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.098604 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b39614b3-442e-422a-8819-a35b537cdcb8-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz\" (UID: \"b39614b3-442e-422a-8819-a35b537cdcb8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz"
Apr 16 20:24:38.199727 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.199688 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b39614b3-442e-422a-8819-a35b537cdcb8-home\") pod \"scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz\" (UID: \"b39614b3-442e-422a-8819-a35b537cdcb8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz"
Apr 16 20:24:38.199727 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.199730 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b39614b3-442e-422a-8819-a35b537cdcb8-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz\" (UID: \"b39614b3-442e-422a-8819-a35b537cdcb8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz"
Apr 16 20:24:38.200018 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.199773 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b39614b3-442e-422a-8819-a35b537cdcb8-dshm\") pod \"scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz\" (UID: \"b39614b3-442e-422a-8819-a35b537cdcb8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz"
Apr 16 20:24:38.200018 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.199802 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b39614b3-442e-422a-8819-a35b537cdcb8-model-cache\") pod \"scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz\" (UID: \"b39614b3-442e-422a-8819-a35b537cdcb8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz"
Apr 16 20:24:38.200018 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.199826 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7kcqx\" (UniqueName: \"kubernetes.io/projected/b39614b3-442e-422a-8819-a35b537cdcb8-kube-api-access-7kcqx\") pod \"scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz\" (UID: \"b39614b3-442e-422a-8819-a35b537cdcb8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz"
Apr 16 20:24:38.200018 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.199901 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b39614b3-442e-422a-8819-a35b537cdcb8-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz\" (UID: \"b39614b3-442e-422a-8819-a35b537cdcb8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz"
Apr 16 20:24:38.200224 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.200184 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b39614b3-442e-422a-8819-a35b537cdcb8-home\") pod \"scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz\" (UID: \"b39614b3-442e-422a-8819-a35b537cdcb8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz"
Apr 16 20:24:38.200224 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.200208 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b39614b3-442e-422a-8819-a35b537cdcb8-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz\" (UID: \"b39614b3-442e-422a-8819-a35b537cdcb8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz"
Apr 16 20:24:38.200328 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.200256 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b39614b3-442e-422a-8819-a35b537cdcb8-model-cache\") pod \"scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz\" (UID: \"b39614b3-442e-422a-8819-a35b537cdcb8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz"
Apr 16 20:24:38.202223 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.202197 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b39614b3-442e-422a-8819-a35b537cdcb8-dshm\") pod \"scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz\" (UID: \"b39614b3-442e-422a-8819-a35b537cdcb8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz"
Apr 16 20:24:38.202500 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.202481 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b39614b3-442e-422a-8819-a35b537cdcb8-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz\" (UID: \"b39614b3-442e-422a-8819-a35b537cdcb8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz"
Apr 16 20:24:38.211426 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.211402 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kcqx\" (UniqueName: \"kubernetes.io/projected/b39614b3-442e-422a-8819-a35b537cdcb8-kube-api-access-7kcqx\") pod \"scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz\" (UID: \"b39614b3-442e-422a-8819-a35b537cdcb8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz"
Apr 16 20:24:38.283811 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.283728 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj"]
Apr 16 20:24:38.287551 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.287527 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj"
Apr 16 20:24:38.289922 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.289902 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-cqssp\""
Apr 16 20:24:38.301969 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.301945 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj"]
Apr 16 20:24:38.337396 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.337362 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz"
Apr 16 20:24:38.401460 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.401422 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c3adf594-1955-4674-937f-634bf4825316-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj\" (UID: \"c3adf594-1955-4674-937f-634bf4825316\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj"
Apr 16 20:24:38.401639 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.401488 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c3adf594-1955-4674-937f-634bf4825316-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj\" (UID: \"c3adf594-1955-4674-937f-634bf4825316\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj"
Apr 16 20:24:38.401639 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.401548 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c3adf594-1955-4674-937f-634bf4825316-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj\" (UID: \"c3adf594-1955-4674-937f-634bf4825316\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj"
Apr 16 20:24:38.401639 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.401598 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c3adf594-1955-4674-937f-634bf4825316-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj\" (UID: \"c3adf594-1955-4674-937f-634bf4825316\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj"
Apr 16 20:24:38.401639 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.401629 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c3adf594-1955-4674-937f-634bf4825316-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj\" (UID: \"c3adf594-1955-4674-937f-634bf4825316\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj"
Apr 16 20:24:38.401813 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.401658 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88qxg\" (UniqueName: \"kubernetes.io/projected/c3adf594-1955-4674-937f-634bf4825316-kube-api-access-88qxg\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj\" (UID: \"c3adf594-1955-4674-937f-634bf4825316\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj"
Apr 16 20:24:38.463134 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.463108 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz"]
Apr 16 20:24:38.465183 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:24:38.465155 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb39614b3_442e_422a_8819_a35b537cdcb8.slice/crio-b1cc0ffda0a04449de0af664aad2e354472260ec3fc08ed5b66b41c953128181 WatchSource:0}: Error finding container b1cc0ffda0a04449de0af664aad2e354472260ec3fc08ed5b66b41c953128181: Status 404 returned error can't find the container with id b1cc0ffda0a04449de0af664aad2e354472260ec3fc08ed5b66b41c953128181
Apr 16 20:24:38.466750 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.466733 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 20:24:38.502382 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.502351 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c3adf594-1955-4674-937f-634bf4825316-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj\" (UID: \"c3adf594-1955-4674-937f-634bf4825316\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj"
Apr 16 20:24:38.502503 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.502392 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c3adf594-1955-4674-937f-634bf4825316-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj\" (UID: \"c3adf594-1955-4674-937f-634bf4825316\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj"
Apr 16 20:24:38.502503 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.502421 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-88qxg\" (UniqueName: \"kubernetes.io/projected/c3adf594-1955-4674-937f-634bf4825316-kube-api-access-88qxg\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj\" (UID: \"c3adf594-1955-4674-937f-634bf4825316\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj"
Apr 16 20:24:38.502595 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.502502 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c3adf594-1955-4674-937f-634bf4825316-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj\" (UID: \"c3adf594-1955-4674-937f-634bf4825316\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj"
Apr 16 20:24:38.502595 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.502551 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c3adf594-1955-4674-937f-634bf4825316-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj\" (UID: \"c3adf594-1955-4674-937f-634bf4825316\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj"
Apr 16 20:24:38.502698 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.502621 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c3adf594-1955-4674-937f-634bf4825316-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj\" (UID: \"c3adf594-1955-4674-937f-634bf4825316\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj"
Apr 16 20:24:38.502806 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.502786 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c3adf594-1955-4674-937f-634bf4825316-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj\" (UID: \"c3adf594-1955-4674-937f-634bf4825316\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj"
Apr 16 20:24:38.502864 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.502832 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c3adf594-1955-4674-937f-634bf4825316-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj\" (UID: \"c3adf594-1955-4674-937f-634bf4825316\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj"
Apr 16 20:24:38.502941 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.502889 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c3adf594-1955-4674-937f-634bf4825316-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj\" (UID: \"c3adf594-1955-4674-937f-634bf4825316\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj"
Apr 16 20:24:38.502994 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.502974 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c3adf594-1955-4674-937f-634bf4825316-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj\" (UID: \"c3adf594-1955-4674-937f-634bf4825316\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj"
Apr 16 20:24:38.505603 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.505575 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c3adf594-1955-4674-937f-634bf4825316-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj\" (UID: \"c3adf594-1955-4674-937f-634bf4825316\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj"
Apr 16 20:24:38.511329 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.511305 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-88qxg\" (UniqueName: \"kubernetes.io/projected/c3adf594-1955-4674-937f-634bf4825316-kube-api-access-88qxg\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj\" (UID: \"c3adf594-1955-4674-937f-634bf4825316\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj"
Apr 16 20:24:38.597424 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.597386 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj" Apr 16 20:24:38.684280 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.684240 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz" event={"ID":"b39614b3-442e-422a-8819-a35b537cdcb8","Type":"ContainerStarted","Data":"df53dce03c514fafca3eb168a77c32ebb2a72b01f151f18a4753e9369a3e5864"} Apr 16 20:24:38.684408 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.684292 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz" event={"ID":"b39614b3-442e-422a-8819-a35b537cdcb8","Type":"ContainerStarted","Data":"b1cc0ffda0a04449de0af664aad2e354472260ec3fc08ed5b66b41c953128181"} Apr 16 20:24:38.734777 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:38.734749 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj"] Apr 16 20:24:38.738424 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:24:38.738393 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3adf594_1955_4674_937f_634bf4825316.slice/crio-2577eae068b74962e27efc4bddd9881d2f6a7ec0289e9ec17066165a2584a3b9 WatchSource:0}: Error finding container 2577eae068b74962e27efc4bddd9881d2f6a7ec0289e9ec17066165a2584a3b9: Status 404 returned error can't find the container with id 2577eae068b74962e27efc4bddd9881d2f6a7ec0289e9ec17066165a2584a3b9 Apr 16 20:24:39.689865 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:39.689831 2572 generic.go:358] "Generic (PLEG): container finished" podID="c3adf594-1955-4674-937f-634bf4825316" containerID="0963ba59bb7ed251f60d7d7cd37378270e8ff1e433d778585c4feefe695ef698" exitCode=0 Apr 16 20:24:39.690221 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:39.689908 2572 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj" event={"ID":"c3adf594-1955-4674-937f-634bf4825316","Type":"ContainerDied","Data":"0963ba59bb7ed251f60d7d7cd37378270e8ff1e433d778585c4feefe695ef698"} Apr 16 20:24:39.690221 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:39.689957 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj" event={"ID":"c3adf594-1955-4674-937f-634bf4825316","Type":"ContainerStarted","Data":"2577eae068b74962e27efc4bddd9881d2f6a7ec0289e9ec17066165a2584a3b9"} Apr 16 20:24:41.700923 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:41.700839 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj" event={"ID":"c3adf594-1955-4674-937f-634bf4825316","Type":"ContainerStarted","Data":"b9245e9074db808b938ba6fbf449dd3a1ecb85dec10411428bff28d49877901b"} Apr 16 20:24:43.713026 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:43.712990 2572 generic.go:358] "Generic (PLEG): container finished" podID="b39614b3-442e-422a-8819-a35b537cdcb8" containerID="df53dce03c514fafca3eb168a77c32ebb2a72b01f151f18a4753e9369a3e5864" exitCode=0 Apr 16 20:24:43.713468 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:43.713084 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz" event={"ID":"b39614b3-442e-422a-8819-a35b537cdcb8","Type":"ContainerDied","Data":"df53dce03c514fafca3eb168a77c32ebb2a72b01f151f18a4753e9369a3e5864"} Apr 16 20:24:44.718527 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:44.718491 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz" 
event={"ID":"b39614b3-442e-422a-8819-a35b537cdcb8","Type":"ContainerStarted","Data":"a4269f9a36e3b64611b89babc4d76cb81d8a1da1b5142d186105319331035555"} Apr 16 20:24:44.738507 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:44.738444 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz" podStartSLOduration=6.738425998 podStartE2EDuration="6.738425998s" podCreationTimestamp="2026-04-16 20:24:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:24:44.736823974 +0000 UTC m=+776.173812483" watchObservedRunningTime="2026-04-16 20:24:44.738425998 +0000 UTC m=+776.175414507" Apr 16 20:24:48.337719 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:48.337681 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz" Apr 16 20:24:48.338299 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:48.337729 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz" Apr 16 20:24:48.355838 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:48.355808 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz" Apr 16 20:24:48.747301 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:24:48.747274 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz" Apr 16 20:25:11.821247 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:11.821207 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj" 
event={"ID":"c3adf594-1955-4674-937f-634bf4825316","Type":"ContainerStarted","Data":"cb090c522f0d26d020f2436237737ed8c8100eaeb5565a600316ebfdff58dd5c"} Apr 16 20:25:11.821653 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:11.821536 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj" Apr 16 20:25:11.824505 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:11.824465 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj" podUID="c3adf594-1955-4674-937f-634bf4825316" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 16 20:25:11.842307 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:11.842260 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj" podStartSLOduration=2.693126656 podStartE2EDuration="33.842246869s" podCreationTimestamp="2026-04-16 20:24:38 +0000 UTC" firstStartedPulling="2026-04-16 20:24:39.69135362 +0000 UTC m=+771.128342113" lastFinishedPulling="2026-04-16 20:25:10.840473838 +0000 UTC m=+802.277462326" observedRunningTime="2026-04-16 20:25:11.841023714 +0000 UTC m=+803.278012221" watchObservedRunningTime="2026-04-16 20:25:11.842246869 +0000 UTC m=+803.279235379" Apr 16 20:25:12.826316 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:12.826277 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj" podUID="c3adf594-1955-4674-937f-634bf4825316" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 16 20:25:18.598601 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:18.598559 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj" Apr 16 20:25:18.598601 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:18.598604 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj" Apr 16 20:25:18.600166 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:18.600130 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj" podUID="c3adf594-1955-4674-937f-634bf4825316" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 16 20:25:18.600282 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:18.600196 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj" Apr 16 20:25:18.848362 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:18.848324 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj" podUID="c3adf594-1955-4674-937f-634bf4825316" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 16 20:25:18.848539 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:18.848390 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj" Apr 16 20:25:19.851732 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:19.851697 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj" podUID="c3adf594-1955-4674-937f-634bf4825316" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 16 20:25:29.852264 ip-10-0-138-62 
kubenswrapper[2572]: I0416 20:25:29.852223 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj" podUID="c3adf594-1955-4674-937f-634bf4825316" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 16 20:25:32.761266 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:32.761231 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz"] Apr 16 20:25:32.761643 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:32.761529 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz" podUID="b39614b3-442e-422a-8819-a35b537cdcb8" containerName="main" containerID="cri-o://a4269f9a36e3b64611b89babc4d76cb81d8a1da1b5142d186105319331035555" gracePeriod=30 Apr 16 20:25:32.768236 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:32.768209 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj"] Apr 16 20:25:32.768540 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:32.768514 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj" podUID="c3adf594-1955-4674-937f-634bf4825316" containerName="main" containerID="cri-o://b9245e9074db808b938ba6fbf449dd3a1ecb85dec10411428bff28d49877901b" gracePeriod=30 Apr 16 20:25:32.768616 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:32.768549 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj" podUID="c3adf594-1955-4674-937f-634bf4825316" containerName="tokenizer" containerID="cri-o://cb090c522f0d26d020f2436237737ed8c8100eaeb5565a600316ebfdff58dd5c" 
gracePeriod=30 Apr 16 20:25:32.770100 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:32.770049 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj" podUID="c3adf594-1955-4674-937f-634bf4825316" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 16 20:25:32.895013 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:32.894971 2572 generic.go:358] "Generic (PLEG): container finished" podID="c3adf594-1955-4674-937f-634bf4825316" containerID="b9245e9074db808b938ba6fbf449dd3a1ecb85dec10411428bff28d49877901b" exitCode=0 Apr 16 20:25:32.895153 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:32.895030 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj" event={"ID":"c3adf594-1955-4674-937f-634bf4825316","Type":"ContainerDied","Data":"b9245e9074db808b938ba6fbf449dd3a1ecb85dec10411428bff28d49877901b"} Apr 16 20:25:32.896846 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:32.896821 2572 generic.go:358] "Generic (PLEG): container finished" podID="b39614b3-442e-422a-8819-a35b537cdcb8" containerID="a4269f9a36e3b64611b89babc4d76cb81d8a1da1b5142d186105319331035555" exitCode=0 Apr 16 20:25:32.896974 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:32.896893 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz" event={"ID":"b39614b3-442e-422a-8819-a35b537cdcb8","Type":"ContainerDied","Data":"a4269f9a36e3b64611b89babc4d76cb81d8a1da1b5142d186105319331035555"} Apr 16 20:25:33.013503 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:33.013443 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz" Apr 16 20:25:33.117464 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:33.117428 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b39614b3-442e-422a-8819-a35b537cdcb8-home\") pod \"b39614b3-442e-422a-8819-a35b537cdcb8\" (UID: \"b39614b3-442e-422a-8819-a35b537cdcb8\") " Apr 16 20:25:33.117464 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:33.117471 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b39614b3-442e-422a-8819-a35b537cdcb8-tls-certs\") pod \"b39614b3-442e-422a-8819-a35b537cdcb8\" (UID: \"b39614b3-442e-422a-8819-a35b537cdcb8\") " Apr 16 20:25:33.117708 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:33.117528 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b39614b3-442e-422a-8819-a35b537cdcb8-kserve-provision-location\") pod \"b39614b3-442e-422a-8819-a35b537cdcb8\" (UID: \"b39614b3-442e-422a-8819-a35b537cdcb8\") " Apr 16 20:25:33.117708 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:33.117587 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b39614b3-442e-422a-8819-a35b537cdcb8-model-cache\") pod \"b39614b3-442e-422a-8819-a35b537cdcb8\" (UID: \"b39614b3-442e-422a-8819-a35b537cdcb8\") " Apr 16 20:25:33.117708 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:33.117610 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kcqx\" (UniqueName: \"kubernetes.io/projected/b39614b3-442e-422a-8819-a35b537cdcb8-kube-api-access-7kcqx\") pod \"b39614b3-442e-422a-8819-a35b537cdcb8\" (UID: \"b39614b3-442e-422a-8819-a35b537cdcb8\") " Apr 16 20:25:33.117708 ip-10-0-138-62 
kubenswrapper[2572]: I0416 20:25:33.117651 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b39614b3-442e-422a-8819-a35b537cdcb8-dshm\") pod \"b39614b3-442e-422a-8819-a35b537cdcb8\" (UID: \"b39614b3-442e-422a-8819-a35b537cdcb8\") " Apr 16 20:25:33.117708 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:33.117683 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b39614b3-442e-422a-8819-a35b537cdcb8-home" (OuterVolumeSpecName: "home") pod "b39614b3-442e-422a-8819-a35b537cdcb8" (UID: "b39614b3-442e-422a-8819-a35b537cdcb8"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:25:33.117993 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:33.117820 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b39614b3-442e-422a-8819-a35b537cdcb8-model-cache" (OuterVolumeSpecName: "model-cache") pod "b39614b3-442e-422a-8819-a35b537cdcb8" (UID: "b39614b3-442e-422a-8819-a35b537cdcb8"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:25:33.117993 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:33.117962 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b39614b3-442e-422a-8819-a35b537cdcb8-home\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:25:33.117993 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:33.117982 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b39614b3-442e-422a-8819-a35b537cdcb8-model-cache\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:25:33.120415 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:33.120033 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b39614b3-442e-422a-8819-a35b537cdcb8-dshm" (OuterVolumeSpecName: "dshm") pod "b39614b3-442e-422a-8819-a35b537cdcb8" (UID: "b39614b3-442e-422a-8819-a35b537cdcb8"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:25:33.120415 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:33.120132 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b39614b3-442e-422a-8819-a35b537cdcb8-kube-api-access-7kcqx" (OuterVolumeSpecName: "kube-api-access-7kcqx") pod "b39614b3-442e-422a-8819-a35b537cdcb8" (UID: "b39614b3-442e-422a-8819-a35b537cdcb8"). InnerVolumeSpecName "kube-api-access-7kcqx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:25:33.121007 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:33.120979 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b39614b3-442e-422a-8819-a35b537cdcb8-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "b39614b3-442e-422a-8819-a35b537cdcb8" (UID: "b39614b3-442e-422a-8819-a35b537cdcb8"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:25:33.179396 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:33.179352 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b39614b3-442e-422a-8819-a35b537cdcb8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b39614b3-442e-422a-8819-a35b537cdcb8" (UID: "b39614b3-442e-422a-8819-a35b537cdcb8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:25:33.219062 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:33.219020 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b39614b3-442e-422a-8819-a35b537cdcb8-dshm\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:25:33.219062 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:33.219055 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b39614b3-442e-422a-8819-a35b537cdcb8-tls-certs\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:25:33.219062 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:33.219069 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b39614b3-442e-422a-8819-a35b537cdcb8-kserve-provision-location\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:25:33.219300 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:33.219082 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7kcqx\" (UniqueName: \"kubernetes.io/projected/b39614b3-442e-422a-8819-a35b537cdcb8-kube-api-access-7kcqx\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:25:33.901992 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:33.901964 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz" Apr 16 20:25:33.901992 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:33.901971 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz" event={"ID":"b39614b3-442e-422a-8819-a35b537cdcb8","Type":"ContainerDied","Data":"b1cc0ffda0a04449de0af664aad2e354472260ec3fc08ed5b66b41c953128181"} Apr 16 20:25:33.902475 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:33.902024 2572 scope.go:117] "RemoveContainer" containerID="a4269f9a36e3b64611b89babc4d76cb81d8a1da1b5142d186105319331035555" Apr 16 20:25:33.904172 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:33.904150 2572 generic.go:358] "Generic (PLEG): container finished" podID="c3adf594-1955-4674-937f-634bf4825316" containerID="cb090c522f0d26d020f2436237737ed8c8100eaeb5565a600316ebfdff58dd5c" exitCode=0 Apr 16 20:25:33.904254 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:33.904193 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj" event={"ID":"c3adf594-1955-4674-937f-634bf4825316","Type":"ContainerDied","Data":"cb090c522f0d26d020f2436237737ed8c8100eaeb5565a600316ebfdff58dd5c"} Apr 16 20:25:33.912688 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:33.912667 2572 scope.go:117] "RemoveContainer" containerID="df53dce03c514fafca3eb168a77c32ebb2a72b01f151f18a4753e9369a3e5864" Apr 16 20:25:33.927231 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:33.927206 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz"] Apr 16 20:25:33.931741 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:33.931716 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7d456978c8-5k8jz"] Apr 16 20:25:34.014979 ip-10-0-138-62 kubenswrapper[2572]: I0416 
20:25:34.014955 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj" Apr 16 20:25:34.125545 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:34.125469 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c3adf594-1955-4674-937f-634bf4825316-tls-certs\") pod \"c3adf594-1955-4674-937f-634bf4825316\" (UID: \"c3adf594-1955-4674-937f-634bf4825316\") " Apr 16 20:25:34.125545 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:34.125543 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c3adf594-1955-4674-937f-634bf4825316-tokenizer-cache\") pod \"c3adf594-1955-4674-937f-634bf4825316\" (UID: \"c3adf594-1955-4674-937f-634bf4825316\") " Apr 16 20:25:34.125716 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:34.125571 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c3adf594-1955-4674-937f-634bf4825316-tokenizer-uds\") pod \"c3adf594-1955-4674-937f-634bf4825316\" (UID: \"c3adf594-1955-4674-937f-634bf4825316\") " Apr 16 20:25:34.125716 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:34.125676 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c3adf594-1955-4674-937f-634bf4825316-kserve-provision-location\") pod \"c3adf594-1955-4674-937f-634bf4825316\" (UID: \"c3adf594-1955-4674-937f-634bf4825316\") " Apr 16 20:25:34.125793 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:34.125737 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88qxg\" (UniqueName: \"kubernetes.io/projected/c3adf594-1955-4674-937f-634bf4825316-kube-api-access-88qxg\") pod 
\"c3adf594-1955-4674-937f-634bf4825316\" (UID: \"c3adf594-1955-4674-937f-634bf4825316\") " Apr 16 20:25:34.125793 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:34.125780 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c3adf594-1955-4674-937f-634bf4825316-tokenizer-tmp\") pod \"c3adf594-1955-4674-937f-634bf4825316\" (UID: \"c3adf594-1955-4674-937f-634bf4825316\") " Apr 16 20:25:34.125929 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:34.125826 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3adf594-1955-4674-937f-634bf4825316-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "c3adf594-1955-4674-937f-634bf4825316" (UID: "c3adf594-1955-4674-937f-634bf4825316"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:25:34.125929 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:34.125859 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3adf594-1955-4674-937f-634bf4825316-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "c3adf594-1955-4674-937f-634bf4825316" (UID: "c3adf594-1955-4674-937f-634bf4825316"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:25:34.126106 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:34.126083 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c3adf594-1955-4674-937f-634bf4825316-tokenizer-cache\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:25:34.126184 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:34.126108 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c3adf594-1955-4674-937f-634bf4825316-tokenizer-uds\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:25:34.126227 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:34.126178 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3adf594-1955-4674-937f-634bf4825316-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "c3adf594-1955-4674-937f-634bf4825316" (UID: "c3adf594-1955-4674-937f-634bf4825316"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:25:34.126404 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:34.126387 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3adf594-1955-4674-937f-634bf4825316-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c3adf594-1955-4674-937f-634bf4825316" (UID: "c3adf594-1955-4674-937f-634bf4825316"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:25:34.127746 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:34.127722 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3adf594-1955-4674-937f-634bf4825316-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c3adf594-1955-4674-937f-634bf4825316" (UID: "c3adf594-1955-4674-937f-634bf4825316"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:25:34.127851 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:34.127820 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3adf594-1955-4674-937f-634bf4825316-kube-api-access-88qxg" (OuterVolumeSpecName: "kube-api-access-88qxg") pod "c3adf594-1955-4674-937f-634bf4825316" (UID: "c3adf594-1955-4674-937f-634bf4825316"). InnerVolumeSpecName "kube-api-access-88qxg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:25:34.227085 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:34.227051 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c3adf594-1955-4674-937f-634bf4825316-kserve-provision-location\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:25:34.227085 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:34.227078 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-88qxg\" (UniqueName: \"kubernetes.io/projected/c3adf594-1955-4674-937f-634bf4825316-kube-api-access-88qxg\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:25:34.227085 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:34.227088 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c3adf594-1955-4674-937f-634bf4825316-tokenizer-tmp\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:25:34.227289 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:34.227098 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c3adf594-1955-4674-937f-634bf4825316-tls-certs\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:25:34.909601 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:34.909574 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj"
Apr 16 20:25:34.910094 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:34.909573 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj" event={"ID":"c3adf594-1955-4674-937f-634bf4825316","Type":"ContainerDied","Data":"2577eae068b74962e27efc4bddd9881d2f6a7ec0289e9ec17066165a2584a3b9"}
Apr 16 20:25:34.910094 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:34.909724 2572 scope.go:117] "RemoveContainer" containerID="cb090c522f0d26d020f2436237737ed8c8100eaeb5565a600316ebfdff58dd5c"
Apr 16 20:25:34.918474 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:34.918457 2572 scope.go:117] "RemoveContainer" containerID="b9245e9074db808b938ba6fbf449dd3a1ecb85dec10411428bff28d49877901b"
Apr 16 20:25:34.931480 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:34.931455 2572 scope.go:117] "RemoveContainer" containerID="0963ba59bb7ed251f60d7d7cd37378270e8ff1e433d778585c4feefe695ef698"
Apr 16 20:25:34.932371 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:34.932338 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj"]
Apr 16 20:25:34.937274 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:34.937252 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-659d96cwrrzj"]
Apr 16 20:25:35.149735 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:35.149691 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b39614b3-442e-422a-8819-a35b537cdcb8" path="/var/lib/kubelet/pods/b39614b3-442e-422a-8819-a35b537cdcb8/volumes"
Apr 16 20:25:35.150167 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:35.150154 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3adf594-1955-4674-937f-634bf4825316" path="/var/lib/kubelet/pods/c3adf594-1955-4674-937f-634bf4825316/volumes"
Apr 16 20:25:35.808659 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:35.808618 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-hjq97"]
Apr 16 20:25:35.809041 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:35.809025 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b39614b3-442e-422a-8819-a35b537cdcb8" containerName="main"
Apr 16 20:25:35.809041 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:35.809041 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b39614b3-442e-422a-8819-a35b537cdcb8" containerName="main"
Apr 16 20:25:35.809204 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:35.809053 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3adf594-1955-4674-937f-634bf4825316" containerName="tokenizer"
Apr 16 20:25:35.809204 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:35.809058 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3adf594-1955-4674-937f-634bf4825316" containerName="tokenizer"
Apr 16 20:25:35.809204 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:35.809067 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b39614b3-442e-422a-8819-a35b537cdcb8" containerName="storage-initializer"
Apr 16 20:25:35.809204 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:35.809076 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b39614b3-442e-422a-8819-a35b537cdcb8" containerName="storage-initializer"
Apr 16 20:25:35.809204 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:35.809095 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3adf594-1955-4674-937f-634bf4825316" containerName="storage-initializer"
Apr 16 20:25:35.809204 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:35.809101 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3adf594-1955-4674-937f-634bf4825316" containerName="storage-initializer"
Apr 16 20:25:35.809204 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:35.809107 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3adf594-1955-4674-937f-634bf4825316" containerName="main"
Apr 16 20:25:35.809204 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:35.809113 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3adf594-1955-4674-937f-634bf4825316" containerName="main"
Apr 16 20:25:35.809204 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:35.809168 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="c3adf594-1955-4674-937f-634bf4825316" containerName="main"
Apr 16 20:25:35.809204 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:35.809180 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="b39614b3-442e-422a-8819-a35b537cdcb8" containerName="main"
Apr 16 20:25:35.809204 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:35.809189 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="c3adf594-1955-4674-937f-634bf4825316" containerName="tokenizer"
Apr 16 20:25:35.814281 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:35.814255 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-hjq97"
Apr 16 20:25:35.816743 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:35.816708 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 16 20:25:35.816888 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:35.816842 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-s7skk\""
Apr 16 20:25:35.817678 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:35.817656 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 20:25:35.817777 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:35.817721 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\""
Apr 16 20:25:35.822783 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:35.822757 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-hjq97"]
Apr 16 20:25:35.942134 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:35.942090 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-6855fc94cd-hjq97\" (UID: \"9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-hjq97"
Apr 16 20:25:35.942134 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:35.942135 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e-tls-certs\") pod \"precise-prefix-cache-test-kserve-6855fc94cd-hjq97\" (UID: \"9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-hjq97"
Apr 16 20:25:35.942566 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:35.942222 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e-dshm\") pod \"precise-prefix-cache-test-kserve-6855fc94cd-hjq97\" (UID: \"9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-hjq97"
Apr 16 20:25:35.942566 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:35.942303 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e-model-cache\") pod \"precise-prefix-cache-test-kserve-6855fc94cd-hjq97\" (UID: \"9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-hjq97"
Apr 16 20:25:35.942566 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:35.942327 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clpng\" (UniqueName: \"kubernetes.io/projected/9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e-kube-api-access-clpng\") pod \"precise-prefix-cache-test-kserve-6855fc94cd-hjq97\" (UID: \"9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-hjq97"
Apr 16 20:25:35.942566 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:35.942371 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e-home\") pod \"precise-prefix-cache-test-kserve-6855fc94cd-hjq97\" (UID: \"9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-hjq97"
Apr 16 20:25:36.043610 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:36.043579 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-6855fc94cd-hjq97\" (UID: \"9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-hjq97"
Apr 16 20:25:36.043795 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:36.043624 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e-tls-certs\") pod \"precise-prefix-cache-test-kserve-6855fc94cd-hjq97\" (UID: \"9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-hjq97"
Apr 16 20:25:36.043795 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:36.043671 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e-dshm\") pod \"precise-prefix-cache-test-kserve-6855fc94cd-hjq97\" (UID: \"9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-hjq97"
Apr 16 20:25:36.043795 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:36.043731 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e-model-cache\") pod \"precise-prefix-cache-test-kserve-6855fc94cd-hjq97\" (UID: \"9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-hjq97"
Apr 16 20:25:36.043948 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:36.043852 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clpng\" (UniqueName: \"kubernetes.io/projected/9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e-kube-api-access-clpng\") pod \"precise-prefix-cache-test-kserve-6855fc94cd-hjq97\" (UID: \"9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-hjq97"
Apr 16 20:25:36.043948 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:36.043919 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e-home\") pod \"precise-prefix-cache-test-kserve-6855fc94cd-hjq97\" (UID: \"9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-hjq97"
Apr 16 20:25:36.044073 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:36.044050 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-6855fc94cd-hjq97\" (UID: \"9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-hjq97"
Apr 16 20:25:36.044132 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:36.044104 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e-model-cache\") pod \"precise-prefix-cache-test-kserve-6855fc94cd-hjq97\" (UID: \"9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-hjq97"
Apr 16 20:25:36.044291 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:36.044268 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e-home\") pod \"precise-prefix-cache-test-kserve-6855fc94cd-hjq97\" (UID: \"9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-hjq97"
Apr 16 20:25:36.046009 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:36.045987 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e-dshm\") pod \"precise-prefix-cache-test-kserve-6855fc94cd-hjq97\" (UID: \"9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-hjq97"
Apr 16 20:25:36.046164 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:36.046149 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e-tls-certs\") pod \"precise-prefix-cache-test-kserve-6855fc94cd-hjq97\" (UID: \"9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-hjq97"
Apr 16 20:25:36.051600 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:36.051579 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-clpng\" (UniqueName: \"kubernetes.io/projected/9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e-kube-api-access-clpng\") pod \"precise-prefix-cache-test-kserve-6855fc94cd-hjq97\" (UID: \"9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-hjq97"
Apr 16 20:25:36.124854 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:36.124768 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t"]
Apr 16 20:25:36.126594 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:36.126562 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-hjq97"
Apr 16 20:25:36.129217 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:36.129195 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t"
Apr 16 20:25:36.131693 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:36.131671 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-epp-sa-dockercfg-jfbmk\""
Apr 16 20:25:36.138628 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:36.138601 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t"]
Apr 16 20:25:36.144738 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:36.144713 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6c990bfd-fc63-4765-b77c-67a8258b026c-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t\" (UID: \"6c990bfd-fc63-4765-b77c-67a8258b026c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t"
Apr 16 20:25:36.144738 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:36.144744 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6c990bfd-fc63-4765-b77c-67a8258b026c-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t\" (UID: \"6c990bfd-fc63-4765-b77c-67a8258b026c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t"
Apr 16 20:25:36.144985 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:36.144782 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6c990bfd-fc63-4765-b77c-67a8258b026c-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t\" (UID: \"6c990bfd-fc63-4765-b77c-67a8258b026c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t"
Apr 16 20:25:36.144985 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:36.144888 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6c990bfd-fc63-4765-b77c-67a8258b026c-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t\" (UID: \"6c990bfd-fc63-4765-b77c-67a8258b026c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t"
Apr 16 20:25:36.144985 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:36.144928 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6c990bfd-fc63-4765-b77c-67a8258b026c-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t\" (UID: \"6c990bfd-fc63-4765-b77c-67a8258b026c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t"
Apr 16 20:25:36.144985 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:36.144976 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skhn9\" (UniqueName: \"kubernetes.io/projected/6c990bfd-fc63-4765-b77c-67a8258b026c-kube-api-access-skhn9\") pod \"precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t\" (UID: \"6c990bfd-fc63-4765-b77c-67a8258b026c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t"
Apr 16 20:25:36.246325 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:36.246296 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-skhn9\" (UniqueName: \"kubernetes.io/projected/6c990bfd-fc63-4765-b77c-67a8258b026c-kube-api-access-skhn9\") pod \"precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t\" (UID: \"6c990bfd-fc63-4765-b77c-67a8258b026c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t"
Apr 16 20:25:36.246475 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:36.246348 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6c990bfd-fc63-4765-b77c-67a8258b026c-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t\" (UID: \"6c990bfd-fc63-4765-b77c-67a8258b026c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t"
Apr 16 20:25:36.246475 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:36.246368 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6c990bfd-fc63-4765-b77c-67a8258b026c-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t\" (UID: \"6c990bfd-fc63-4765-b77c-67a8258b026c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t"
Apr 16 20:25:36.246475 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:36.246391 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6c990bfd-fc63-4765-b77c-67a8258b026c-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t\" (UID: \"6c990bfd-fc63-4765-b77c-67a8258b026c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t"
Apr 16 20:25:36.246475 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:36.246431 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6c990bfd-fc63-4765-b77c-67a8258b026c-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t\" (UID: \"6c990bfd-fc63-4765-b77c-67a8258b026c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t"
Apr 16 20:25:36.246475 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:36.246449 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6c990bfd-fc63-4765-b77c-67a8258b026c-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t\" (UID: \"6c990bfd-fc63-4765-b77c-67a8258b026c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t"
Apr 16 20:25:36.246757 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:36.246732 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6c990bfd-fc63-4765-b77c-67a8258b026c-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t\" (UID: \"6c990bfd-fc63-4765-b77c-67a8258b026c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t"
Apr 16 20:25:36.246820 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:36.246767 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6c990bfd-fc63-4765-b77c-67a8258b026c-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t\" (UID: \"6c990bfd-fc63-4765-b77c-67a8258b026c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t"
Apr 16 20:25:36.246820 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:36.246798 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6c990bfd-fc63-4765-b77c-67a8258b026c-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t\" (UID: \"6c990bfd-fc63-4765-b77c-67a8258b026c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t"
Apr 16 20:25:36.246820 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:36.246811 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6c990bfd-fc63-4765-b77c-67a8258b026c-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t\" (UID: \"6c990bfd-fc63-4765-b77c-67a8258b026c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t"
Apr 16 20:25:36.249119 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:36.249095 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6c990bfd-fc63-4765-b77c-67a8258b026c-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t\" (UID: \"6c990bfd-fc63-4765-b77c-67a8258b026c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t"
Apr 16 20:25:36.254063 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:36.254038 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-hjq97"]
Apr 16 20:25:36.256190 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:36.256166 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-skhn9\" (UniqueName: \"kubernetes.io/projected/6c990bfd-fc63-4765-b77c-67a8258b026c-kube-api-access-skhn9\") pod \"precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t\" (UID: \"6c990bfd-fc63-4765-b77c-67a8258b026c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t"
Apr 16 20:25:36.256973 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:25:36.256936 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ccbef4b_6f34_4ab3_aae4_01d0f8cbac2e.slice/crio-7a7d3028d709d9932c844a7b119de7ac47cd154a4dac2854c0753a5695f0020f WatchSource:0}: Error finding container 7a7d3028d709d9932c844a7b119de7ac47cd154a4dac2854c0753a5695f0020f: Status 404 returned error can't find the container with id 7a7d3028d709d9932c844a7b119de7ac47cd154a4dac2854c0753a5695f0020f
Apr 16 20:25:36.460841 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:36.460745 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t"
Apr 16 20:25:36.594791 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:36.594759 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t"]
Apr 16 20:25:36.597120 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:25:36.597089 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c990bfd_fc63_4765_b77c_67a8258b026c.slice/crio-6c7f041f38d7667df530e53e25ba39f30dc546bb87e0301339b1fee9f083c9c8 WatchSource:0}: Error finding container 6c7f041f38d7667df530e53e25ba39f30dc546bb87e0301339b1fee9f083c9c8: Status 404 returned error can't find the container with id 6c7f041f38d7667df530e53e25ba39f30dc546bb87e0301339b1fee9f083c9c8
Apr 16 20:25:36.921234 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:36.921194 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t" event={"ID":"6c990bfd-fc63-4765-b77c-67a8258b026c","Type":"ContainerStarted","Data":"1e248dd553612ad98afb82131713955a5efe43657859c824479ef993b84fd7de"}
Apr 16 20:25:36.921234 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:36.921235 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t" event={"ID":"6c990bfd-fc63-4765-b77c-67a8258b026c","Type":"ContainerStarted","Data":"6c7f041f38d7667df530e53e25ba39f30dc546bb87e0301339b1fee9f083c9c8"}
Apr 16 20:25:36.922661 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:36.922632 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-hjq97" event={"ID":"9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e","Type":"ContainerStarted","Data":"ff4a073080621b1277d72f7574dae504faea72011941c8d60b92d2e45455d897"}
Apr 16 20:25:36.922774 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:36.922665 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-hjq97" event={"ID":"9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e","Type":"ContainerStarted","Data":"7a7d3028d709d9932c844a7b119de7ac47cd154a4dac2854c0753a5695f0020f"}
Apr 16 20:25:37.927478 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:37.927438 2572 generic.go:358] "Generic (PLEG): container finished" podID="6c990bfd-fc63-4765-b77c-67a8258b026c" containerID="1e248dd553612ad98afb82131713955a5efe43657859c824479ef993b84fd7de" exitCode=0
Apr 16 20:25:37.928063 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:37.927528 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t" event={"ID":"6c990bfd-fc63-4765-b77c-67a8258b026c","Type":"ContainerDied","Data":"1e248dd553612ad98afb82131713955a5efe43657859c824479ef993b84fd7de"}
Apr 16 20:25:38.934702 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:38.934655 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t" event={"ID":"6c990bfd-fc63-4765-b77c-67a8258b026c","Type":"ContainerStarted","Data":"9cd9a4b8bfd05247118249d525850d8978a3ae6dedfd40c5430692df5c021b69"}
Apr 16 20:25:38.934702 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:38.934698 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t" event={"ID":"6c990bfd-fc63-4765-b77c-67a8258b026c","Type":"ContainerStarted","Data":"eaad64231d00c30c8eee0c05ab9385febd9252c64547d031376c1509dde8ff06"}
Apr 16 20:25:38.935208 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:38.934848 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t"
Apr 16 20:25:38.955412 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:38.955358 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t" podStartSLOduration=2.955342339 podStartE2EDuration="2.955342339s" podCreationTimestamp="2026-04-16 20:25:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:25:38.953455561 +0000 UTC m=+830.390444072" watchObservedRunningTime="2026-04-16 20:25:38.955342339 +0000 UTC m=+830.392330848"
Apr 16 20:25:40.944313 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:40.944278 2572 generic.go:358] "Generic (PLEG): container finished" podID="9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e" containerID="ff4a073080621b1277d72f7574dae504faea72011941c8d60b92d2e45455d897" exitCode=0
Apr 16 20:25:40.944717 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:40.944363 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-hjq97" event={"ID":"9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e","Type":"ContainerDied","Data":"ff4a073080621b1277d72f7574dae504faea72011941c8d60b92d2e45455d897"}
Apr 16 20:25:41.950615 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:41.950581 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-hjq97" event={"ID":"9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e","Type":"ContainerStarted","Data":"4e1fda105ad605a99e143cb976aa98dd8799fcbf1e815a0a0d107c7a60f24b49"}
Apr 16 20:25:41.970116 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:41.970060 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-hjq97" podStartSLOduration=6.970040678 podStartE2EDuration="6.970040678s" podCreationTimestamp="2026-04-16 20:25:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:25:41.967965242 +0000 UTC m=+833.404953763" watchObservedRunningTime="2026-04-16 20:25:41.970040678 +0000 UTC m=+833.407029190"
Apr 16 20:25:46.127444 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:46.127409 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-hjq97"
Apr 16 20:25:46.127444 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:46.127446 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-hjq97"
Apr 16 20:25:46.139719 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:46.139681 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-hjq97"
Apr 16 20:25:46.461644 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:46.461546 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t"
Apr 16 20:25:46.461644 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:46.461593 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t"
Apr 16 20:25:46.462906 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:25:46.462858 2572 logging.go:55] [core] [Channel #44 SubChannel #45]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.37:9003", ServerName: "10.134.0.37:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.37:9003: connect: connection refused"
Apr 16 20:25:46.464207 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:46.464186 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t"
Apr 16 20:25:46.969768 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:46.969741 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t"
Apr 16 20:25:46.979973 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:46.979948 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-hjq97"
Apr 16 20:25:47.462554 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:47.462509 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t" podUID="6c990bfd-fc63-4765-b77c-67a8258b026c" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.37:9003\" within 1s: context deadline exceeded"
Apr 16 20:25:56.461629 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:25:56.461596 2572 logging.go:55] [core] [Channel #46 SubChannel #47]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.37:9003", ServerName: "10.134.0.37:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.37:9003: connect: connection refused"
Apr 16 20:25:57.462250 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:25:57.462205 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t" podUID="6c990bfd-fc63-4765-b77c-67a8258b026c" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.37:9003\" within 1s: context deadline exceeded"
Apr 16 20:26:08.977243 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:08.977214 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t"
Apr 16 20:26:10.088566 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:10.088517 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-hjq97"]
Apr 16 20:26:10.088969 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:10.088795 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-hjq97" podUID="9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e" containerName="main" containerID="cri-o://4e1fda105ad605a99e143cb976aa98dd8799fcbf1e815a0a0d107c7a60f24b49" gracePeriod=30
Apr 16 20:26:10.098647 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:10.098617 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t"]
Apr 16 20:26:10.099021 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:10.098992 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t"
podUID="6c990bfd-fc63-4765-b77c-67a8258b026c" containerName="main" containerID="cri-o://eaad64231d00c30c8eee0c05ab9385febd9252c64547d031376c1509dde8ff06" gracePeriod=30 Apr 16 20:26:10.099119 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:10.099053 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t" podUID="6c990bfd-fc63-4765-b77c-67a8258b026c" containerName="tokenizer" containerID="cri-o://9cd9a4b8bfd05247118249d525850d8978a3ae6dedfd40c5430692df5c021b69" gracePeriod=30 Apr 16 20:26:10.345930 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:10.345843 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-hjq97" Apr 16 20:26:10.463465 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:10.463429 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e-kserve-provision-location\") pod \"9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e\" (UID: \"9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e\") " Apr 16 20:26:10.463648 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:10.463489 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e-home\") pod \"9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e\" (UID: \"9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e\") " Apr 16 20:26:10.463648 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:10.463508 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e-dshm\") pod \"9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e\" (UID: \"9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e\") " Apr 16 20:26:10.463648 ip-10-0-138-62 kubenswrapper[2572]: I0416 
20:26:10.463548 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e-model-cache\") pod \"9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e\" (UID: \"9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e\") " Apr 16 20:26:10.463648 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:10.463570 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e-tls-certs\") pod \"9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e\" (UID: \"9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e\") " Apr 16 20:26:10.463648 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:10.463624 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clpng\" (UniqueName: \"kubernetes.io/projected/9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e-kube-api-access-clpng\") pod \"9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e\" (UID: \"9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e\") " Apr 16 20:26:10.463968 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:10.463884 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e-home" (OuterVolumeSpecName: "home") pod "9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e" (UID: "9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:26:10.463968 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:10.463939 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e-model-cache" (OuterVolumeSpecName: "model-cache") pod "9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e" (UID: "9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:26:10.464072 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:10.463953 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e-home\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:26:10.465841 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:10.465809 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e" (UID: "9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:26:10.465841 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:10.465813 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e-dshm" (OuterVolumeSpecName: "dshm") pod "9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e" (UID: "9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:26:10.466232 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:10.466211 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e-kube-api-access-clpng" (OuterVolumeSpecName: "kube-api-access-clpng") pod "9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e" (UID: "9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e"). InnerVolumeSpecName "kube-api-access-clpng". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:26:10.519440 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:10.519397 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e" (UID: "9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:26:10.564988 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:10.564953 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e-dshm\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:26:10.564988 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:10.564981 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e-model-cache\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:26:10.564988 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:10.564991 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e-tls-certs\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:26:10.565210 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:10.565002 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-clpng\" (UniqueName: \"kubernetes.io/projected/9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e-kube-api-access-clpng\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:26:10.565210 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:10.565011 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e-kserve-provision-location\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:26:11.052730 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:11.052697 2572 generic.go:358] "Generic (PLEG): container finished" podID="6c990bfd-fc63-4765-b77c-67a8258b026c" containerID="eaad64231d00c30c8eee0c05ab9385febd9252c64547d031376c1509dde8ff06" exitCode=0 Apr 16 20:26:11.052959 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:11.052766 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t" event={"ID":"6c990bfd-fc63-4765-b77c-67a8258b026c","Type":"ContainerDied","Data":"eaad64231d00c30c8eee0c05ab9385febd9252c64547d031376c1509dde8ff06"} Apr 16 20:26:11.054241 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:11.054214 2572 generic.go:358] "Generic (PLEG): container finished" podID="9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e" containerID="4e1fda105ad605a99e143cb976aa98dd8799fcbf1e815a0a0d107c7a60f24b49" exitCode=0 Apr 16 20:26:11.054389 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:11.054282 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-hjq97" Apr 16 20:26:11.054389 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:11.054302 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-hjq97" event={"ID":"9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e","Type":"ContainerDied","Data":"4e1fda105ad605a99e143cb976aa98dd8799fcbf1e815a0a0d107c7a60f24b49"} Apr 16 20:26:11.054389 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:11.054342 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-hjq97" event={"ID":"9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e","Type":"ContainerDied","Data":"7a7d3028d709d9932c844a7b119de7ac47cd154a4dac2854c0753a5695f0020f"} Apr 16 20:26:11.054389 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:11.054365 2572 scope.go:117] "RemoveContainer" containerID="4e1fda105ad605a99e143cb976aa98dd8799fcbf1e815a0a0d107c7a60f24b49" Apr 16 20:26:11.064288 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:11.064270 2572 scope.go:117] "RemoveContainer" containerID="ff4a073080621b1277d72f7574dae504faea72011941c8d60b92d2e45455d897" Apr 16 20:26:11.078051 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:11.078023 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-hjq97"] Apr 16 20:26:11.081155 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:11.081132 2572 scope.go:117] "RemoveContainer" containerID="4e1fda105ad605a99e143cb976aa98dd8799fcbf1e815a0a0d107c7a60f24b49" Apr 16 20:26:11.081597 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:26:11.081576 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e1fda105ad605a99e143cb976aa98dd8799fcbf1e815a0a0d107c7a60f24b49\": container with ID starting with 4e1fda105ad605a99e143cb976aa98dd8799fcbf1e815a0a0d107c7a60f24b49 
not found: ID does not exist" containerID="4e1fda105ad605a99e143cb976aa98dd8799fcbf1e815a0a0d107c7a60f24b49" Apr 16 20:26:11.081696 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:11.081606 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e1fda105ad605a99e143cb976aa98dd8799fcbf1e815a0a0d107c7a60f24b49"} err="failed to get container status \"4e1fda105ad605a99e143cb976aa98dd8799fcbf1e815a0a0d107c7a60f24b49\": rpc error: code = NotFound desc = could not find container \"4e1fda105ad605a99e143cb976aa98dd8799fcbf1e815a0a0d107c7a60f24b49\": container with ID starting with 4e1fda105ad605a99e143cb976aa98dd8799fcbf1e815a0a0d107c7a60f24b49 not found: ID does not exist" Apr 16 20:26:11.081696 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:11.081624 2572 scope.go:117] "RemoveContainer" containerID="ff4a073080621b1277d72f7574dae504faea72011941c8d60b92d2e45455d897" Apr 16 20:26:11.081928 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:26:11.081908 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff4a073080621b1277d72f7574dae504faea72011941c8d60b92d2e45455d897\": container with ID starting with ff4a073080621b1277d72f7574dae504faea72011941c8d60b92d2e45455d897 not found: ID does not exist" containerID="ff4a073080621b1277d72f7574dae504faea72011941c8d60b92d2e45455d897" Apr 16 20:26:11.081992 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:11.081937 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff4a073080621b1277d72f7574dae504faea72011941c8d60b92d2e45455d897"} err="failed to get container status \"ff4a073080621b1277d72f7574dae504faea72011941c8d60b92d2e45455d897\": rpc error: code = NotFound desc = could not find container \"ff4a073080621b1277d72f7574dae504faea72011941c8d60b92d2e45455d897\": container with ID starting with ff4a073080621b1277d72f7574dae504faea72011941c8d60b92d2e45455d897 not found: ID does 
not exist" Apr 16 20:26:11.082166 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:11.082147 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6855fc94cd-hjq97"] Apr 16 20:26:11.149702 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:11.149667 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e" path="/var/lib/kubelet/pods/9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e/volumes" Apr 16 20:26:11.645568 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:11.645541 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t" Apr 16 20:26:11.779605 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:11.779569 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6c990bfd-fc63-4765-b77c-67a8258b026c-tokenizer-cache\") pod \"6c990bfd-fc63-4765-b77c-67a8258b026c\" (UID: \"6c990bfd-fc63-4765-b77c-67a8258b026c\") " Apr 16 20:26:11.779793 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:11.779618 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6c990bfd-fc63-4765-b77c-67a8258b026c-tokenizer-uds\") pod \"6c990bfd-fc63-4765-b77c-67a8258b026c\" (UID: \"6c990bfd-fc63-4765-b77c-67a8258b026c\") " Apr 16 20:26:11.779793 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:11.779651 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6c990bfd-fc63-4765-b77c-67a8258b026c-kserve-provision-location\") pod \"6c990bfd-fc63-4765-b77c-67a8258b026c\" (UID: \"6c990bfd-fc63-4765-b77c-67a8258b026c\") " Apr 16 20:26:11.779793 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:11.779670 2572 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6c990bfd-fc63-4765-b77c-67a8258b026c-tls-certs\") pod \"6c990bfd-fc63-4765-b77c-67a8258b026c\" (UID: \"6c990bfd-fc63-4765-b77c-67a8258b026c\") " Apr 16 20:26:11.779793 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:11.779692 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skhn9\" (UniqueName: \"kubernetes.io/projected/6c990bfd-fc63-4765-b77c-67a8258b026c-kube-api-access-skhn9\") pod \"6c990bfd-fc63-4765-b77c-67a8258b026c\" (UID: \"6c990bfd-fc63-4765-b77c-67a8258b026c\") " Apr 16 20:26:11.779793 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:11.779719 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6c990bfd-fc63-4765-b77c-67a8258b026c-tokenizer-tmp\") pod \"6c990bfd-fc63-4765-b77c-67a8258b026c\" (UID: \"6c990bfd-fc63-4765-b77c-67a8258b026c\") " Apr 16 20:26:11.780093 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:11.779958 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c990bfd-fc63-4765-b77c-67a8258b026c-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "6c990bfd-fc63-4765-b77c-67a8258b026c" (UID: "6c990bfd-fc63-4765-b77c-67a8258b026c"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:26:11.780093 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:11.780013 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c990bfd-fc63-4765-b77c-67a8258b026c-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "6c990bfd-fc63-4765-b77c-67a8258b026c" (UID: "6c990bfd-fc63-4765-b77c-67a8258b026c"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:26:11.780193 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:11.780156 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c990bfd-fc63-4765-b77c-67a8258b026c-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "6c990bfd-fc63-4765-b77c-67a8258b026c" (UID: "6c990bfd-fc63-4765-b77c-67a8258b026c"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:26:11.780462 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:11.780439 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c990bfd-fc63-4765-b77c-67a8258b026c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6c990bfd-fc63-4765-b77c-67a8258b026c" (UID: "6c990bfd-fc63-4765-b77c-67a8258b026c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:26:11.781850 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:11.781830 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c990bfd-fc63-4765-b77c-67a8258b026c-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "6c990bfd-fc63-4765-b77c-67a8258b026c" (UID: "6c990bfd-fc63-4765-b77c-67a8258b026c"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:26:11.782257 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:11.782229 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c990bfd-fc63-4765-b77c-67a8258b026c-kube-api-access-skhn9" (OuterVolumeSpecName: "kube-api-access-skhn9") pod "6c990bfd-fc63-4765-b77c-67a8258b026c" (UID: "6c990bfd-fc63-4765-b77c-67a8258b026c"). InnerVolumeSpecName "kube-api-access-skhn9". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:26:11.881084 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:11.881055 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6c990bfd-fc63-4765-b77c-67a8258b026c-tokenizer-cache\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:26:11.881084 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:11.881079 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6c990bfd-fc63-4765-b77c-67a8258b026c-tokenizer-uds\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:26:11.881084 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:11.881090 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6c990bfd-fc63-4765-b77c-67a8258b026c-kserve-provision-location\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:26:11.881281 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:11.881102 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6c990bfd-fc63-4765-b77c-67a8258b026c-tls-certs\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:26:11.881281 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:11.881113 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-skhn9\" (UniqueName: \"kubernetes.io/projected/6c990bfd-fc63-4765-b77c-67a8258b026c-kube-api-access-skhn9\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:26:11.881281 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:11.881122 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6c990bfd-fc63-4765-b77c-67a8258b026c-tokenizer-tmp\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:26:12.059192 ip-10-0-138-62 kubenswrapper[2572]: 
I0416 20:26:12.059106 2572 generic.go:358] "Generic (PLEG): container finished" podID="6c990bfd-fc63-4765-b77c-67a8258b026c" containerID="9cd9a4b8bfd05247118249d525850d8978a3ae6dedfd40c5430692df5c021b69" exitCode=0 Apr 16 20:26:12.059192 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:12.059179 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t" Apr 16 20:26:12.059427 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:12.059192 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t" event={"ID":"6c990bfd-fc63-4765-b77c-67a8258b026c","Type":"ContainerDied","Data":"9cd9a4b8bfd05247118249d525850d8978a3ae6dedfd40c5430692df5c021b69"} Apr 16 20:26:12.059427 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:12.059228 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t" event={"ID":"6c990bfd-fc63-4765-b77c-67a8258b026c","Type":"ContainerDied","Data":"6c7f041f38d7667df530e53e25ba39f30dc546bb87e0301339b1fee9f083c9c8"} Apr 16 20:26:12.059427 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:12.059246 2572 scope.go:117] "RemoveContainer" containerID="9cd9a4b8bfd05247118249d525850d8978a3ae6dedfd40c5430692df5c021b69" Apr 16 20:26:12.067430 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:12.067413 2572 scope.go:117] "RemoveContainer" containerID="eaad64231d00c30c8eee0c05ab9385febd9252c64547d031376c1509dde8ff06" Apr 16 20:26:12.074938 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:12.074904 2572 scope.go:117] "RemoveContainer" containerID="1e248dd553612ad98afb82131713955a5efe43657859c824479ef993b84fd7de" Apr 16 20:26:12.079803 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:12.079776 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t"] Apr 16 20:26:12.084257 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:12.084208 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-8b796954fhb5t"] Apr 16 20:26:12.086646 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:12.086622 2572 scope.go:117] "RemoveContainer" containerID="9cd9a4b8bfd05247118249d525850d8978a3ae6dedfd40c5430692df5c021b69" Apr 16 20:26:12.086921 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:26:12.086901 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cd9a4b8bfd05247118249d525850d8978a3ae6dedfd40c5430692df5c021b69\": container with ID starting with 9cd9a4b8bfd05247118249d525850d8978a3ae6dedfd40c5430692df5c021b69 not found: ID does not exist" containerID="9cd9a4b8bfd05247118249d525850d8978a3ae6dedfd40c5430692df5c021b69" Apr 16 20:26:12.087002 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:12.086934 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cd9a4b8bfd05247118249d525850d8978a3ae6dedfd40c5430692df5c021b69"} err="failed to get container status \"9cd9a4b8bfd05247118249d525850d8978a3ae6dedfd40c5430692df5c021b69\": rpc error: code = NotFound desc = could not find container \"9cd9a4b8bfd05247118249d525850d8978a3ae6dedfd40c5430692df5c021b69\": container with ID starting with 9cd9a4b8bfd05247118249d525850d8978a3ae6dedfd40c5430692df5c021b69 not found: ID does not exist" Apr 16 20:26:12.087002 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:12.086961 2572 scope.go:117] "RemoveContainer" containerID="eaad64231d00c30c8eee0c05ab9385febd9252c64547d031376c1509dde8ff06" Apr 16 20:26:12.087208 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:26:12.087190 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"eaad64231d00c30c8eee0c05ab9385febd9252c64547d031376c1509dde8ff06\": container with ID starting with eaad64231d00c30c8eee0c05ab9385febd9252c64547d031376c1509dde8ff06 not found: ID does not exist" containerID="eaad64231d00c30c8eee0c05ab9385febd9252c64547d031376c1509dde8ff06" Apr 16 20:26:12.087256 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:12.087214 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaad64231d00c30c8eee0c05ab9385febd9252c64547d031376c1509dde8ff06"} err="failed to get container status \"eaad64231d00c30c8eee0c05ab9385febd9252c64547d031376c1509dde8ff06\": rpc error: code = NotFound desc = could not find container \"eaad64231d00c30c8eee0c05ab9385febd9252c64547d031376c1509dde8ff06\": container with ID starting with eaad64231d00c30c8eee0c05ab9385febd9252c64547d031376c1509dde8ff06 not found: ID does not exist" Apr 16 20:26:12.087256 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:12.087230 2572 scope.go:117] "RemoveContainer" containerID="1e248dd553612ad98afb82131713955a5efe43657859c824479ef993b84fd7de" Apr 16 20:26:12.087446 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:26:12.087429 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e248dd553612ad98afb82131713955a5efe43657859c824479ef993b84fd7de\": container with ID starting with 1e248dd553612ad98afb82131713955a5efe43657859c824479ef993b84fd7de not found: ID does not exist" containerID="1e248dd553612ad98afb82131713955a5efe43657859c824479ef993b84fd7de" Apr 16 20:26:12.087490 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:12.087452 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e248dd553612ad98afb82131713955a5efe43657859c824479ef993b84fd7de"} err="failed to get container status \"1e248dd553612ad98afb82131713955a5efe43657859c824479ef993b84fd7de\": rpc error: code = NotFound desc = could not find container 
\"1e248dd553612ad98afb82131713955a5efe43657859c824479ef993b84fd7de\": container with ID starting with 1e248dd553612ad98afb82131713955a5efe43657859c824479ef993b84fd7de not found: ID does not exist" Apr 16 20:26:13.150072 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:13.150038 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c990bfd-fc63-4765-b77c-67a8258b026c" path="/var/lib/kubelet/pods/6c990bfd-fc63-4765-b77c-67a8258b026c/volumes" Apr 16 20:26:16.513066 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:16.513033 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-5b5459ccc8-w596h"] Apr 16 20:26:16.513597 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:16.513578 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e" containerName="storage-initializer" Apr 16 20:26:16.513639 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:16.513601 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e" containerName="storage-initializer" Apr 16 20:26:16.513639 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:16.513625 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c990bfd-fc63-4765-b77c-67a8258b026c" containerName="storage-initializer" Apr 16 20:26:16.513639 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:16.513634 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c990bfd-fc63-4765-b77c-67a8258b026c" containerName="storage-initializer" Apr 16 20:26:16.513726 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:16.513644 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c990bfd-fc63-4765-b77c-67a8258b026c" containerName="main" Apr 16 20:26:16.513726 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:16.513653 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c990bfd-fc63-4765-b77c-67a8258b026c" 
containerName="main" Apr 16 20:26:16.513726 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:16.513662 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c990bfd-fc63-4765-b77c-67a8258b026c" containerName="tokenizer" Apr 16 20:26:16.513726 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:16.513671 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c990bfd-fc63-4765-b77c-67a8258b026c" containerName="tokenizer" Apr 16 20:26:16.513726 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:16.513700 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e" containerName="main" Apr 16 20:26:16.513726 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:16.513708 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e" containerName="main" Apr 16 20:26:16.513924 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:16.513788 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="9ccbef4b-6f34-4ab3-aae4-01d0f8cbac2e" containerName="main" Apr 16 20:26:16.513924 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:16.513801 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="6c990bfd-fc63-4765-b77c-67a8258b026c" containerName="tokenizer" Apr 16 20:26:16.513924 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:16.513816 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="6c990bfd-fc63-4765-b77c-67a8258b026c" containerName="main" Apr 16 20:26:16.518951 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:16.518928 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b5459ccc8-w596h" Apr 16 20:26:16.521668 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:16.521642 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-s7skk\"" Apr 16 20:26:16.522400 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:16.522170 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"conv-test-round-trip-kserve-self-signed-certs\"" Apr 16 20:26:16.522400 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:16.522322 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 20:26:16.522400 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:16.522385 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 20:26:16.533666 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:16.533638 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-5b5459ccc8-w596h"] Apr 16 20:26:16.619223 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:16.619186 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4e4324a4-5100-48bb-8c83-79191f4a4919-home\") pod \"conv-test-round-trip-kserve-5b5459ccc8-w596h\" (UID: \"4e4324a4-5100-48bb-8c83-79191f4a4919\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b5459ccc8-w596h" Apr 16 20:26:16.619223 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:16.619222 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4e4324a4-5100-48bb-8c83-79191f4a4919-dshm\") pod \"conv-test-round-trip-kserve-5b5459ccc8-w596h\" (UID: \"4e4324a4-5100-48bb-8c83-79191f4a4919\") " 
pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b5459ccc8-w596h" Apr 16 20:26:16.619408 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:16.619284 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4e4324a4-5100-48bb-8c83-79191f4a4919-kserve-provision-location\") pod \"conv-test-round-trip-kserve-5b5459ccc8-w596h\" (UID: \"4e4324a4-5100-48bb-8c83-79191f4a4919\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b5459ccc8-w596h" Apr 16 20:26:16.619408 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:16.619323 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4e4324a4-5100-48bb-8c83-79191f4a4919-tls-certs\") pod \"conv-test-round-trip-kserve-5b5459ccc8-w596h\" (UID: \"4e4324a4-5100-48bb-8c83-79191f4a4919\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b5459ccc8-w596h" Apr 16 20:26:16.619408 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:16.619355 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4e4324a4-5100-48bb-8c83-79191f4a4919-model-cache\") pod \"conv-test-round-trip-kserve-5b5459ccc8-w596h\" (UID: \"4e4324a4-5100-48bb-8c83-79191f4a4919\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b5459ccc8-w596h" Apr 16 20:26:16.619408 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:16.619381 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw5qm\" (UniqueName: \"kubernetes.io/projected/4e4324a4-5100-48bb-8c83-79191f4a4919-kube-api-access-pw5qm\") pod \"conv-test-round-trip-kserve-5b5459ccc8-w596h\" (UID: \"4e4324a4-5100-48bb-8c83-79191f4a4919\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b5459ccc8-w596h" Apr 16 20:26:16.720054 ip-10-0-138-62 
kubenswrapper[2572]: I0416 20:26:16.720019 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pw5qm\" (UniqueName: \"kubernetes.io/projected/4e4324a4-5100-48bb-8c83-79191f4a4919-kube-api-access-pw5qm\") pod \"conv-test-round-trip-kserve-5b5459ccc8-w596h\" (UID: \"4e4324a4-5100-48bb-8c83-79191f4a4919\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b5459ccc8-w596h" Apr 16 20:26:16.720395 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:16.720117 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4e4324a4-5100-48bb-8c83-79191f4a4919-home\") pod \"conv-test-round-trip-kserve-5b5459ccc8-w596h\" (UID: \"4e4324a4-5100-48bb-8c83-79191f4a4919\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b5459ccc8-w596h" Apr 16 20:26:16.720395 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:16.720143 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4e4324a4-5100-48bb-8c83-79191f4a4919-dshm\") pod \"conv-test-round-trip-kserve-5b5459ccc8-w596h\" (UID: \"4e4324a4-5100-48bb-8c83-79191f4a4919\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b5459ccc8-w596h" Apr 16 20:26:16.720395 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:16.720192 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4e4324a4-5100-48bb-8c83-79191f4a4919-kserve-provision-location\") pod \"conv-test-round-trip-kserve-5b5459ccc8-w596h\" (UID: \"4e4324a4-5100-48bb-8c83-79191f4a4919\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b5459ccc8-w596h" Apr 16 20:26:16.720395 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:16.720226 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4e4324a4-5100-48bb-8c83-79191f4a4919-tls-certs\") pod \"conv-test-round-trip-kserve-5b5459ccc8-w596h\" (UID: \"4e4324a4-5100-48bb-8c83-79191f4a4919\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b5459ccc8-w596h" Apr 16 20:26:16.720395 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:16.720269 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4e4324a4-5100-48bb-8c83-79191f4a4919-model-cache\") pod \"conv-test-round-trip-kserve-5b5459ccc8-w596h\" (UID: \"4e4324a4-5100-48bb-8c83-79191f4a4919\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b5459ccc8-w596h" Apr 16 20:26:16.720686 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:16.720628 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4e4324a4-5100-48bb-8c83-79191f4a4919-kserve-provision-location\") pod \"conv-test-round-trip-kserve-5b5459ccc8-w596h\" (UID: \"4e4324a4-5100-48bb-8c83-79191f4a4919\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b5459ccc8-w596h" Apr 16 20:26:16.720686 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:16.720667 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4e4324a4-5100-48bb-8c83-79191f4a4919-model-cache\") pod \"conv-test-round-trip-kserve-5b5459ccc8-w596h\" (UID: \"4e4324a4-5100-48bb-8c83-79191f4a4919\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b5459ccc8-w596h" Apr 16 20:26:16.720772 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:16.720692 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4e4324a4-5100-48bb-8c83-79191f4a4919-home\") pod \"conv-test-round-trip-kserve-5b5459ccc8-w596h\" (UID: \"4e4324a4-5100-48bb-8c83-79191f4a4919\") " 
pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b5459ccc8-w596h" Apr 16 20:26:16.722526 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:16.722501 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4e4324a4-5100-48bb-8c83-79191f4a4919-dshm\") pod \"conv-test-round-trip-kserve-5b5459ccc8-w596h\" (UID: \"4e4324a4-5100-48bb-8c83-79191f4a4919\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b5459ccc8-w596h" Apr 16 20:26:16.722668 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:16.722653 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4e4324a4-5100-48bb-8c83-79191f4a4919-tls-certs\") pod \"conv-test-round-trip-kserve-5b5459ccc8-w596h\" (UID: \"4e4324a4-5100-48bb-8c83-79191f4a4919\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b5459ccc8-w596h" Apr 16 20:26:16.732171 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:16.732147 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw5qm\" (UniqueName: \"kubernetes.io/projected/4e4324a4-5100-48bb-8c83-79191f4a4919-kube-api-access-pw5qm\") pod \"conv-test-round-trip-kserve-5b5459ccc8-w596h\" (UID: \"4e4324a4-5100-48bb-8c83-79191f4a4919\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b5459ccc8-w596h" Apr 16 20:26:16.837060 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:16.836964 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b5459ccc8-w596h" Apr 16 20:26:16.978169 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:16.978134 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-5b5459ccc8-w596h"] Apr 16 20:26:16.980515 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:26:16.980479 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e4324a4_5100_48bb_8c83_79191f4a4919.slice/crio-5ea4a615b0316b87598bc4e96b57dbd741de94b970066d12bfdc38c15e24e7c2 WatchSource:0}: Error finding container 5ea4a615b0316b87598bc4e96b57dbd741de94b970066d12bfdc38c15e24e7c2: Status 404 returned error can't find the container with id 5ea4a615b0316b87598bc4e96b57dbd741de94b970066d12bfdc38c15e24e7c2 Apr 16 20:26:17.080284 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:17.080226 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b5459ccc8-w596h" event={"ID":"4e4324a4-5100-48bb-8c83-79191f4a4919","Type":"ContainerStarted","Data":"6d0f87520fdad22aa0ae25c645d3029f9cbd1741d0c011a90a63607d8e70d395"} Apr 16 20:26:17.080468 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:17.080289 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b5459ccc8-w596h" event={"ID":"4e4324a4-5100-48bb-8c83-79191f4a4919","Type":"ContainerStarted","Data":"5ea4a615b0316b87598bc4e96b57dbd741de94b970066d12bfdc38c15e24e7c2"} Apr 16 20:26:22.100187 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:22.100150 2572 generic.go:358] "Generic (PLEG): container finished" podID="4e4324a4-5100-48bb-8c83-79191f4a4919" containerID="6d0f87520fdad22aa0ae25c645d3029f9cbd1741d0c011a90a63607d8e70d395" exitCode=0 Apr 16 20:26:22.100616 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:22.100222 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b5459ccc8-w596h" event={"ID":"4e4324a4-5100-48bb-8c83-79191f4a4919","Type":"ContainerDied","Data":"6d0f87520fdad22aa0ae25c645d3029f9cbd1741d0c011a90a63607d8e70d395"} Apr 16 20:26:44.367589 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:44.367557 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq"] Apr 16 20:26:44.399107 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:44.399075 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq"] Apr 16 20:26:44.399256 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:44.399173 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq" Apr 16 20:26:44.401681 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:44.401653 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 16 20:26:44.401800 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:44.401662 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-tkhfx\"" Apr 16 20:26:44.491152 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:44.491120 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c132d908-8c99-4d84-ada6-0f8a65b31eff-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq\" (UID: \"c132d908-8c99-4d84-ada6-0f8a65b31eff\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq" Apr 16 20:26:44.491307 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:44.491192 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffpjw\" (UniqueName: \"kubernetes.io/projected/c132d908-8c99-4d84-ada6-0f8a65b31eff-kube-api-access-ffpjw\") pod \"stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq\" (UID: \"c132d908-8c99-4d84-ada6-0f8a65b31eff\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq" Apr 16 20:26:44.491307 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:44.491218 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c132d908-8c99-4d84-ada6-0f8a65b31eff-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq\" (UID: \"c132d908-8c99-4d84-ada6-0f8a65b31eff\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq" Apr 16 20:26:44.491307 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:44.491252 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c132d908-8c99-4d84-ada6-0f8a65b31eff-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq\" (UID: \"c132d908-8c99-4d84-ada6-0f8a65b31eff\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq" Apr 16 20:26:44.491307 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:44.491297 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c132d908-8c99-4d84-ada6-0f8a65b31eff-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq\" (UID: \"c132d908-8c99-4d84-ada6-0f8a65b31eff\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq" Apr 16 20:26:44.491468 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:44.491315 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c132d908-8c99-4d84-ada6-0f8a65b31eff-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq\" (UID: \"c132d908-8c99-4d84-ada6-0f8a65b31eff\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq" Apr 16 20:26:44.592755 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:44.592714 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c132d908-8c99-4d84-ada6-0f8a65b31eff-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq\" (UID: \"c132d908-8c99-4d84-ada6-0f8a65b31eff\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq" Apr 16 20:26:44.592944 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:44.592812 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ffpjw\" (UniqueName: \"kubernetes.io/projected/c132d908-8c99-4d84-ada6-0f8a65b31eff-kube-api-access-ffpjw\") pod \"stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq\" (UID: \"c132d908-8c99-4d84-ada6-0f8a65b31eff\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq" Apr 16 20:26:44.592944 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:44.592845 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c132d908-8c99-4d84-ada6-0f8a65b31eff-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq\" (UID: \"c132d908-8c99-4d84-ada6-0f8a65b31eff\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq" Apr 16 20:26:44.592944 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:44.592924 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c132d908-8c99-4d84-ada6-0f8a65b31eff-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq\" (UID: \"c132d908-8c99-4d84-ada6-0f8a65b31eff\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq" Apr 16 20:26:44.593132 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:44.592961 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c132d908-8c99-4d84-ada6-0f8a65b31eff-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq\" (UID: \"c132d908-8c99-4d84-ada6-0f8a65b31eff\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq" Apr 16 20:26:44.593132 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:44.592991 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c132d908-8c99-4d84-ada6-0f8a65b31eff-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq\" (UID: \"c132d908-8c99-4d84-ada6-0f8a65b31eff\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq" Apr 16 20:26:44.593241 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:44.593199 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c132d908-8c99-4d84-ada6-0f8a65b31eff-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq\" (UID: \"c132d908-8c99-4d84-ada6-0f8a65b31eff\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq" Apr 16 20:26:44.593690 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:44.593402 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/c132d908-8c99-4d84-ada6-0f8a65b31eff-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq\" (UID: \"c132d908-8c99-4d84-ada6-0f8a65b31eff\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq" Apr 16 20:26:44.593690 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:44.593675 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c132d908-8c99-4d84-ada6-0f8a65b31eff-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq\" (UID: \"c132d908-8c99-4d84-ada6-0f8a65b31eff\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq" Apr 16 20:26:44.593909 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:44.593739 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c132d908-8c99-4d84-ada6-0f8a65b31eff-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq\" (UID: \"c132d908-8c99-4d84-ada6-0f8a65b31eff\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq" Apr 16 20:26:44.596131 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:44.596086 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c132d908-8c99-4d84-ada6-0f8a65b31eff-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq\" (UID: \"c132d908-8c99-4d84-ada6-0f8a65b31eff\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq" Apr 16 20:26:44.605736 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:44.605694 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffpjw\" (UniqueName: \"kubernetes.io/projected/c132d908-8c99-4d84-ada6-0f8a65b31eff-kube-api-access-ffpjw\") pod 
\"stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq\" (UID: \"c132d908-8c99-4d84-ada6-0f8a65b31eff\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq" Apr 16 20:26:44.709109 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:44.709029 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq" Apr 16 20:26:44.847479 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:44.847404 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq"] Apr 16 20:26:44.848921 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:26:44.848856 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc132d908_8c99_4d84_ada6_0f8a65b31eff.slice/crio-5670b7377bac0bb7d999fe06bbc06435417ae776891036dae3edea16f2d90d0c WatchSource:0}: Error finding container 5670b7377bac0bb7d999fe06bbc06435417ae776891036dae3edea16f2d90d0c: Status 404 returned error can't find the container with id 5670b7377bac0bb7d999fe06bbc06435417ae776891036dae3edea16f2d90d0c Apr 16 20:26:45.189248 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:45.189207 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq" event={"ID":"c132d908-8c99-4d84-ada6-0f8a65b31eff","Type":"ContainerStarted","Data":"cf4802435ede99055e918420cfe202f7d7ebf877da5827127191db8517712f03"} Apr 16 20:26:45.189248 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:45.189251 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq" event={"ID":"c132d908-8c99-4d84-ada6-0f8a65b31eff","Type":"ContainerStarted","Data":"5670b7377bac0bb7d999fe06bbc06435417ae776891036dae3edea16f2d90d0c"} Apr 16 20:26:46.194260 ip-10-0-138-62 
kubenswrapper[2572]: I0416 20:26:46.194219 2572 generic.go:358] "Generic (PLEG): container finished" podID="c132d908-8c99-4d84-ada6-0f8a65b31eff" containerID="cf4802435ede99055e918420cfe202f7d7ebf877da5827127191db8517712f03" exitCode=0 Apr 16 20:26:46.194638 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:46.194304 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq" event={"ID":"c132d908-8c99-4d84-ada6-0f8a65b31eff","Type":"ContainerDied","Data":"cf4802435ede99055e918420cfe202f7d7ebf877da5827127191db8517712f03"} Apr 16 20:26:47.200770 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:47.200737 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq" event={"ID":"c132d908-8c99-4d84-ada6-0f8a65b31eff","Type":"ContainerStarted","Data":"ef00335b0c36762efbcad7204dead832465d7d00c0d24271237b0a0c707ea692"} Apr 16 20:26:47.200770 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:47.200773 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq" event={"ID":"c132d908-8c99-4d84-ada6-0f8a65b31eff","Type":"ContainerStarted","Data":"58fbae07aa6172ace3a97bf78630daae8846304f77aaa645844801d211da4900"} Apr 16 20:26:47.201196 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:47.200806 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq" Apr 16 20:26:47.221333 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:47.221276 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq" podStartSLOduration=3.221261542 podStartE2EDuration="3.221261542s" podCreationTimestamp="2026-04-16 20:26:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:26:47.220232865 +0000 UTC m=+898.657221412" watchObservedRunningTime="2026-04-16 20:26:47.221261542 +0000 UTC m=+898.658250050" Apr 16 20:26:47.301158 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:47.301123 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-5b5459ccc8-w596h"] Apr 16 20:26:49.129223 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:49.129106 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9gf47_10232a70-9b5f-414a-8efd-b5cff05a4f12/ovn-acl-logging/0.log" Apr 16 20:26:49.146188 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:49.137453 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9gf47_10232a70-9b5f-414a-8efd-b5cff05a4f12/ovn-acl-logging/0.log" Apr 16 20:26:54.709891 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:54.709837 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq" Apr 16 20:26:54.710343 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:54.709907 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq" Apr 16 20:26:54.713278 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:54.713253 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq" Apr 16 20:26:55.233104 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:26:55.233074 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq" Apr 16 20:27:06.279327 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:06.279287 2572 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b5459ccc8-w596h" event={"ID":"4e4324a4-5100-48bb-8c83-79191f4a4919","Type":"ContainerStarted","Data":"74331dd0ba0e51ae9a955587b639abaa13d99ac0ab5f2b3cd82d634d41de5167"} Apr 16 20:27:06.279829 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:06.279497 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b5459ccc8-w596h" podUID="4e4324a4-5100-48bb-8c83-79191f4a4919" containerName="main" containerID="cri-o://74331dd0ba0e51ae9a955587b639abaa13d99ac0ab5f2b3cd82d634d41de5167" gracePeriod=30 Apr 16 20:27:06.299710 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:06.299657 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b5459ccc8-w596h" podStartSLOduration=6.792289697 podStartE2EDuration="50.299641888s" podCreationTimestamp="2026-04-16 20:26:16 +0000 UTC" firstStartedPulling="2026-04-16 20:26:22.101333688 +0000 UTC m=+873.538322175" lastFinishedPulling="2026-04-16 20:27:05.608685865 +0000 UTC m=+917.045674366" observedRunningTime="2026-04-16 20:27:06.297414824 +0000 UTC m=+917.734403332" watchObservedRunningTime="2026-04-16 20:27:06.299641888 +0000 UTC m=+917.736630396" Apr 16 20:27:06.837662 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:06.837624 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b5459ccc8-w596h" Apr 16 20:27:16.237446 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:16.237416 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq" Apr 16 20:27:36.387848 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:36.387813 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_conv-test-round-trip-kserve-5b5459ccc8-w596h_4e4324a4-5100-48bb-8c83-79191f4a4919/main/0.log"
Apr 16 20:27:36.388238 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:36.388146 2572 generic.go:358] "Generic (PLEG): container finished" podID="4e4324a4-5100-48bb-8c83-79191f4a4919" containerID="74331dd0ba0e51ae9a955587b639abaa13d99ac0ab5f2b3cd82d634d41de5167" exitCode=137
Apr 16 20:27:36.388238 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:36.388222 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b5459ccc8-w596h" event={"ID":"4e4324a4-5100-48bb-8c83-79191f4a4919","Type":"ContainerDied","Data":"74331dd0ba0e51ae9a955587b639abaa13d99ac0ab5f2b3cd82d634d41de5167"}
Apr 16 20:27:36.569333 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:36.569310 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-round-trip-kserve-5b5459ccc8-w596h_4e4324a4-5100-48bb-8c83-79191f4a4919/main/0.log"
Apr 16 20:27:36.569662 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:36.569645 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b5459ccc8-w596h"
Apr 16 20:27:36.670090 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:36.670052 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4e4324a4-5100-48bb-8c83-79191f4a4919-model-cache\") pod \"4e4324a4-5100-48bb-8c83-79191f4a4919\" (UID: \"4e4324a4-5100-48bb-8c83-79191f4a4919\") "
Apr 16 20:27:36.670284 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:36.670128 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pw5qm\" (UniqueName: \"kubernetes.io/projected/4e4324a4-5100-48bb-8c83-79191f4a4919-kube-api-access-pw5qm\") pod \"4e4324a4-5100-48bb-8c83-79191f4a4919\" (UID: \"4e4324a4-5100-48bb-8c83-79191f4a4919\") "
Apr 16 20:27:36.670284 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:36.670186 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4e4324a4-5100-48bb-8c83-79191f4a4919-dshm\") pod \"4e4324a4-5100-48bb-8c83-79191f4a4919\" (UID: \"4e4324a4-5100-48bb-8c83-79191f4a4919\") "
Apr 16 20:27:36.670383 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:36.670317 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e4324a4-5100-48bb-8c83-79191f4a4919-model-cache" (OuterVolumeSpecName: "model-cache") pod "4e4324a4-5100-48bb-8c83-79191f4a4919" (UID: "4e4324a4-5100-48bb-8c83-79191f4a4919"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:27:36.670383 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:36.670325 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4e4324a4-5100-48bb-8c83-79191f4a4919-kserve-provision-location\") pod \"4e4324a4-5100-48bb-8c83-79191f4a4919\" (UID: \"4e4324a4-5100-48bb-8c83-79191f4a4919\") "
Apr 16 20:27:36.670383 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:36.670364 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4e4324a4-5100-48bb-8c83-79191f4a4919-home\") pod \"4e4324a4-5100-48bb-8c83-79191f4a4919\" (UID: \"4e4324a4-5100-48bb-8c83-79191f4a4919\") "
Apr 16 20:27:36.670548 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:36.670398 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4e4324a4-5100-48bb-8c83-79191f4a4919-tls-certs\") pod \"4e4324a4-5100-48bb-8c83-79191f4a4919\" (UID: \"4e4324a4-5100-48bb-8c83-79191f4a4919\") "
Apr 16 20:27:36.670617 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:36.670596 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e4324a4-5100-48bb-8c83-79191f4a4919-home" (OuterVolumeSpecName: "home") pod "4e4324a4-5100-48bb-8c83-79191f4a4919" (UID: "4e4324a4-5100-48bb-8c83-79191f4a4919"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:27:36.670729 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:36.670704 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4e4324a4-5100-48bb-8c83-79191f4a4919-model-cache\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:27:36.670729 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:36.670728 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4e4324a4-5100-48bb-8c83-79191f4a4919-home\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:27:36.672416 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:36.672385 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e4324a4-5100-48bb-8c83-79191f4a4919-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "4e4324a4-5100-48bb-8c83-79191f4a4919" (UID: "4e4324a4-5100-48bb-8c83-79191f4a4919"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:27:36.672416 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:36.672394 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e4324a4-5100-48bb-8c83-79191f4a4919-kube-api-access-pw5qm" (OuterVolumeSpecName: "kube-api-access-pw5qm") pod "4e4324a4-5100-48bb-8c83-79191f4a4919" (UID: "4e4324a4-5100-48bb-8c83-79191f4a4919"). InnerVolumeSpecName "kube-api-access-pw5qm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:27:36.672538 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:36.672463 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e4324a4-5100-48bb-8c83-79191f4a4919-dshm" (OuterVolumeSpecName: "dshm") pod "4e4324a4-5100-48bb-8c83-79191f4a4919" (UID: "4e4324a4-5100-48bb-8c83-79191f4a4919"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:27:36.725477 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:36.725444 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e4324a4-5100-48bb-8c83-79191f4a4919-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4e4324a4-5100-48bb-8c83-79191f4a4919" (UID: "4e4324a4-5100-48bb-8c83-79191f4a4919"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:27:36.771578 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:36.771547 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4e4324a4-5100-48bb-8c83-79191f4a4919-kserve-provision-location\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:27:36.771578 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:36.771575 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4e4324a4-5100-48bb-8c83-79191f4a4919-tls-certs\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:27:36.771788 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:36.771586 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pw5qm\" (UniqueName: \"kubernetes.io/projected/4e4324a4-5100-48bb-8c83-79191f4a4919-kube-api-access-pw5qm\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:27:36.771788 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:36.771596 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4e4324a4-5100-48bb-8c83-79191f4a4919-dshm\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:27:37.393375 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:37.393346 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-round-trip-kserve-5b5459ccc8-w596h_4e4324a4-5100-48bb-8c83-79191f4a4919/main/0.log"
Apr 16 20:27:37.394014 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:37.393716 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b5459ccc8-w596h" event={"ID":"4e4324a4-5100-48bb-8c83-79191f4a4919","Type":"ContainerDied","Data":"5ea4a615b0316b87598bc4e96b57dbd741de94b970066d12bfdc38c15e24e7c2"}
Apr 16 20:27:37.394014 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:37.393756 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b5459ccc8-w596h"
Apr 16 20:27:37.394014 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:37.393759 2572 scope.go:117] "RemoveContainer" containerID="74331dd0ba0e51ae9a955587b639abaa13d99ac0ab5f2b3cd82d634d41de5167"
Apr 16 20:27:37.402317 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:37.402297 2572 scope.go:117] "RemoveContainer" containerID="6d0f87520fdad22aa0ae25c645d3029f9cbd1741d0c011a90a63607d8e70d395"
Apr 16 20:27:37.412001 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:37.411949 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-5b5459ccc8-w596h"]
Apr 16 20:27:37.416357 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:37.416336 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-5b5459ccc8-w596h"]
Apr 16 20:27:39.149435 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:39.149402 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e4324a4-5100-48bb-8c83-79191f4a4919" path="/var/lib/kubelet/pods/4e4324a4-5100-48bb-8c83-79191f4a4919/volumes"
Apr 16 20:27:41.428172 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:41.428138 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx"]
Apr 16 20:27:41.428586 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:41.428565 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e4324a4-5100-48bb-8c83-79191f4a4919" containerName="main"
Apr 16 20:27:41.428586 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:41.428580 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e4324a4-5100-48bb-8c83-79191f4a4919" containerName="main"
Apr 16 20:27:41.428664 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:41.428598 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e4324a4-5100-48bb-8c83-79191f4a4919" containerName="storage-initializer"
Apr 16 20:27:41.428664 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:41.428604 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e4324a4-5100-48bb-8c83-79191f4a4919" containerName="storage-initializer"
Apr 16 20:27:41.428727 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:41.428668 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="4e4324a4-5100-48bb-8c83-79191f4a4919" containerName="main"
Apr 16 20:27:41.438136 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:41.438116 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx"
Apr 16 20:27:41.440187 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:41.440167 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\""
Apr 16 20:27:41.445346 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:41.445321 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx"]
Apr 16 20:27:41.614882 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:41.614832 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znm57\" (UniqueName: \"kubernetes.io/projected/12860fc5-7258-4597-b12e-35bb66862a3a-kube-api-access-znm57\") pod \"custom-route-timeout-test-kserve-75fc5648bf-n6kwx\" (UID: \"12860fc5-7258-4597-b12e-35bb66862a3a\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx"
Apr 16 20:27:41.615060 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:41.614945 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/12860fc5-7258-4597-b12e-35bb66862a3a-model-cache\") pod \"custom-route-timeout-test-kserve-75fc5648bf-n6kwx\" (UID: \"12860fc5-7258-4597-b12e-35bb66862a3a\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx"
Apr 16 20:27:41.615060 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:41.615006 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/12860fc5-7258-4597-b12e-35bb66862a3a-dshm\") pod \"custom-route-timeout-test-kserve-75fc5648bf-n6kwx\" (UID: \"12860fc5-7258-4597-b12e-35bb66862a3a\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx"
Apr 16 20:27:41.615060 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:41.615041 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/12860fc5-7258-4597-b12e-35bb66862a3a-tls-certs\") pod \"custom-route-timeout-test-kserve-75fc5648bf-n6kwx\" (UID: \"12860fc5-7258-4597-b12e-35bb66862a3a\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx"
Apr 16 20:27:41.615184 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:41.615073 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12860fc5-7258-4597-b12e-35bb66862a3a-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-75fc5648bf-n6kwx\" (UID: \"12860fc5-7258-4597-b12e-35bb66862a3a\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx"
Apr 16 20:27:41.615184 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:41.615169 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/12860fc5-7258-4597-b12e-35bb66862a3a-home\") pod \"custom-route-timeout-test-kserve-75fc5648bf-n6kwx\" (UID: \"12860fc5-7258-4597-b12e-35bb66862a3a\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx"
Apr 16 20:27:41.716476 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:41.716380 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/12860fc5-7258-4597-b12e-35bb66862a3a-home\") pod \"custom-route-timeout-test-kserve-75fc5648bf-n6kwx\" (UID: \"12860fc5-7258-4597-b12e-35bb66862a3a\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx"
Apr 16 20:27:41.716775 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:41.716746 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-znm57\" (UniqueName: \"kubernetes.io/projected/12860fc5-7258-4597-b12e-35bb66862a3a-kube-api-access-znm57\") pod \"custom-route-timeout-test-kserve-75fc5648bf-n6kwx\" (UID: \"12860fc5-7258-4597-b12e-35bb66862a3a\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx"
Apr 16 20:27:41.716969 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:41.716945 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/12860fc5-7258-4597-b12e-35bb66862a3a-model-cache\") pod \"custom-route-timeout-test-kserve-75fc5648bf-n6kwx\" (UID: \"12860fc5-7258-4597-b12e-35bb66862a3a\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx"
Apr 16 20:27:41.717135 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:41.717119 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/12860fc5-7258-4597-b12e-35bb66862a3a-dshm\") pod \"custom-route-timeout-test-kserve-75fc5648bf-n6kwx\" (UID: \"12860fc5-7258-4597-b12e-35bb66862a3a\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx"
Apr 16 20:27:41.717322 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:41.717308 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/12860fc5-7258-4597-b12e-35bb66862a3a-tls-certs\") pod \"custom-route-timeout-test-kserve-75fc5648bf-n6kwx\" (UID: \"12860fc5-7258-4597-b12e-35bb66862a3a\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx"
Apr 16 20:27:41.717430 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:41.717415 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12860fc5-7258-4597-b12e-35bb66862a3a-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-75fc5648bf-n6kwx\" (UID: \"12860fc5-7258-4597-b12e-35bb66862a3a\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx"
Apr 16 20:27:41.717625 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:41.717598 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/12860fc5-7258-4597-b12e-35bb66862a3a-model-cache\") pod \"custom-route-timeout-test-kserve-75fc5648bf-n6kwx\" (UID: \"12860fc5-7258-4597-b12e-35bb66862a3a\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx"
Apr 16 20:27:41.717776 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:41.717369 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/12860fc5-7258-4597-b12e-35bb66862a3a-home\") pod \"custom-route-timeout-test-kserve-75fc5648bf-n6kwx\" (UID: \"12860fc5-7258-4597-b12e-35bb66862a3a\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx"
Apr 16 20:27:41.718069 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:41.718042 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12860fc5-7258-4597-b12e-35bb66862a3a-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-75fc5648bf-n6kwx\" (UID: \"12860fc5-7258-4597-b12e-35bb66862a3a\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx"
Apr 16 20:27:41.723646 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:41.723619 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/12860fc5-7258-4597-b12e-35bb66862a3a-dshm\") pod \"custom-route-timeout-test-kserve-75fc5648bf-n6kwx\" (UID: \"12860fc5-7258-4597-b12e-35bb66862a3a\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx"
Apr 16 20:27:41.725396 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:41.725369 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/12860fc5-7258-4597-b12e-35bb66862a3a-tls-certs\") pod \"custom-route-timeout-test-kserve-75fc5648bf-n6kwx\" (UID: \"12860fc5-7258-4597-b12e-35bb66862a3a\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx"
Apr 16 20:27:41.725768 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:41.725751 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-znm57\" (UniqueName: \"kubernetes.io/projected/12860fc5-7258-4597-b12e-35bb66862a3a-kube-api-access-znm57\") pod \"custom-route-timeout-test-kserve-75fc5648bf-n6kwx\" (UID: \"12860fc5-7258-4597-b12e-35bb66862a3a\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx"
Apr 16 20:27:41.749666 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:41.749644 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx"
Apr 16 20:27:41.878507 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:41.878481 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx"]
Apr 16 20:27:41.881208 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:27:41.881167 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12860fc5_7258_4597_b12e_35bb66862a3a.slice/crio-82c450f0c38370daafc526a6a3695e4e8c91a012204cdb101e3d636ffce6ad46 WatchSource:0}: Error finding container 82c450f0c38370daafc526a6a3695e4e8c91a012204cdb101e3d636ffce6ad46: Status 404 returned error can't find the container with id 82c450f0c38370daafc526a6a3695e4e8c91a012204cdb101e3d636ffce6ad46
Apr 16 20:27:42.414864 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:42.414826 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx" event={"ID":"12860fc5-7258-4597-b12e-35bb66862a3a","Type":"ContainerStarted","Data":"bcfee84eab76da46d621f91d5d47f4844f1807e31d14593c434c8f238e7727ef"}
Apr 16 20:27:42.414864 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:42.414866 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx" event={"ID":"12860fc5-7258-4597-b12e-35bb66862a3a","Type":"ContainerStarted","Data":"82c450f0c38370daafc526a6a3695e4e8c91a012204cdb101e3d636ffce6ad46"}
Apr 16 20:27:46.435304 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:46.435221 2572 generic.go:358] "Generic (PLEG): container finished" podID="12860fc5-7258-4597-b12e-35bb66862a3a" containerID="bcfee84eab76da46d621f91d5d47f4844f1807e31d14593c434c8f238e7727ef" exitCode=0
Apr 16 20:27:46.435695 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:46.435295 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx" event={"ID":"12860fc5-7258-4597-b12e-35bb66862a3a","Type":"ContainerDied","Data":"bcfee84eab76da46d621f91d5d47f4844f1807e31d14593c434c8f238e7727ef"}
Apr 16 20:27:47.441244 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:47.441201 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx" event={"ID":"12860fc5-7258-4597-b12e-35bb66862a3a","Type":"ContainerStarted","Data":"a248f52032caec82764fc35da57318889880a46db7af0ad3f6b613142a6999fe"}
Apr 16 20:27:47.460478 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:47.460419 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx" podStartSLOduration=6.460399109 podStartE2EDuration="6.460399109s" podCreationTimestamp="2026-04-16 20:27:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:27:47.459278416 +0000 UTC m=+958.896266924" watchObservedRunningTime="2026-04-16 20:27:47.460399109 +0000 UTC m=+958.897387620"
Apr 16 20:27:51.750634 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:51.750596 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx"
Apr 16 20:27:51.751155 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:51.750644 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx"
Apr 16 20:27:51.752330 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:27:51.752300 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx" podUID="12860fc5-7258-4597-b12e-35bb66862a3a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused"
Apr 16 20:28:01.750960 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:01.750850 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx" podUID="12860fc5-7258-4597-b12e-35bb66862a3a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused"
Apr 16 20:28:11.751015 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:11.750961 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx" podUID="12860fc5-7258-4597-b12e-35bb66862a3a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused"
Apr 16 20:28:21.751182 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:21.751137 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx" podUID="12860fc5-7258-4597-b12e-35bb66862a3a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused"
Apr 16 20:28:25.781151 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:25.781115 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq"]
Apr 16 20:28:25.781693 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:25.781522 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq" podUID="c132d908-8c99-4d84-ada6-0f8a65b31eff" containerName="main" containerID="cri-o://58fbae07aa6172ace3a97bf78630daae8846304f77aaa645844801d211da4900" gracePeriod=30
Apr 16 20:28:25.781693 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:25.781623 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq" podUID="c132d908-8c99-4d84-ada6-0f8a65b31eff" containerName="tokenizer" containerID="cri-o://ef00335b0c36762efbcad7204dead832465d7d00c0d24271237b0a0c707ea692" gracePeriod=30
Apr 16 20:28:26.236310 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:28:26.236224 2572 logging.go:55] [core] [Channel #105 SubChannel #106]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.39:9003", ServerName: "10.134.0.39:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.39:9003: connect: connection refused"
Apr 16 20:28:26.583356 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:26.583323 2572 generic.go:358] "Generic (PLEG): container finished" podID="c132d908-8c99-4d84-ada6-0f8a65b31eff" containerID="58fbae07aa6172ace3a97bf78630daae8846304f77aaa645844801d211da4900" exitCode=0
Apr 16 20:28:26.583525 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:26.583394 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq" event={"ID":"c132d908-8c99-4d84-ada6-0f8a65b31eff","Type":"ContainerDied","Data":"58fbae07aa6172ace3a97bf78630daae8846304f77aaa645844801d211da4900"}
Apr 16 20:28:27.048914 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:27.048867 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq"
Apr 16 20:28:27.140682 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:27.140593 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c132d908-8c99-4d84-ada6-0f8a65b31eff-tls-certs\") pod \"c132d908-8c99-4d84-ada6-0f8a65b31eff\" (UID: \"c132d908-8c99-4d84-ada6-0f8a65b31eff\") "
Apr 16 20:28:27.140682 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:27.140643 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c132d908-8c99-4d84-ada6-0f8a65b31eff-tokenizer-tmp\") pod \"c132d908-8c99-4d84-ada6-0f8a65b31eff\" (UID: \"c132d908-8c99-4d84-ada6-0f8a65b31eff\") "
Apr 16 20:28:27.140946 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:27.140699 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffpjw\" (UniqueName: \"kubernetes.io/projected/c132d908-8c99-4d84-ada6-0f8a65b31eff-kube-api-access-ffpjw\") pod \"c132d908-8c99-4d84-ada6-0f8a65b31eff\" (UID: \"c132d908-8c99-4d84-ada6-0f8a65b31eff\") "
Apr 16 20:28:27.140946 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:27.140722 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c132d908-8c99-4d84-ada6-0f8a65b31eff-kserve-provision-location\") pod \"c132d908-8c99-4d84-ada6-0f8a65b31eff\" (UID: \"c132d908-8c99-4d84-ada6-0f8a65b31eff\") "
Apr 16 20:28:27.140946 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:27.140771 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c132d908-8c99-4d84-ada6-0f8a65b31eff-tokenizer-cache\") pod \"c132d908-8c99-4d84-ada6-0f8a65b31eff\" (UID: \"c132d908-8c99-4d84-ada6-0f8a65b31eff\") "
Apr 16 20:28:27.140946 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:27.140809 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c132d908-8c99-4d84-ada6-0f8a65b31eff-tokenizer-uds\") pod \"c132d908-8c99-4d84-ada6-0f8a65b31eff\" (UID: \"c132d908-8c99-4d84-ada6-0f8a65b31eff\") "
Apr 16 20:28:27.141205 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:27.141138 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c132d908-8c99-4d84-ada6-0f8a65b31eff-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "c132d908-8c99-4d84-ada6-0f8a65b31eff" (UID: "c132d908-8c99-4d84-ada6-0f8a65b31eff"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:28:27.141303 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:27.141201 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c132d908-8c99-4d84-ada6-0f8a65b31eff-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "c132d908-8c99-4d84-ada6-0f8a65b31eff" (UID: "c132d908-8c99-4d84-ada6-0f8a65b31eff"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:28:27.141303 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:27.141255 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c132d908-8c99-4d84-ada6-0f8a65b31eff-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "c132d908-8c99-4d84-ada6-0f8a65b31eff" (UID: "c132d908-8c99-4d84-ada6-0f8a65b31eff"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:28:27.141636 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:27.141608 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c132d908-8c99-4d84-ada6-0f8a65b31eff-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c132d908-8c99-4d84-ada6-0f8a65b31eff" (UID: "c132d908-8c99-4d84-ada6-0f8a65b31eff"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:28:27.142839 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:27.142810 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c132d908-8c99-4d84-ada6-0f8a65b31eff-kube-api-access-ffpjw" (OuterVolumeSpecName: "kube-api-access-ffpjw") pod "c132d908-8c99-4d84-ada6-0f8a65b31eff" (UID: "c132d908-8c99-4d84-ada6-0f8a65b31eff"). InnerVolumeSpecName "kube-api-access-ffpjw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:28:27.143021 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:27.142859 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c132d908-8c99-4d84-ada6-0f8a65b31eff-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c132d908-8c99-4d84-ada6-0f8a65b31eff" (UID: "c132d908-8c99-4d84-ada6-0f8a65b31eff"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:28:27.236150 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:27.236103 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq" podUID="c132d908-8c99-4d84-ada6-0f8a65b31eff" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.39:9003\" within 1s: context deadline exceeded"
Apr 16 20:28:27.241977 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:27.241950 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c132d908-8c99-4d84-ada6-0f8a65b31eff-tokenizer-cache\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:28:27.242081 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:27.241982 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c132d908-8c99-4d84-ada6-0f8a65b31eff-tokenizer-uds\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:28:27.242081 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:27.241997 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c132d908-8c99-4d84-ada6-0f8a65b31eff-tls-certs\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:28:27.242081 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:27.242011 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c132d908-8c99-4d84-ada6-0f8a65b31eff-tokenizer-tmp\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:28:27.242081 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:27.242025 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ffpjw\" (UniqueName: \"kubernetes.io/projected/c132d908-8c99-4d84-ada6-0f8a65b31eff-kube-api-access-ffpjw\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:28:27.242081 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:27.242041 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c132d908-8c99-4d84-ada6-0f8a65b31eff-kserve-provision-location\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:28:27.589046 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:27.589011 2572 generic.go:358] "Generic (PLEG): container finished" podID="c132d908-8c99-4d84-ada6-0f8a65b31eff" containerID="ef00335b0c36762efbcad7204dead832465d7d00c0d24271237b0a0c707ea692" exitCode=0
Apr 16 20:28:27.589256 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:27.589085 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq" event={"ID":"c132d908-8c99-4d84-ada6-0f8a65b31eff","Type":"ContainerDied","Data":"ef00335b0c36762efbcad7204dead832465d7d00c0d24271237b0a0c707ea692"}
Apr 16 20:28:27.589256 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:27.589102 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq"
Apr 16 20:28:27.589256 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:27.589134 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq" event={"ID":"c132d908-8c99-4d84-ada6-0f8a65b31eff","Type":"ContainerDied","Data":"5670b7377bac0bb7d999fe06bbc06435417ae776891036dae3edea16f2d90d0c"}
Apr 16 20:28:27.589256 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:27.589156 2572 scope.go:117] "RemoveContainer" containerID="ef00335b0c36762efbcad7204dead832465d7d00c0d24271237b0a0c707ea692"
Apr 16 20:28:27.597751 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:27.597733 2572 scope.go:117] "RemoveContainer" containerID="58fbae07aa6172ace3a97bf78630daae8846304f77aaa645844801d211da4900"
Apr 16 20:28:27.605525 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:27.605501 2572 scope.go:117] "RemoveContainer" containerID="cf4802435ede99055e918420cfe202f7d7ebf877da5827127191db8517712f03"
Apr 16 20:28:27.607507 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:27.607483 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq"]
Apr 16 20:28:27.613417 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:27.613397 2572 scope.go:117] "RemoveContainer" containerID="ef00335b0c36762efbcad7204dead832465d7d00c0d24271237b0a0c707ea692"
Apr 16 20:28:27.613744 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:28:27.613718 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef00335b0c36762efbcad7204dead832465d7d00c0d24271237b0a0c707ea692\": container with ID starting with ef00335b0c36762efbcad7204dead832465d7d00c0d24271237b0a0c707ea692 not found: ID does not exist" containerID="ef00335b0c36762efbcad7204dead832465d7d00c0d24271237b0a0c707ea692"
Apr 16 20:28:27.613824 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:27.613759 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef00335b0c36762efbcad7204dead832465d7d00c0d24271237b0a0c707ea692"} err="failed to get container status \"ef00335b0c36762efbcad7204dead832465d7d00c0d24271237b0a0c707ea692\": rpc error: code = NotFound desc = could not find container \"ef00335b0c36762efbcad7204dead832465d7d00c0d24271237b0a0c707ea692\": container with ID starting with ef00335b0c36762efbcad7204dead832465d7d00c0d24271237b0a0c707ea692 not found: ID does not exist"
Apr 16 20:28:27.613824 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:27.613779 2572 scope.go:117] "RemoveContainer" containerID="58fbae07aa6172ace3a97bf78630daae8846304f77aaa645844801d211da4900"
Apr 16 20:28:27.613824 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:27.613729 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-hhzzq"]
Apr 16 20:28:27.614065 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:28:27.614046 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58fbae07aa6172ace3a97bf78630daae8846304f77aaa645844801d211da4900\": container with ID starting with 58fbae07aa6172ace3a97bf78630daae8846304f77aaa645844801d211da4900 not found: ID does not exist" containerID="58fbae07aa6172ace3a97bf78630daae8846304f77aaa645844801d211da4900"
Apr 16 20:28:27.614140 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:27.614074 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58fbae07aa6172ace3a97bf78630daae8846304f77aaa645844801d211da4900"} err="failed to get container status \"58fbae07aa6172ace3a97bf78630daae8846304f77aaa645844801d211da4900\": rpc error: code = NotFound desc = could not find container \"58fbae07aa6172ace3a97bf78630daae8846304f77aaa645844801d211da4900\": container with ID starting
with 58fbae07aa6172ace3a97bf78630daae8846304f77aaa645844801d211da4900 not found: ID does not exist" Apr 16 20:28:27.614140 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:27.614097 2572 scope.go:117] "RemoveContainer" containerID="cf4802435ede99055e918420cfe202f7d7ebf877da5827127191db8517712f03" Apr 16 20:28:27.614347 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:28:27.614329 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf4802435ede99055e918420cfe202f7d7ebf877da5827127191db8517712f03\": container with ID starting with cf4802435ede99055e918420cfe202f7d7ebf877da5827127191db8517712f03 not found: ID does not exist" containerID="cf4802435ede99055e918420cfe202f7d7ebf877da5827127191db8517712f03" Apr 16 20:28:27.614389 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:27.614352 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf4802435ede99055e918420cfe202f7d7ebf877da5827127191db8517712f03"} err="failed to get container status \"cf4802435ede99055e918420cfe202f7d7ebf877da5827127191db8517712f03\": rpc error: code = NotFound desc = could not find container \"cf4802435ede99055e918420cfe202f7d7ebf877da5827127191db8517712f03\": container with ID starting with cf4802435ede99055e918420cfe202f7d7ebf877da5827127191db8517712f03 not found: ID does not exist" Apr 16 20:28:29.151174 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:29.151138 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c132d908-8c99-4d84-ada6-0f8a65b31eff" path="/var/lib/kubelet/pods/c132d908-8c99-4d84-ada6-0f8a65b31eff/volumes" Apr 16 20:28:31.750301 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:31.750255 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx" podUID="12860fc5-7258-4597-b12e-35bb66862a3a" containerName="main" probeResult="failure" output="Get 
\"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 16 20:28:41.751143 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:41.751099 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx" podUID="12860fc5-7258-4597-b12e-35bb66862a3a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 16 20:28:48.675542 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:48.675512 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882"] Apr 16 20:28:48.675976 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:48.675906 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c132d908-8c99-4d84-ada6-0f8a65b31eff" containerName="tokenizer" Apr 16 20:28:48.675976 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:48.675923 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c132d908-8c99-4d84-ada6-0f8a65b31eff" containerName="tokenizer" Apr 16 20:28:48.675976 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:48.675935 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c132d908-8c99-4d84-ada6-0f8a65b31eff" containerName="main" Apr 16 20:28:48.675976 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:48.675941 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c132d908-8c99-4d84-ada6-0f8a65b31eff" containerName="main" Apr 16 20:28:48.675976 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:48.675970 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c132d908-8c99-4d84-ada6-0f8a65b31eff" containerName="storage-initializer" Apr 16 20:28:48.675976 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:48.675978 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c132d908-8c99-4d84-ada6-0f8a65b31eff" 
containerName="storage-initializer" Apr 16 20:28:48.676299 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:48.676037 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="c132d908-8c99-4d84-ada6-0f8a65b31eff" containerName="main" Apr 16 20:28:48.676299 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:48.676051 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="c132d908-8c99-4d84-ada6-0f8a65b31eff" containerName="tokenizer" Apr 16 20:28:48.679610 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:48.679591 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882" Apr 16 20:28:48.681956 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:48.681936 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 16 20:28:48.682059 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:48.681942 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-ftr9f\"" Apr 16 20:28:48.689470 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:48.689445 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882"] Apr 16 20:28:48.725190 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:48.725155 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882\" (UID: \"3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882" Apr 16 20:28:48.725431 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:48.725215 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882\" (UID: \"3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882" Apr 16 20:28:48.725431 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:48.725270 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg4q2\" (UniqueName: \"kubernetes.io/projected/3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3-kube-api-access-lg4q2\") pod \"stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882\" (UID: \"3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882" Apr 16 20:28:48.725431 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:48.725318 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882\" (UID: \"3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882" Apr 16 20:28:48.725431 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:48.725362 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882\" (UID: \"3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882" Apr 16 20:28:48.725608 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:48.725442 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882\" (UID: \"3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882" Apr 16 20:28:48.826697 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:48.826664 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882\" (UID: \"3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882" Apr 16 20:28:48.826697 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:48.826700 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lg4q2\" (UniqueName: \"kubernetes.io/projected/3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3-kube-api-access-lg4q2\") pod \"stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882\" (UID: \"3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882" Apr 16 20:28:48.826978 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:48.826726 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882\" (UID: \"3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882" Apr 16 20:28:48.826978 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:48.826749 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882\" (UID: \"3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882" Apr 16 20:28:48.826978 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:48.826799 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882\" (UID: \"3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882" Apr 16 20:28:48.826978 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:48.826828 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882\" (UID: \"3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882" Apr 16 20:28:48.827269 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:48.827246 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882\" (UID: \"3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882" Apr 16 20:28:48.827340 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:48.827285 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882\" (UID: \"3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882" Apr 16 20:28:48.827340 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:48.827326 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882\" (UID: \"3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882" Apr 16 20:28:48.827430 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:48.827350 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882\" (UID: \"3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882" Apr 16 20:28:48.829861 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:48.829813 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882\" (UID: \"3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882" Apr 16 20:28:48.834144 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:48.834124 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg4q2\" (UniqueName: \"kubernetes.io/projected/3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3-kube-api-access-lg4q2\") pod 
\"stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882\" (UID: \"3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882" Apr 16 20:28:48.991176 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:48.991092 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-ftr9f\"" Apr 16 20:28:48.999311 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:48.999276 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882" Apr 16 20:28:49.136096 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:49.136069 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882"] Apr 16 20:28:49.138269 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:28:49.138241 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bbc07fe_53a3_4a7b_ae08_8bc0e489e1d3.slice/crio-37c672b226b2b512f4aeb1b9b2a0a2dc40cd97ea2cc11d0e2810db7a706006ef WatchSource:0}: Error finding container 37c672b226b2b512f4aeb1b9b2a0a2dc40cd97ea2cc11d0e2810db7a706006ef: Status 404 returned error can't find the container with id 37c672b226b2b512f4aeb1b9b2a0a2dc40cd97ea2cc11d0e2810db7a706006ef Apr 16 20:28:49.676959 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:49.676911 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882" event={"ID":"3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3","Type":"ContainerStarted","Data":"6d00729728ea048c8e3a43e0a9cc5f91aa36c46ecede7cef8282f799a3be0706"} Apr 16 20:28:49.677340 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:49.676965 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882" event={"ID":"3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3","Type":"ContainerStarted","Data":"37c672b226b2b512f4aeb1b9b2a0a2dc40cd97ea2cc11d0e2810db7a706006ef"} Apr 16 20:28:50.682561 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:50.682525 2572 generic.go:358] "Generic (PLEG): container finished" podID="3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3" containerID="6d00729728ea048c8e3a43e0a9cc5f91aa36c46ecede7cef8282f799a3be0706" exitCode=0 Apr 16 20:28:50.682964 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:50.682611 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882" event={"ID":"3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3","Type":"ContainerDied","Data":"6d00729728ea048c8e3a43e0a9cc5f91aa36c46ecede7cef8282f799a3be0706"} Apr 16 20:28:51.691706 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:51.691660 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882" event={"ID":"3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3","Type":"ContainerStarted","Data":"49c76180b88a8d6a62f0bdfa6c0f6a406132155056398bce62bd55e6c47b1bca"} Apr 16 20:28:51.691706 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:51.691697 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882" event={"ID":"3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3","Type":"ContainerStarted","Data":"5509d4d17d21e9bdc2674b193cf6c6ba0bb95cafa53757b4300ee3cae77d7022"} Apr 16 20:28:51.692187 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:51.691894 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882" Apr 16 20:28:51.715116 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:51.715047 2572 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882" podStartSLOduration=3.715023715 podStartE2EDuration="3.715023715s" podCreationTimestamp="2026-04-16 20:28:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:28:51.71224171 +0000 UTC m=+1023.149230218" watchObservedRunningTime="2026-04-16 20:28:51.715023715 +0000 UTC m=+1023.152012224" Apr 16 20:28:51.750792 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:51.750744 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx" podUID="12860fc5-7258-4597-b12e-35bb66862a3a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 16 20:28:58.999960 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:58.999919 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882" Apr 16 20:28:59.000442 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:58.999974 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882" Apr 16 20:28:59.002976 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:59.002952 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882" Apr 16 20:28:59.723643 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:28:59.723613 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882" Apr 16 20:29:01.750830 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:29:01.750790 2572 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx" podUID="12860fc5-7258-4597-b12e-35bb66862a3a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 16 20:29:11.750180 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:29:11.750125 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx" podUID="12860fc5-7258-4597-b12e-35bb66862a3a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 16 20:29:20.727172 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:29:20.727141 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882" Apr 16 20:29:21.760640 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:29:21.760611 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx" Apr 16 20:29:21.768740 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:29:21.768713 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx" Apr 16 20:29:27.412072 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:29:27.411983 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx"] Apr 16 20:29:27.412472 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:29:27.412357 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx" podUID="12860fc5-7258-4597-b12e-35bb66862a3a" containerName="main" containerID="cri-o://a248f52032caec82764fc35da57318889880a46db7af0ad3f6b613142a6999fe" gracePeriod=30 Apr 16 
20:29:57.650690 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:29:57.650667 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-75fc5648bf-n6kwx_12860fc5-7258-4597-b12e-35bb66862a3a/main/0.log" Apr 16 20:29:57.651065 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:29:57.651048 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx" Apr 16 20:29:57.754394 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:29:57.754366 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/12860fc5-7258-4597-b12e-35bb66862a3a-home\") pod \"12860fc5-7258-4597-b12e-35bb66862a3a\" (UID: \"12860fc5-7258-4597-b12e-35bb66862a3a\") " Apr 16 20:29:57.754586 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:29:57.754457 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/12860fc5-7258-4597-b12e-35bb66862a3a-dshm\") pod \"12860fc5-7258-4597-b12e-35bb66862a3a\" (UID: \"12860fc5-7258-4597-b12e-35bb66862a3a\") " Apr 16 20:29:57.754586 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:29:57.754493 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znm57\" (UniqueName: \"kubernetes.io/projected/12860fc5-7258-4597-b12e-35bb66862a3a-kube-api-access-znm57\") pod \"12860fc5-7258-4597-b12e-35bb66862a3a\" (UID: \"12860fc5-7258-4597-b12e-35bb66862a3a\") " Apr 16 20:29:57.754586 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:29:57.754521 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/12860fc5-7258-4597-b12e-35bb66862a3a-model-cache\") pod \"12860fc5-7258-4597-b12e-35bb66862a3a\" (UID: \"12860fc5-7258-4597-b12e-35bb66862a3a\") " Apr 16 20:29:57.754586 ip-10-0-138-62 
kubenswrapper[2572]: I0416 20:29:57.754554 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/12860fc5-7258-4597-b12e-35bb66862a3a-tls-certs\") pod \"12860fc5-7258-4597-b12e-35bb66862a3a\" (UID: \"12860fc5-7258-4597-b12e-35bb66862a3a\") " Apr 16 20:29:57.754804 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:29:57.754606 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12860fc5-7258-4597-b12e-35bb66862a3a-kserve-provision-location\") pod \"12860fc5-7258-4597-b12e-35bb66862a3a\" (UID: \"12860fc5-7258-4597-b12e-35bb66862a3a\") " Apr 16 20:29:57.754804 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:29:57.754714 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12860fc5-7258-4597-b12e-35bb66862a3a-home" (OuterVolumeSpecName: "home") pod "12860fc5-7258-4597-b12e-35bb66862a3a" (UID: "12860fc5-7258-4597-b12e-35bb66862a3a"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:29:57.754923 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:29:57.754837 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12860fc5-7258-4597-b12e-35bb66862a3a-model-cache" (OuterVolumeSpecName: "model-cache") pod "12860fc5-7258-4597-b12e-35bb66862a3a" (UID: "12860fc5-7258-4597-b12e-35bb66862a3a"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:29:57.754999 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:29:57.754981 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/12860fc5-7258-4597-b12e-35bb66862a3a-home\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:29:57.755052 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:29:57.755004 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/12860fc5-7258-4597-b12e-35bb66862a3a-model-cache\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:29:57.756669 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:29:57.756637 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12860fc5-7258-4597-b12e-35bb66862a3a-kube-api-access-znm57" (OuterVolumeSpecName: "kube-api-access-znm57") pod "12860fc5-7258-4597-b12e-35bb66862a3a" (UID: "12860fc5-7258-4597-b12e-35bb66862a3a"). InnerVolumeSpecName "kube-api-access-znm57". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:29:57.756669 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:29:57.756655 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12860fc5-7258-4597-b12e-35bb66862a3a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "12860fc5-7258-4597-b12e-35bb66862a3a" (UID: "12860fc5-7258-4597-b12e-35bb66862a3a"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:29:57.756836 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:29:57.756801 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12860fc5-7258-4597-b12e-35bb66862a3a-dshm" (OuterVolumeSpecName: "dshm") pod "12860fc5-7258-4597-b12e-35bb66862a3a" (UID: "12860fc5-7258-4597-b12e-35bb66862a3a"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:29:57.811483 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:29:57.811446 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12860fc5-7258-4597-b12e-35bb66862a3a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "12860fc5-7258-4597-b12e-35bb66862a3a" (UID: "12860fc5-7258-4597-b12e-35bb66862a3a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:29:57.855865 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:29:57.855825 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12860fc5-7258-4597-b12e-35bb66862a3a-kserve-provision-location\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:29:57.855865 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:29:57.855852 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/12860fc5-7258-4597-b12e-35bb66862a3a-dshm\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:29:57.855865 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:29:57.855864 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-znm57\" (UniqueName: \"kubernetes.io/projected/12860fc5-7258-4597-b12e-35bb66862a3a-kube-api-access-znm57\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:29:57.856185 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:29:57.855904 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/12860fc5-7258-4597-b12e-35bb66862a3a-tls-certs\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:29:57.936718 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:29:57.936686 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-75fc5648bf-n6kwx_12860fc5-7258-4597-b12e-35bb66862a3a/main/0.log"
Apr 16 20:29:57.937070 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:29:57.937044 2572 generic.go:358] "Generic (PLEG): container finished" podID="12860fc5-7258-4597-b12e-35bb66862a3a" containerID="a248f52032caec82764fc35da57318889880a46db7af0ad3f6b613142a6999fe" exitCode=137
Apr 16 20:29:57.937130 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:29:57.937107 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx" event={"ID":"12860fc5-7258-4597-b12e-35bb66862a3a","Type":"ContainerDied","Data":"a248f52032caec82764fc35da57318889880a46db7af0ad3f6b613142a6999fe"}
Apr 16 20:29:57.937172 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:29:57.937111 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx"
Apr 16 20:29:57.937172 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:29:57.937134 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx" event={"ID":"12860fc5-7258-4597-b12e-35bb66862a3a","Type":"ContainerDied","Data":"82c450f0c38370daafc526a6a3695e4e8c91a012204cdb101e3d636ffce6ad46"}
Apr 16 20:29:57.937172 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:29:57.937150 2572 scope.go:117] "RemoveContainer" containerID="a248f52032caec82764fc35da57318889880a46db7af0ad3f6b613142a6999fe"
Apr 16 20:29:57.956969 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:29:57.956928 2572 scope.go:117] "RemoveContainer" containerID="bcfee84eab76da46d621f91d5d47f4844f1807e31d14593c434c8f238e7727ef"
Apr 16 20:29:57.958515 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:29:57.958497 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx"]
Apr 16 20:29:57.962637 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:29:57.962614 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-75fc5648bf-n6kwx"]
Apr 16 20:29:57.967713 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:29:57.967691 2572 scope.go:117] "RemoveContainer" containerID="a248f52032caec82764fc35da57318889880a46db7af0ad3f6b613142a6999fe"
Apr 16 20:29:57.967959 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:29:57.967943 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a248f52032caec82764fc35da57318889880a46db7af0ad3f6b613142a6999fe\": container with ID starting with a248f52032caec82764fc35da57318889880a46db7af0ad3f6b613142a6999fe not found: ID does not exist" containerID="a248f52032caec82764fc35da57318889880a46db7af0ad3f6b613142a6999fe"
Apr 16 20:29:57.968019 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:29:57.967967 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a248f52032caec82764fc35da57318889880a46db7af0ad3f6b613142a6999fe"} err="failed to get container status \"a248f52032caec82764fc35da57318889880a46db7af0ad3f6b613142a6999fe\": rpc error: code = NotFound desc = could not find container \"a248f52032caec82764fc35da57318889880a46db7af0ad3f6b613142a6999fe\": container with ID starting with a248f52032caec82764fc35da57318889880a46db7af0ad3f6b613142a6999fe not found: ID does not exist"
Apr 16 20:29:57.968019 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:29:57.967985 2572 scope.go:117] "RemoveContainer" containerID="bcfee84eab76da46d621f91d5d47f4844f1807e31d14593c434c8f238e7727ef"
Apr 16 20:29:57.968243 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:29:57.968223 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcfee84eab76da46d621f91d5d47f4844f1807e31d14593c434c8f238e7727ef\": container with ID starting with bcfee84eab76da46d621f91d5d47f4844f1807e31d14593c434c8f238e7727ef not found: ID does not exist" containerID="bcfee84eab76da46d621f91d5d47f4844f1807e31d14593c434c8f238e7727ef"
Apr 16 20:29:57.968291 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:29:57.968252 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcfee84eab76da46d621f91d5d47f4844f1807e31d14593c434c8f238e7727ef"} err="failed to get container status \"bcfee84eab76da46d621f91d5d47f4844f1807e31d14593c434c8f238e7727ef\": rpc error: code = NotFound desc = could not find container \"bcfee84eab76da46d621f91d5d47f4844f1807e31d14593c434c8f238e7727ef\": container with ID starting with bcfee84eab76da46d621f91d5d47f4844f1807e31d14593c434c8f238e7727ef not found: ID does not exist"
Apr 16 20:29:59.148617 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:29:59.148581 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12860fc5-7258-4597-b12e-35bb66862a3a" path="/var/lib/kubelet/pods/12860fc5-7258-4597-b12e-35bb66862a3a/volumes"
Apr 16 20:30:30.335702 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:30.335664 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882"]
Apr 16 20:30:30.336249 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:30.335997 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882" podUID="3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3" containerName="main" containerID="cri-o://5509d4d17d21e9bdc2674b193cf6c6ba0bb95cafa53757b4300ee3cae77d7022" gracePeriod=30
Apr 16 20:30:30.336249 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:30.336058 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882" podUID="3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3" containerName="tokenizer" containerID="cri-o://49c76180b88a8d6a62f0bdfa6c0f6a406132155056398bce62bd55e6c47b1bca" gracePeriod=30
Apr 16 20:30:30.726926 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:30:30.726806 2572 logging.go:55] [core] [Channel #158 SubChannel #159]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.41:9003", ServerName: "10.134.0.41:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.41:9003: connect: connection refused"
Apr 16 20:30:31.057986 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:31.057947 2572 generic.go:358] "Generic (PLEG): container finished" podID="3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3" containerID="5509d4d17d21e9bdc2674b193cf6c6ba0bb95cafa53757b4300ee3cae77d7022" exitCode=0
Apr 16 20:30:31.058172 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:31.058020 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882" event={"ID":"3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3","Type":"ContainerDied","Data":"5509d4d17d21e9bdc2674b193cf6c6ba0bb95cafa53757b4300ee3cae77d7022"}
Apr 16 20:30:31.690893 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:31.690855 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882"
Apr 16 20:30:31.727136 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:31.727047 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882" podUID="3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.41:9003\" within 1s: context deadline exceeded"
Apr 16 20:30:31.753070 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:31.753034 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3-tokenizer-uds\") pod \"3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3\" (UID: \"3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3\") "
Apr 16 20:30:31.753220 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:31.753081 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3-tokenizer-cache\") pod \"3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3\" (UID: \"3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3\") "
Apr 16 20:30:31.753220 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:31.753111 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3-tokenizer-tmp\") pod \"3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3\" (UID: \"3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3\") "
Apr 16 20:30:31.753220 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:31.753135 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3-tls-certs\") pod \"3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3\" (UID: \"3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3\") "
Apr 16 20:30:31.753220 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:31.753161 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lg4q2\" (UniqueName: \"kubernetes.io/projected/3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3-kube-api-access-lg4q2\") pod \"3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3\" (UID: \"3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3\") "
Apr 16 20:30:31.753413 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:31.753231 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3-kserve-provision-location\") pod \"3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3\" (UID: \"3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3\") "
Apr 16 20:30:31.753413 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:31.753273 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3" (UID: "3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:30:31.753413 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:31.753396 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3" (UID: "3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:30:31.753569 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:31.753523 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3-tokenizer-uds\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:30:31.753569 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:31.753478 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3" (UID: "3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:30:31.753569 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:31.753538 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3-tokenizer-cache\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:30:31.754001 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:31.753979 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3" (UID: "3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:30:31.755329 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:31.755306 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3" (UID: "3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:30:31.755405 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:31.755377 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3-kube-api-access-lg4q2" (OuterVolumeSpecName: "kube-api-access-lg4q2") pod "3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3" (UID: "3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3"). InnerVolumeSpecName "kube-api-access-lg4q2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:30:31.855018 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:31.854974 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3-tokenizer-tmp\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:30:31.855018 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:31.855011 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3-tls-certs\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:30:31.855018 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:31.855025 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lg4q2\" (UniqueName: \"kubernetes.io/projected/3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3-kube-api-access-lg4q2\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:30:31.855257 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:31.855040 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3-kserve-provision-location\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:30:32.063768 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:32.063728 2572 generic.go:358] "Generic (PLEG): container finished" podID="3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3" containerID="49c76180b88a8d6a62f0bdfa6c0f6a406132155056398bce62bd55e6c47b1bca" exitCode=0
Apr 16 20:30:32.063950 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:32.063812 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882"
Apr 16 20:30:32.063950 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:32.063803 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882" event={"ID":"3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3","Type":"ContainerDied","Data":"49c76180b88a8d6a62f0bdfa6c0f6a406132155056398bce62bd55e6c47b1bca"}
Apr 16 20:30:32.063950 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:32.063911 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882" event={"ID":"3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3","Type":"ContainerDied","Data":"37c672b226b2b512f4aeb1b9b2a0a2dc40cd97ea2cc11d0e2810db7a706006ef"}
Apr 16 20:30:32.063950 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:32.063929 2572 scope.go:117] "RemoveContainer" containerID="49c76180b88a8d6a62f0bdfa6c0f6a406132155056398bce62bd55e6c47b1bca"
Apr 16 20:30:32.072452 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:32.072435 2572 scope.go:117] "RemoveContainer" containerID="5509d4d17d21e9bdc2674b193cf6c6ba0bb95cafa53757b4300ee3cae77d7022"
Apr 16 20:30:32.080060 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:32.080037 2572 scope.go:117] "RemoveContainer" containerID="6d00729728ea048c8e3a43e0a9cc5f91aa36c46ecede7cef8282f799a3be0706"
Apr 16 20:30:32.087754 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:32.087724 2572 scope.go:117] "RemoveContainer" containerID="49c76180b88a8d6a62f0bdfa6c0f6a406132155056398bce62bd55e6c47b1bca"
Apr 16 20:30:32.088098 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:30:32.088070 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49c76180b88a8d6a62f0bdfa6c0f6a406132155056398bce62bd55e6c47b1bca\": container with ID starting with 49c76180b88a8d6a62f0bdfa6c0f6a406132155056398bce62bd55e6c47b1bca not found: ID does not exist" containerID="49c76180b88a8d6a62f0bdfa6c0f6a406132155056398bce62bd55e6c47b1bca"
Apr 16 20:30:32.088192 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:32.088110 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49c76180b88a8d6a62f0bdfa6c0f6a406132155056398bce62bd55e6c47b1bca"} err="failed to get container status \"49c76180b88a8d6a62f0bdfa6c0f6a406132155056398bce62bd55e6c47b1bca\": rpc error: code = NotFound desc = could not find container \"49c76180b88a8d6a62f0bdfa6c0f6a406132155056398bce62bd55e6c47b1bca\": container with ID starting with 49c76180b88a8d6a62f0bdfa6c0f6a406132155056398bce62bd55e6c47b1bca not found: ID does not exist"
Apr 16 20:30:32.088192 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:32.088136 2572 scope.go:117] "RemoveContainer" containerID="5509d4d17d21e9bdc2674b193cf6c6ba0bb95cafa53757b4300ee3cae77d7022"
Apr 16 20:30:32.088404 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:30:32.088380 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5509d4d17d21e9bdc2674b193cf6c6ba0bb95cafa53757b4300ee3cae77d7022\": container with ID starting with 5509d4d17d21e9bdc2674b193cf6c6ba0bb95cafa53757b4300ee3cae77d7022 not found: ID does not exist" containerID="5509d4d17d21e9bdc2674b193cf6c6ba0bb95cafa53757b4300ee3cae77d7022"
Apr 16 20:30:32.088464 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:32.088414 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5509d4d17d21e9bdc2674b193cf6c6ba0bb95cafa53757b4300ee3cae77d7022"} err="failed to get container status \"5509d4d17d21e9bdc2674b193cf6c6ba0bb95cafa53757b4300ee3cae77d7022\": rpc error: code = NotFound desc = could not find container \"5509d4d17d21e9bdc2674b193cf6c6ba0bb95cafa53757b4300ee3cae77d7022\": container with ID starting with 5509d4d17d21e9bdc2674b193cf6c6ba0bb95cafa53757b4300ee3cae77d7022 not found: ID does not exist"
Apr 16 20:30:32.088464 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:32.088442 2572 scope.go:117] "RemoveContainer" containerID="6d00729728ea048c8e3a43e0a9cc5f91aa36c46ecede7cef8282f799a3be0706"
Apr 16 20:30:32.088678 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:30:32.088661 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d00729728ea048c8e3a43e0a9cc5f91aa36c46ecede7cef8282f799a3be0706\": container with ID starting with 6d00729728ea048c8e3a43e0a9cc5f91aa36c46ecede7cef8282f799a3be0706 not found: ID does not exist" containerID="6d00729728ea048c8e3a43e0a9cc5f91aa36c46ecede7cef8282f799a3be0706"
Apr 16 20:30:32.088737 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:32.088686 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d00729728ea048c8e3a43e0a9cc5f91aa36c46ecede7cef8282f799a3be0706"} err="failed to get container status \"6d00729728ea048c8e3a43e0a9cc5f91aa36c46ecede7cef8282f799a3be0706\": rpc error: code = NotFound desc = could not find container \"6d00729728ea048c8e3a43e0a9cc5f91aa36c46ecede7cef8282f799a3be0706\": container with ID starting with 6d00729728ea048c8e3a43e0a9cc5f91aa36c46ecede7cef8282f799a3be0706 not found: ID does not exist"
Apr 16 20:30:32.089137 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:32.089112 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882"]
Apr 16 20:30:32.092808 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:32.092785 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-668f7b79bc-w9882"]
Apr 16 20:30:32.541543 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:32.541502 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-549c5f665b-98qgl"]
Apr 16 20:30:32.541930 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:32.541917 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12860fc5-7258-4597-b12e-35bb66862a3a" containerName="main"
Apr 16 20:30:32.541987 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:32.541932 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="12860fc5-7258-4597-b12e-35bb66862a3a" containerName="main"
Apr 16 20:30:32.541987 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:32.541944 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3" containerName="storage-initializer"
Apr 16 20:30:32.541987 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:32.541950 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3" containerName="storage-initializer"
Apr 16 20:30:32.541987 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:32.541966 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3" containerName="main"
Apr 16 20:30:32.541987 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:32.541974 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3" containerName="main"
Apr 16 20:30:32.541987 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:32.541985 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12860fc5-7258-4597-b12e-35bb66862a3a" containerName="storage-initializer"
Apr 16 20:30:32.542194 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:32.541991 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="12860fc5-7258-4597-b12e-35bb66862a3a" containerName="storage-initializer"
Apr 16 20:30:32.542194 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:32.541997 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3" containerName="tokenizer"
Apr 16 20:30:32.542194 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:32.542003 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3" containerName="tokenizer"
Apr 16 20:30:32.542194 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:32.542056 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="12860fc5-7258-4597-b12e-35bb66862a3a" containerName="main"
Apr 16 20:30:32.542194 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:32.542067 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3" containerName="main"
Apr 16 20:30:32.542194 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:32.542074 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3" containerName="tokenizer"
Apr 16 20:30:32.546645 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:32.546623 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-549c5f665b-98qgl"
Apr 16 20:30:32.551854 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:32.551827 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-549c5f665b-98qgl"]
Apr 16 20:30:32.662580 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:32.662539 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tvrk\" (UniqueName: \"kubernetes.io/projected/7e560579-1b0b-46ef-a7e6-1944f268ce2a-kube-api-access-7tvrk\") pod \"llmisvc-controller-manager-549c5f665b-98qgl\" (UID: \"7e560579-1b0b-46ef-a7e6-1944f268ce2a\") " pod="kserve/llmisvc-controller-manager-549c5f665b-98qgl"
Apr 16 20:30:32.662762 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:32.662594 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e560579-1b0b-46ef-a7e6-1944f268ce2a-cert\") pod \"llmisvc-controller-manager-549c5f665b-98qgl\" (UID: \"7e560579-1b0b-46ef-a7e6-1944f268ce2a\") " pod="kserve/llmisvc-controller-manager-549c5f665b-98qgl"
Apr 16 20:30:32.763998 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:32.763945 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7tvrk\" (UniqueName: \"kubernetes.io/projected/7e560579-1b0b-46ef-a7e6-1944f268ce2a-kube-api-access-7tvrk\") pod \"llmisvc-controller-manager-549c5f665b-98qgl\" (UID: \"7e560579-1b0b-46ef-a7e6-1944f268ce2a\") " pod="kserve/llmisvc-controller-manager-549c5f665b-98qgl"
Apr 16 20:30:32.763998 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:32.764013 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e560579-1b0b-46ef-a7e6-1944f268ce2a-cert\") pod \"llmisvc-controller-manager-549c5f665b-98qgl\" (UID: \"7e560579-1b0b-46ef-a7e6-1944f268ce2a\") " pod="kserve/llmisvc-controller-manager-549c5f665b-98qgl"
Apr 16 20:30:32.766428 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:32.766402 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e560579-1b0b-46ef-a7e6-1944f268ce2a-cert\") pod \"llmisvc-controller-manager-549c5f665b-98qgl\" (UID: \"7e560579-1b0b-46ef-a7e6-1944f268ce2a\") " pod="kserve/llmisvc-controller-manager-549c5f665b-98qgl"
Apr 16 20:30:32.772100 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:32.772081 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tvrk\" (UniqueName: \"kubernetes.io/projected/7e560579-1b0b-46ef-a7e6-1944f268ce2a-kube-api-access-7tvrk\") pod \"llmisvc-controller-manager-549c5f665b-98qgl\" (UID: \"7e560579-1b0b-46ef-a7e6-1944f268ce2a\") " pod="kserve/llmisvc-controller-manager-549c5f665b-98qgl"
Apr 16 20:30:32.858248 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:32.858154 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-549c5f665b-98qgl"
Apr 16 20:30:32.980797 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:32.980766 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-549c5f665b-98qgl"]
Apr 16 20:30:32.982838 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:30:32.982807 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7e560579_1b0b_46ef_a7e6_1944f268ce2a.slice/crio-324068795afce190caf777ed8a028d37aaba3fbf2b995ad03bc71943219238f0 WatchSource:0}: Error finding container 324068795afce190caf777ed8a028d37aaba3fbf2b995ad03bc71943219238f0: Status 404 returned error can't find the container with id 324068795afce190caf777ed8a028d37aaba3fbf2b995ad03bc71943219238f0
Apr 16 20:30:32.984134 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:32.984115 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 20:30:33.070081 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:33.070041 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-549c5f665b-98qgl" event={"ID":"7e560579-1b0b-46ef-a7e6-1944f268ce2a","Type":"ContainerStarted","Data":"324068795afce190caf777ed8a028d37aaba3fbf2b995ad03bc71943219238f0"}
Apr 16 20:30:33.149673 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:33.149595 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3" path="/var/lib/kubelet/pods/3bbc07fe-53a3-4a7b-ae08-8bc0e489e1d3/volumes"
Apr 16 20:30:34.076081 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:34.076044 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-549c5f665b-98qgl" event={"ID":"7e560579-1b0b-46ef-a7e6-1944f268ce2a","Type":"ContainerStarted","Data":"a58aaa85d23b20027c0b5e8a74ca860745493d76e7b6886ff009def36d36d9fd"}
Apr 16 20:30:34.076509 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:34.076153 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-549c5f665b-98qgl"
Apr 16 20:30:34.093080 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:30:34.093024 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-549c5f665b-98qgl" podStartSLOduration=1.6608768760000001 podStartE2EDuration="2.093009469s" podCreationTimestamp="2026-04-16 20:30:32 +0000 UTC" firstStartedPulling="2026-04-16 20:30:32.984236941 +0000 UTC m=+1124.421225427" lastFinishedPulling="2026-04-16 20:30:33.416369521 +0000 UTC m=+1124.853358020" observedRunningTime="2026-04-16 20:30:34.091367019 +0000 UTC m=+1125.528355538" watchObservedRunningTime="2026-04-16 20:30:34.093009469 +0000 UTC m=+1125.529997977"
Apr 16 20:31:05.082118 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:05.082043 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-549c5f665b-98qgl"
Apr 16 20:31:05.126368 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:05.126335 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-6f8c758999-z8t8q"]
Apr 16 20:31:05.126719 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:05.126656 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/llmisvc-controller-manager-6f8c758999-z8t8q" podUID="b9555665-89e1-4156-be4c-187e0712741e" containerName="manager" containerID="cri-o://1a51da3ecf8a5c9097f993af75c6b4eaa230e07cf7df93713d68f525d12987d3" gracePeriod=30
Apr 16 20:31:05.370036 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:05.370012 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-6f8c758999-z8t8q"
Apr 16 20:31:05.447410 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:05.447380 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5w52q\" (UniqueName: \"kubernetes.io/projected/b9555665-89e1-4156-be4c-187e0712741e-kube-api-access-5w52q\") pod \"b9555665-89e1-4156-be4c-187e0712741e\" (UID: \"b9555665-89e1-4156-be4c-187e0712741e\") "
Apr 16 20:31:05.447571 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:05.447463 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9555665-89e1-4156-be4c-187e0712741e-cert\") pod \"b9555665-89e1-4156-be4c-187e0712741e\" (UID: \"b9555665-89e1-4156-be4c-187e0712741e\") "
Apr 16 20:31:05.449348 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:05.449320 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9555665-89e1-4156-be4c-187e0712741e-kube-api-access-5w52q" (OuterVolumeSpecName: "kube-api-access-5w52q") pod "b9555665-89e1-4156-be4c-187e0712741e" (UID: "b9555665-89e1-4156-be4c-187e0712741e"). InnerVolumeSpecName "kube-api-access-5w52q". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:31:05.449441 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:05.449393 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9555665-89e1-4156-be4c-187e0712741e-cert" (OuterVolumeSpecName: "cert") pod "b9555665-89e1-4156-be4c-187e0712741e" (UID: "b9555665-89e1-4156-be4c-187e0712741e"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:31:05.548100 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:05.548067 2572 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9555665-89e1-4156-be4c-187e0712741e-cert\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:31:05.548100 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:05.548093 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5w52q\" (UniqueName: \"kubernetes.io/projected/b9555665-89e1-4156-be4c-187e0712741e-kube-api-access-5w52q\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:31:06.187619 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:06.187583 2572 generic.go:358] "Generic (PLEG): container finished" podID="b9555665-89e1-4156-be4c-187e0712741e" containerID="1a51da3ecf8a5c9097f993af75c6b4eaa230e07cf7df93713d68f525d12987d3" exitCode=0
Apr 16 20:31:06.188136 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:06.187645 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-6f8c758999-z8t8q"
Apr 16 20:31:06.188136 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:06.187662 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-6f8c758999-z8t8q" event={"ID":"b9555665-89e1-4156-be4c-187e0712741e","Type":"ContainerDied","Data":"1a51da3ecf8a5c9097f993af75c6b4eaa230e07cf7df93713d68f525d12987d3"}
Apr 16 20:31:06.188136 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:06.187699 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-6f8c758999-z8t8q" event={"ID":"b9555665-89e1-4156-be4c-187e0712741e","Type":"ContainerDied","Data":"201044aeaf9c8b66f88a4940817b77657d9bee52dd1eae5fbd320f09755d718e"}
Apr 16 20:31:06.188136 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:06.187714 2572 scope.go:117] "RemoveContainer" containerID="1a51da3ecf8a5c9097f993af75c6b4eaa230e07cf7df93713d68f525d12987d3"
Apr 16 20:31:06.196527 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:06.196510 2572 scope.go:117] "RemoveContainer" containerID="1a51da3ecf8a5c9097f993af75c6b4eaa230e07cf7df93713d68f525d12987d3"
Apr 16 20:31:06.196782 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:31:06.196764 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a51da3ecf8a5c9097f993af75c6b4eaa230e07cf7df93713d68f525d12987d3\": container with ID starting with 1a51da3ecf8a5c9097f993af75c6b4eaa230e07cf7df93713d68f525d12987d3 not found: ID does not exist" containerID="1a51da3ecf8a5c9097f993af75c6b4eaa230e07cf7df93713d68f525d12987d3"
Apr 16 20:31:06.196833 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:06.196791 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a51da3ecf8a5c9097f993af75c6b4eaa230e07cf7df93713d68f525d12987d3"} err="failed to get container status \"1a51da3ecf8a5c9097f993af75c6b4eaa230e07cf7df93713d68f525d12987d3\":
rpc error: code = NotFound desc = could not find container \"1a51da3ecf8a5c9097f993af75c6b4eaa230e07cf7df93713d68f525d12987d3\": container with ID starting with 1a51da3ecf8a5c9097f993af75c6b4eaa230e07cf7df93713d68f525d12987d3 not found: ID does not exist" Apr 16 20:31:06.206316 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:06.206296 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-6f8c758999-z8t8q"] Apr 16 20:31:06.209919 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:06.209896 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/llmisvc-controller-manager-6f8c758999-z8t8q"] Apr 16 20:31:07.148984 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:07.148947 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9555665-89e1-4156-be4c-187e0712741e" path="/var/lib/kubelet/pods/b9555665-89e1-4156-be4c-187e0712741e/volumes" Apr 16 20:31:39.348037 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:39.348001 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g"] Apr 16 20:31:39.348650 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:39.348443 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b9555665-89e1-4156-be4c-187e0712741e" containerName="manager" Apr 16 20:31:39.348650 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:39.348458 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9555665-89e1-4156-be4c-187e0712741e" containerName="manager" Apr 16 20:31:39.348650 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:39.348522 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="b9555665-89e1-4156-be4c-187e0712741e" containerName="manager" Apr 16 20:31:39.351691 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:39.351669 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" Apr 16 20:31:39.354244 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:39.354224 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 20:31:39.354363 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:39.354227 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-s7skk\"" Apr 16 20:31:39.354363 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:39.354227 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-dockercfg-854t4\"" Apr 16 20:31:39.354363 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:39.354227 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 16 20:31:39.354677 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:39.354650 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 20:31:39.369888 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:39.366311 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g"] Apr 16 20:31:39.446512 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:39.446477 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cda11da4-d358-4808-bde5-06bf2a8d7ad5-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g\" (UID: \"cda11da4-d358-4808-bde5-06bf2a8d7ad5\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" Apr 16 20:31:39.446690 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:39.446539 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpsjl\" (UniqueName: \"kubernetes.io/projected/cda11da4-d358-4808-bde5-06bf2a8d7ad5-kube-api-access-cpsjl\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g\" (UID: \"cda11da4-d358-4808-bde5-06bf2a8d7ad5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" Apr 16 20:31:39.446690 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:39.446611 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/cda11da4-d358-4808-bde5-06bf2a8d7ad5-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g\" (UID: \"cda11da4-d358-4808-bde5-06bf2a8d7ad5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" Apr 16 20:31:39.446690 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:39.446646 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/cda11da4-d358-4808-bde5-06bf2a8d7ad5-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g\" (UID: \"cda11da4-d358-4808-bde5-06bf2a8d7ad5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" Apr 16 20:31:39.446690 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:39.446668 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/cda11da4-d358-4808-bde5-06bf2a8d7ad5-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g\" (UID: \"cda11da4-d358-4808-bde5-06bf2a8d7ad5\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" Apr 16 20:31:39.446826 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:39.446734 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cda11da4-d358-4808-bde5-06bf2a8d7ad5-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g\" (UID: \"cda11da4-d358-4808-bde5-06bf2a8d7ad5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" Apr 16 20:31:39.547673 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:39.547638 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cda11da4-d358-4808-bde5-06bf2a8d7ad5-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g\" (UID: \"cda11da4-d358-4808-bde5-06bf2a8d7ad5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" Apr 16 20:31:39.547843 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:39.547695 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cda11da4-d358-4808-bde5-06bf2a8d7ad5-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g\" (UID: \"cda11da4-d358-4808-bde5-06bf2a8d7ad5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" Apr 16 20:31:39.547843 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:39.547746 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cpsjl\" (UniqueName: \"kubernetes.io/projected/cda11da4-d358-4808-bde5-06bf2a8d7ad5-kube-api-access-cpsjl\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g\" (UID: \"cda11da4-d358-4808-bde5-06bf2a8d7ad5\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" Apr 16 20:31:39.547843 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:39.547777 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/cda11da4-d358-4808-bde5-06bf2a8d7ad5-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g\" (UID: \"cda11da4-d358-4808-bde5-06bf2a8d7ad5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" Apr 16 20:31:39.547843 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:39.547804 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/cda11da4-d358-4808-bde5-06bf2a8d7ad5-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g\" (UID: \"cda11da4-d358-4808-bde5-06bf2a8d7ad5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" Apr 16 20:31:39.547843 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:39.547834 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/cda11da4-d358-4808-bde5-06bf2a8d7ad5-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g\" (UID: \"cda11da4-d358-4808-bde5-06bf2a8d7ad5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" Apr 16 20:31:39.548260 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:39.548233 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cda11da4-d358-4808-bde5-06bf2a8d7ad5-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g\" (UID: \"cda11da4-d358-4808-bde5-06bf2a8d7ad5\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" Apr 16 20:31:39.548329 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:39.548263 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/cda11da4-d358-4808-bde5-06bf2a8d7ad5-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g\" (UID: \"cda11da4-d358-4808-bde5-06bf2a8d7ad5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" Apr 16 20:31:39.548329 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:39.548281 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/cda11da4-d358-4808-bde5-06bf2a8d7ad5-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g\" (UID: \"cda11da4-d358-4808-bde5-06bf2a8d7ad5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" Apr 16 20:31:39.550072 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:39.550041 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/cda11da4-d358-4808-bde5-06bf2a8d7ad5-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g\" (UID: \"cda11da4-d358-4808-bde5-06bf2a8d7ad5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" Apr 16 20:31:39.550241 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:39.550221 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cda11da4-d358-4808-bde5-06bf2a8d7ad5-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g\" (UID: \"cda11da4-d358-4808-bde5-06bf2a8d7ad5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" Apr 16 20:31:39.558548 ip-10-0-138-62 
kubenswrapper[2572]: I0416 20:31:39.558520 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpsjl\" (UniqueName: \"kubernetes.io/projected/cda11da4-d358-4808-bde5-06bf2a8d7ad5-kube-api-access-cpsjl\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g\" (UID: \"cda11da4-d358-4808-bde5-06bf2a8d7ad5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" Apr 16 20:31:39.663140 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:39.663043 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" Apr 16 20:31:39.790174 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:39.790147 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g"] Apr 16 20:31:39.792618 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:31:39.792588 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcda11da4_d358_4808_bde5_06bf2a8d7ad5.slice/crio-78ce45b33fd84e6da232c9a459150f319801af371bdcb15e76f1354e47c4c79d WatchSource:0}: Error finding container 78ce45b33fd84e6da232c9a459150f319801af371bdcb15e76f1354e47c4c79d: Status 404 returned error can't find the container with id 78ce45b33fd84e6da232c9a459150f319801af371bdcb15e76f1354e47c4c79d Apr 16 20:31:40.310825 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:40.310788 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" event={"ID":"cda11da4-d358-4808-bde5-06bf2a8d7ad5","Type":"ContainerStarted","Data":"78ce45b33fd84e6da232c9a459150f319801af371bdcb15e76f1354e47c4c79d"} Apr 16 20:31:41.315907 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:41.315853 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" event={"ID":"cda11da4-d358-4808-bde5-06bf2a8d7ad5","Type":"ContainerStarted","Data":"71f9f4dbd0fdae94c96c829650c25d7600bb29d5879e4dfc5b22e719d347e898"} Apr 16 20:31:41.316278 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:41.315978 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" Apr 16 20:31:42.320629 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:42.320587 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" event={"ID":"cda11da4-d358-4808-bde5-06bf2a8d7ad5","Type":"ContainerStarted","Data":"207fa09738ddd1211aa3858c3b606e4243d1320c34f1901f4e07887ef6183fcd"} Apr 16 20:31:45.333223 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:45.333188 2572 generic.go:358] "Generic (PLEG): container finished" podID="cda11da4-d358-4808-bde5-06bf2a8d7ad5" containerID="207fa09738ddd1211aa3858c3b606e4243d1320c34f1901f4e07887ef6183fcd" exitCode=0 Apr 16 20:31:45.333684 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:45.333267 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" event={"ID":"cda11da4-d358-4808-bde5-06bf2a8d7ad5","Type":"ContainerDied","Data":"207fa09738ddd1211aa3858c3b606e4243d1320c34f1901f4e07887ef6183fcd"} Apr 16 20:31:46.338513 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:46.338473 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" event={"ID":"cda11da4-d358-4808-bde5-06bf2a8d7ad5","Type":"ContainerStarted","Data":"349b7ed7e3793ae2b66001a61c4ce4e3556b45a826421871c6207e6926c66433"} Apr 16 20:31:46.361277 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:46.361224 2572 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" podStartSLOduration=6.535238677 podStartE2EDuration="7.361209214s" podCreationTimestamp="2026-04-16 20:31:39 +0000 UTC" firstStartedPulling="2026-04-16 20:31:39.79442318 +0000 UTC m=+1191.231411668" lastFinishedPulling="2026-04-16 20:31:40.620393702 +0000 UTC m=+1192.057382205" observedRunningTime="2026-04-16 20:31:46.359822185 +0000 UTC m=+1197.796810693" watchObservedRunningTime="2026-04-16 20:31:46.361209214 +0000 UTC m=+1197.798197722" Apr 16 20:31:49.159428 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:49.159327 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9gf47_10232a70-9b5f-414a-8efd-b5cff05a4f12/ovn-acl-logging/0.log" Apr 16 20:31:49.178800 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:49.170022 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9gf47_10232a70-9b5f-414a-8efd-b5cff05a4f12/ovn-acl-logging/0.log" Apr 16 20:31:49.664099 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:49.664062 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" Apr 16 20:31:49.664099 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:49.664105 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" Apr 16 20:31:49.665330 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:49.665296 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" podUID="cda11da4-d358-4808-bde5-06bf2a8d7ad5" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8001/health\": dial tcp 10.134.0.43:8001: connect: 
connection refused" Apr 16 20:31:51.119043 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:51.119003 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n"] Apr 16 20:31:51.125586 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:51.125560 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" Apr 16 20:31:51.127926 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:51.127902 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8de1d74aab16d9cabd8b5aafeb5248e8-kserve-self-signed-certs\"" Apr 16 20:31:51.132375 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:51.132348 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n"] Apr 16 20:31:51.148388 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:51.148361 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1c238870-e428-4fc7-9548-69965d5c1f5c-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n\" (UID: \"1c238870-e428-4fc7-9548-69965d5c1f5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" Apr 16 20:31:51.148658 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:51.148637 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1c238870-e428-4fc7-9548-69965d5c1f5c-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n\" (UID: \"1c238870-e428-4fc7-9548-69965d5c1f5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" Apr 16 20:31:51.148778 ip-10-0-138-62 
kubenswrapper[2572]: I0416 20:31:51.148761 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1c238870-e428-4fc7-9548-69965d5c1f5c-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n\" (UID: \"1c238870-e428-4fc7-9548-69965d5c1f5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" Apr 16 20:31:51.148935 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:51.148919 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gt4j\" (UniqueName: \"kubernetes.io/projected/1c238870-e428-4fc7-9548-69965d5c1f5c-kube-api-access-5gt4j\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n\" (UID: \"1c238870-e428-4fc7-9548-69965d5c1f5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" Apr 16 20:31:51.149086 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:51.149069 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1c238870-e428-4fc7-9548-69965d5c1f5c-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n\" (UID: \"1c238870-e428-4fc7-9548-69965d5c1f5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" Apr 16 20:31:51.149189 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:51.149176 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1c238870-e428-4fc7-9548-69965d5c1f5c-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n\" (UID: \"1c238870-e428-4fc7-9548-69965d5c1f5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" Apr 16 
20:31:51.250639 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:51.250605 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1c238870-e428-4fc7-9548-69965d5c1f5c-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n\" (UID: \"1c238870-e428-4fc7-9548-69965d5c1f5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" Apr 16 20:31:51.250838 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:51.250699 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1c238870-e428-4fc7-9548-69965d5c1f5c-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n\" (UID: \"1c238870-e428-4fc7-9548-69965d5c1f5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" Apr 16 20:31:51.250838 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:51.250718 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1c238870-e428-4fc7-9548-69965d5c1f5c-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n\" (UID: \"1c238870-e428-4fc7-9548-69965d5c1f5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" Apr 16 20:31:51.250838 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:51.250764 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5gt4j\" (UniqueName: \"kubernetes.io/projected/1c238870-e428-4fc7-9548-69965d5c1f5c-kube-api-access-5gt4j\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n\" (UID: \"1c238870-e428-4fc7-9548-69965d5c1f5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" Apr 16 20:31:51.250838 ip-10-0-138-62 
kubenswrapper[2572]: I0416 20:31:51.250820 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1c238870-e428-4fc7-9548-69965d5c1f5c-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n\" (UID: \"1c238870-e428-4fc7-9548-69965d5c1f5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" Apr 16 20:31:51.251103 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:51.250843 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1c238870-e428-4fc7-9548-69965d5c1f5c-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n\" (UID: \"1c238870-e428-4fc7-9548-69965d5c1f5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" Apr 16 20:31:51.251103 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:51.251083 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1c238870-e428-4fc7-9548-69965d5c1f5c-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n\" (UID: \"1c238870-e428-4fc7-9548-69965d5c1f5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" Apr 16 20:31:51.251441 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:51.251384 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1c238870-e428-4fc7-9548-69965d5c1f5c-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n\" (UID: \"1c238870-e428-4fc7-9548-69965d5c1f5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" Apr 16 20:31:51.251659 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:51.251630 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1c238870-e428-4fc7-9548-69965d5c1f5c-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n\" (UID: \"1c238870-e428-4fc7-9548-69965d5c1f5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" Apr 16 20:31:51.253428 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:51.253403 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1c238870-e428-4fc7-9548-69965d5c1f5c-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n\" (UID: \"1c238870-e428-4fc7-9548-69965d5c1f5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" Apr 16 20:31:51.253527 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:51.253494 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1c238870-e428-4fc7-9548-69965d5c1f5c-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n\" (UID: \"1c238870-e428-4fc7-9548-69965d5c1f5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" Apr 16 20:31:51.258519 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:51.258501 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gt4j\" (UniqueName: \"kubernetes.io/projected/1c238870-e428-4fc7-9548-69965d5c1f5c-kube-api-access-5gt4j\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n\" (UID: \"1c238870-e428-4fc7-9548-69965d5c1f5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" Apr 16 20:31:51.439111 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:51.439025 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" Apr 16 20:31:51.605069 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:51.605031 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n"] Apr 16 20:31:51.608641 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:31:51.608606 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c238870_e428_4fc7_9548_69965d5c1f5c.slice/crio-65abac2ca431dc85bf918289cc53c2441058106b86e53afbe63f6a2a28a67a6a WatchSource:0}: Error finding container 65abac2ca431dc85bf918289cc53c2441058106b86e53afbe63f6a2a28a67a6a: Status 404 returned error can't find the container with id 65abac2ca431dc85bf918289cc53c2441058106b86e53afbe63f6a2a28a67a6a Apr 16 20:31:52.361233 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:52.361186 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" event={"ID":"1c238870-e428-4fc7-9548-69965d5c1f5c","Type":"ContainerStarted","Data":"4c14f2edf35c0f7b2e10ba58ef3e6f268de31dd8a9c22c92c865d635c607f5df"} Apr 16 20:31:52.361633 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:52.361240 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" event={"ID":"1c238870-e428-4fc7-9548-69965d5c1f5c","Type":"ContainerStarted","Data":"65abac2ca431dc85bf918289cc53c2441058106b86e53afbe63f6a2a28a67a6a"} Apr 16 20:31:56.376300 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:56.376263 2572 generic.go:358] "Generic (PLEG): container finished" podID="1c238870-e428-4fc7-9548-69965d5c1f5c" containerID="4c14f2edf35c0f7b2e10ba58ef3e6f268de31dd8a9c22c92c865d635c607f5df" exitCode=0 Apr 16 20:31:56.376673 ip-10-0-138-62 kubenswrapper[2572]: I0416 
20:31:56.376336 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" event={"ID":"1c238870-e428-4fc7-9548-69965d5c1f5c","Type":"ContainerDied","Data":"4c14f2edf35c0f7b2e10ba58ef3e6f268de31dd8a9c22c92c865d635c607f5df"} Apr 16 20:31:57.382481 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:57.382446 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" event={"ID":"1c238870-e428-4fc7-9548-69965d5c1f5c","Type":"ContainerStarted","Data":"5a049282b69ef2da71d3f9a5793846f8963dda740949237241acdaca7109e107"} Apr 16 20:31:57.403203 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:57.403146 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" podStartSLOduration=6.403124509 podStartE2EDuration="6.403124509s" podCreationTimestamp="2026-04-16 20:31:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:31:57.40139606 +0000 UTC m=+1208.838384568" watchObservedRunningTime="2026-04-16 20:31:57.403124509 +0000 UTC m=+1208.840113018" Apr 16 20:31:59.664235 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:59.664176 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" podUID="cda11da4-d358-4808-bde5-06bf2a8d7ad5" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8001/health\": dial tcp 10.134.0.43:8001: connect: connection refused" Apr 16 20:31:59.682544 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:31:59.682514 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" Apr 16 
20:32:01.439796 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:32:01.439755 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" Apr 16 20:32:01.440267 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:32:01.439806 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" Apr 16 20:32:01.441700 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:32:01.441666 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" podUID="1c238870-e428-4fc7-9548-69965d5c1f5c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8000/health\": dial tcp 10.134.0.44:8000: connect: connection refused" Apr 16 20:32:09.664170 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:32:09.664108 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" podUID="cda11da4-d358-4808-bde5-06bf2a8d7ad5" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8001/health\": dial tcp 10.134.0.43:8001: connect: connection refused" Apr 16 20:32:11.439540 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:32:11.439491 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" podUID="1c238870-e428-4fc7-9548-69965d5c1f5c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8000/health\": dial tcp 10.134.0.44:8000: connect: connection refused" Apr 16 20:32:19.663976 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:32:19.663920 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" 
podUID="cda11da4-d358-4808-bde5-06bf2a8d7ad5" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8001/health\": dial tcp 10.134.0.43:8001: connect: connection refused" Apr 16 20:32:21.440256 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:32:21.440216 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" podUID="1c238870-e428-4fc7-9548-69965d5c1f5c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8000/health\": dial tcp 10.134.0.44:8000: connect: connection refused" Apr 16 20:32:29.664228 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:32:29.664125 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" podUID="cda11da4-d358-4808-bde5-06bf2a8d7ad5" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8001/health\": dial tcp 10.134.0.43:8001: connect: connection refused" Apr 16 20:32:31.439697 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:32:31.439645 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" podUID="1c238870-e428-4fc7-9548-69965d5c1f5c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8000/health\": dial tcp 10.134.0.44:8000: connect: connection refused" Apr 16 20:32:39.663628 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:32:39.663579 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" podUID="cda11da4-d358-4808-bde5-06bf2a8d7ad5" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8001/health\": dial tcp 10.134.0.43:8001: connect: connection refused" Apr 16 20:32:41.439591 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:32:41.439542 2572 prober.go:120] "Probe failed" 
probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" podUID="1c238870-e428-4fc7-9548-69965d5c1f5c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8000/health\": dial tcp 10.134.0.44:8000: connect: connection refused" Apr 16 20:32:49.663616 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:32:49.663570 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" podUID="cda11da4-d358-4808-bde5-06bf2a8d7ad5" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8001/health\": dial tcp 10.134.0.43:8001: connect: connection refused" Apr 16 20:32:51.439819 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:32:51.439771 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" podUID="1c238870-e428-4fc7-9548-69965d5c1f5c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8000/health\": dial tcp 10.134.0.44:8000: connect: connection refused" Apr 16 20:32:59.663780 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:32:59.663726 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" podUID="cda11da4-d358-4808-bde5-06bf2a8d7ad5" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8001/health\": dial tcp 10.134.0.43:8001: connect: connection refused" Apr 16 20:33:01.439413 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:33:01.439368 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" podUID="1c238870-e428-4fc7-9548-69965d5c1f5c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8000/health\": dial tcp 10.134.0.44:8000: connect: connection refused" Apr 16 
20:33:09.664264 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:33:09.664223 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" podUID="cda11da4-d358-4808-bde5-06bf2a8d7ad5" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8001/health\": dial tcp 10.134.0.43:8001: connect: connection refused" Apr 16 20:33:11.440139 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:33:11.440096 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" podUID="1c238870-e428-4fc7-9548-69965d5c1f5c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8000/health\": dial tcp 10.134.0.44:8000: connect: connection refused" Apr 16 20:33:19.664490 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:33:19.664441 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" podUID="cda11da4-d358-4808-bde5-06bf2a8d7ad5" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8001/health\": dial tcp 10.134.0.43:8001: connect: connection refused" Apr 16 20:33:21.440259 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:33:21.440216 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" podUID="1c238870-e428-4fc7-9548-69965d5c1f5c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8000/health\": dial tcp 10.134.0.44:8000: connect: connection refused" Apr 16 20:33:29.663523 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:33:29.663467 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" podUID="cda11da4-d358-4808-bde5-06bf2a8d7ad5" containerName="main" probeResult="failure" output="Get 
\"https://10.134.0.43:8001/health\": dial tcp 10.134.0.43:8001: connect: connection refused" Apr 16 20:33:31.440368 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:33:31.440310 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" podUID="1c238870-e428-4fc7-9548-69965d5c1f5c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8000/health\": dial tcp 10.134.0.44:8000: connect: connection refused" Apr 16 20:33:39.664099 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:33:39.664051 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" podUID="cda11da4-d358-4808-bde5-06bf2a8d7ad5" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8001/health\": dial tcp 10.134.0.43:8001: connect: connection refused" Apr 16 20:33:41.440185 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:33:41.440134 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" podUID="1c238870-e428-4fc7-9548-69965d5c1f5c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8000/health\": dial tcp 10.134.0.44:8000: connect: connection refused" Apr 16 20:33:49.673634 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:33:49.673598 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" Apr 16 20:33:49.692926 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:33:49.692899 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" Apr 16 20:33:51.439810 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:33:51.439756 2572 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" podUID="1c238870-e428-4fc7-9548-69965d5c1f5c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8000/health\": dial tcp 10.134.0.44:8000: connect: connection refused" Apr 16 20:34:01.451089 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:01.450986 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" Apr 16 20:34:01.467065 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:01.467033 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" Apr 16 20:34:02.437408 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:02.437373 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g"] Apr 16 20:34:02.437884 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:02.437826 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" podUID="cda11da4-d358-4808-bde5-06bf2a8d7ad5" containerName="main" containerID="cri-o://349b7ed7e3793ae2b66001a61c4ce4e3556b45a826421871c6207e6926c66433" gracePeriod=30 Apr 16 20:34:20.593808 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:20.593772 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n"] Apr 16 20:34:20.594287 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:20.594137 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" podUID="1c238870-e428-4fc7-9548-69965d5c1f5c" containerName="main" 
containerID="cri-o://5a049282b69ef2da71d3f9a5793846f8963dda740949237241acdaca7109e107" gracePeriod=30 Apr 16 20:34:24.921546 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:24.921509 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 16 20:34:24.926415 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:24.926395 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:34:24.928455 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:24.928430 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-xl9fd\"" Apr 16 20:34:24.929022 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:24.929006 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 16 20:34:24.935194 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:24.935163 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 16 20:34:25.103469 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:25.103427 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6n5c\" (UniqueName: \"kubernetes.io/projected/e60b960c-2279-4ca9-b3fb-f22f2ec6f30c-kube-api-access-t6n5c\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"e60b960c-2279-4ca9-b3fb-f22f2ec6f30c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:34:25.103641 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:25.103524 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/e60b960c-2279-4ca9-b3fb-f22f2ec6f30c-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"e60b960c-2279-4ca9-b3fb-f22f2ec6f30c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:34:25.103641 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:25.103595 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e60b960c-2279-4ca9-b3fb-f22f2ec6f30c-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"e60b960c-2279-4ca9-b3fb-f22f2ec6f30c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:34:25.103641 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:25.103632 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e60b960c-2279-4ca9-b3fb-f22f2ec6f30c-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"e60b960c-2279-4ca9-b3fb-f22f2ec6f30c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:34:25.103785 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:25.103661 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e60b960c-2279-4ca9-b3fb-f22f2ec6f30c-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"e60b960c-2279-4ca9-b3fb-f22f2ec6f30c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:34:25.103785 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:25.103717 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e60b960c-2279-4ca9-b3fb-f22f2ec6f30c-model-cache\") pod 
\"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"e60b960c-2279-4ca9-b3fb-f22f2ec6f30c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:34:25.204999 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:25.204912 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e60b960c-2279-4ca9-b3fb-f22f2ec6f30c-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"e60b960c-2279-4ca9-b3fb-f22f2ec6f30c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:34:25.204999 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:25.204980 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e60b960c-2279-4ca9-b3fb-f22f2ec6f30c-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"e60b960c-2279-4ca9-b3fb-f22f2ec6f30c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:34:25.205195 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:25.205006 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e60b960c-2279-4ca9-b3fb-f22f2ec6f30c-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"e60b960c-2279-4ca9-b3fb-f22f2ec6f30c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:34:25.205195 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:25.205026 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e60b960c-2279-4ca9-b3fb-f22f2ec6f30c-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"e60b960c-2279-4ca9-b3fb-f22f2ec6f30c\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:34:25.205195 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:25.205068 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e60b960c-2279-4ca9-b3fb-f22f2ec6f30c-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"e60b960c-2279-4ca9-b3fb-f22f2ec6f30c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:34:25.205195 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:25.205139 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t6n5c\" (UniqueName: \"kubernetes.io/projected/e60b960c-2279-4ca9-b3fb-f22f2ec6f30c-kube-api-access-t6n5c\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"e60b960c-2279-4ca9-b3fb-f22f2ec6f30c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:34:25.205344 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:25.205323 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e60b960c-2279-4ca9-b3fb-f22f2ec6f30c-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"e60b960c-2279-4ca9-b3fb-f22f2ec6f30c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:34:25.205403 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:25.205378 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e60b960c-2279-4ca9-b3fb-f22f2ec6f30c-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"e60b960c-2279-4ca9-b3fb-f22f2ec6f30c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:34:25.205541 ip-10-0-138-62 kubenswrapper[2572]: I0416 
20:34:25.205521 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e60b960c-2279-4ca9-b3fb-f22f2ec6f30c-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"e60b960c-2279-4ca9-b3fb-f22f2ec6f30c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:34:25.207280 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:25.207262 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e60b960c-2279-4ca9-b3fb-f22f2ec6f30c-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"e60b960c-2279-4ca9-b3fb-f22f2ec6f30c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:34:25.207364 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:25.207346 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e60b960c-2279-4ca9-b3fb-f22f2ec6f30c-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"e60b960c-2279-4ca9-b3fb-f22f2ec6f30c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:34:25.213013 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:25.212989 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6n5c\" (UniqueName: \"kubernetes.io/projected/e60b960c-2279-4ca9-b3fb-f22f2ec6f30c-kube-api-access-t6n5c\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"e60b960c-2279-4ca9-b3fb-f22f2ec6f30c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:34:25.237772 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:25.237748 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:34:25.364179 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:25.364026 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 16 20:34:25.366492 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:34:25.366465 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode60b960c_2279_4ca9_b3fb_f22f2ec6f30c.slice/crio-05d9671c08ca3f2624e432d1ae5a5c696dcf778cbb0dea0cc058ac87946560c8 WatchSource:0}: Error finding container 05d9671c08ca3f2624e432d1ae5a5c696dcf778cbb0dea0cc058ac87946560c8: Status 404 returned error can't find the container with id 05d9671c08ca3f2624e432d1ae5a5c696dcf778cbb0dea0cc058ac87946560c8 Apr 16 20:34:25.955054 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:25.955014 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"e60b960c-2279-4ca9-b3fb-f22f2ec6f30c","Type":"ContainerStarted","Data":"80e2458232e32785b982dde33f2700a533ee59b0cf53d4fc943cb88988be456a"} Apr 16 20:34:25.955054 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:25.955054 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"e60b960c-2279-4ca9-b3fb-f22f2ec6f30c","Type":"ContainerStarted","Data":"05d9671c08ca3f2624e432d1ae5a5c696dcf778cbb0dea0cc058ac87946560c8"} Apr 16 20:34:29.969652 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:29.969618 2572 generic.go:358] "Generic (PLEG): container finished" podID="e60b960c-2279-4ca9-b3fb-f22f2ec6f30c" containerID="80e2458232e32785b982dde33f2700a533ee59b0cf53d4fc943cb88988be456a" exitCode=0 Apr 16 20:34:29.970045 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:29.969670 2572 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"e60b960c-2279-4ca9-b3fb-f22f2ec6f30c","Type":"ContainerDied","Data":"80e2458232e32785b982dde33f2700a533ee59b0cf53d4fc943cb88988be456a"} Apr 16 20:34:30.974818 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:30.974781 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"e60b960c-2279-4ca9-b3fb-f22f2ec6f30c","Type":"ContainerStarted","Data":"57a0a6cfc1a7bf5934cca5435987ddbcf9e287b9eb4ee42c43c56e984fe3b411"} Apr 16 20:34:30.995131 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:30.995072 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podStartSLOduration=6.995052619 podStartE2EDuration="6.995052619s" podCreationTimestamp="2026-04-16 20:34:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:34:30.991480509 +0000 UTC m=+1362.428469027" watchObservedRunningTime="2026-04-16 20:34:30.995052619 +0000 UTC m=+1362.432041127" Apr 16 20:34:32.438560 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:32.438504 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" podUID="cda11da4-d358-4808-bde5-06bf2a8d7ad5" containerName="llm-d-routing-sidecar" containerID="cri-o://71f9f4dbd0fdae94c96c829650c25d7600bb29d5879e4dfc5b22e719d347e898" gracePeriod=2 Apr 16 20:34:32.696649 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:32.696575 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g_cda11da4-d358-4808-bde5-06bf2a8d7ad5/main/0.log" Apr 16 20:34:32.697337 ip-10-0-138-62 
kubenswrapper[2572]: I0416 20:34:32.697316 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" Apr 16 20:34:32.786531 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:32.786497 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cda11da4-d358-4808-bde5-06bf2a8d7ad5-tls-certs\") pod \"cda11da4-d358-4808-bde5-06bf2a8d7ad5\" (UID: \"cda11da4-d358-4808-bde5-06bf2a8d7ad5\") " Apr 16 20:34:32.786685 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:32.786553 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/cda11da4-d358-4808-bde5-06bf2a8d7ad5-dshm\") pod \"cda11da4-d358-4808-bde5-06bf2a8d7ad5\" (UID: \"cda11da4-d358-4808-bde5-06bf2a8d7ad5\") " Apr 16 20:34:32.786685 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:32.786590 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/cda11da4-d358-4808-bde5-06bf2a8d7ad5-model-cache\") pod \"cda11da4-d358-4808-bde5-06bf2a8d7ad5\" (UID: \"cda11da4-d358-4808-bde5-06bf2a8d7ad5\") " Apr 16 20:34:32.786685 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:32.786624 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/cda11da4-d358-4808-bde5-06bf2a8d7ad5-home\") pod \"cda11da4-d358-4808-bde5-06bf2a8d7ad5\" (UID: \"cda11da4-d358-4808-bde5-06bf2a8d7ad5\") " Apr 16 20:34:32.786685 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:32.786649 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpsjl\" (UniqueName: \"kubernetes.io/projected/cda11da4-d358-4808-bde5-06bf2a8d7ad5-kube-api-access-cpsjl\") pod \"cda11da4-d358-4808-bde5-06bf2a8d7ad5\" (UID: 
\"cda11da4-d358-4808-bde5-06bf2a8d7ad5\") "
Apr 16 20:34:32.786685 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:32.786682 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cda11da4-d358-4808-bde5-06bf2a8d7ad5-kserve-provision-location\") pod \"cda11da4-d358-4808-bde5-06bf2a8d7ad5\" (UID: \"cda11da4-d358-4808-bde5-06bf2a8d7ad5\") "
Apr 16 20:34:32.787017 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:32.786969 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cda11da4-d358-4808-bde5-06bf2a8d7ad5-model-cache" (OuterVolumeSpecName: "model-cache") pod "cda11da4-d358-4808-bde5-06bf2a8d7ad5" (UID: "cda11da4-d358-4808-bde5-06bf2a8d7ad5"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:34:32.787167 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:32.787130 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cda11da4-d358-4808-bde5-06bf2a8d7ad5-home" (OuterVolumeSpecName: "home") pod "cda11da4-d358-4808-bde5-06bf2a8d7ad5" (UID: "cda11da4-d358-4808-bde5-06bf2a8d7ad5"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:34:32.788972 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:32.788939 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cda11da4-d358-4808-bde5-06bf2a8d7ad5-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "cda11da4-d358-4808-bde5-06bf2a8d7ad5" (UID: "cda11da4-d358-4808-bde5-06bf2a8d7ad5"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:34:32.789098 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:32.789053 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cda11da4-d358-4808-bde5-06bf2a8d7ad5-kube-api-access-cpsjl" (OuterVolumeSpecName: "kube-api-access-cpsjl") pod "cda11da4-d358-4808-bde5-06bf2a8d7ad5" (UID: "cda11da4-d358-4808-bde5-06bf2a8d7ad5"). InnerVolumeSpecName "kube-api-access-cpsjl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:34:32.789512 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:32.789480 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cda11da4-d358-4808-bde5-06bf2a8d7ad5-dshm" (OuterVolumeSpecName: "dshm") pod "cda11da4-d358-4808-bde5-06bf2a8d7ad5" (UID: "cda11da4-d358-4808-bde5-06bf2a8d7ad5"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:34:32.851753 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:32.851707 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cda11da4-d358-4808-bde5-06bf2a8d7ad5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "cda11da4-d358-4808-bde5-06bf2a8d7ad5" (UID: "cda11da4-d358-4808-bde5-06bf2a8d7ad5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:34:32.887668 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:32.887636 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/cda11da4-d358-4808-bde5-06bf2a8d7ad5-dshm\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:34:32.887668 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:32.887666 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/cda11da4-d358-4808-bde5-06bf2a8d7ad5-model-cache\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:34:32.887668 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:32.887678 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/cda11da4-d358-4808-bde5-06bf2a8d7ad5-home\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:34:32.887933 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:32.887688 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cpsjl\" (UniqueName: \"kubernetes.io/projected/cda11da4-d358-4808-bde5-06bf2a8d7ad5-kube-api-access-cpsjl\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:34:32.887933 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:32.887698 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cda11da4-d358-4808-bde5-06bf2a8d7ad5-kserve-provision-location\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:34:32.887933 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:32.887707 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cda11da4-d358-4808-bde5-06bf2a8d7ad5-tls-certs\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:34:32.982883 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:32.982788 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g_cda11da4-d358-4808-bde5-06bf2a8d7ad5/main/0.log"
Apr 16 20:34:32.983537 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:32.983508 2572 generic.go:358] "Generic (PLEG): container finished" podID="cda11da4-d358-4808-bde5-06bf2a8d7ad5" containerID="349b7ed7e3793ae2b66001a61c4ce4e3556b45a826421871c6207e6926c66433" exitCode=137
Apr 16 20:34:32.983537 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:32.983534 2572 generic.go:358] "Generic (PLEG): container finished" podID="cda11da4-d358-4808-bde5-06bf2a8d7ad5" containerID="71f9f4dbd0fdae94c96c829650c25d7600bb29d5879e4dfc5b22e719d347e898" exitCode=0
Apr 16 20:34:32.983674 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:32.983592 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" event={"ID":"cda11da4-d358-4808-bde5-06bf2a8d7ad5","Type":"ContainerDied","Data":"349b7ed7e3793ae2b66001a61c4ce4e3556b45a826421871c6207e6926c66433"}
Apr 16 20:34:32.983674 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:32.983605 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g"
Apr 16 20:34:32.983674 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:32.983630 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" event={"ID":"cda11da4-d358-4808-bde5-06bf2a8d7ad5","Type":"ContainerDied","Data":"71f9f4dbd0fdae94c96c829650c25d7600bb29d5879e4dfc5b22e719d347e898"}
Apr 16 20:34:32.983674 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:32.983646 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g" event={"ID":"cda11da4-d358-4808-bde5-06bf2a8d7ad5","Type":"ContainerDied","Data":"78ce45b33fd84e6da232c9a459150f319801af371bdcb15e76f1354e47c4c79d"}
Apr 16 20:34:32.983674 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:32.983661 2572 scope.go:117] "RemoveContainer" containerID="349b7ed7e3793ae2b66001a61c4ce4e3556b45a826421871c6207e6926c66433"
Apr 16 20:34:33.006803 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:33.006029 2572 scope.go:117] "RemoveContainer" containerID="207fa09738ddd1211aa3858c3b606e4243d1320c34f1901f4e07887ef6183fcd"
Apr 16 20:34:33.009291 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:33.009265 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g"]
Apr 16 20:34:33.012979 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:33.012954 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-647dc8b7b-4pv8g"]
Apr 16 20:34:33.068741 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:33.068719 2572 scope.go:117] "RemoveContainer" containerID="71f9f4dbd0fdae94c96c829650c25d7600bb29d5879e4dfc5b22e719d347e898"
Apr 16 20:34:33.076601 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:33.076577 2572 scope.go:117] "RemoveContainer" containerID="349b7ed7e3793ae2b66001a61c4ce4e3556b45a826421871c6207e6926c66433"
Apr 16 20:34:33.076907 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:34:33.076882 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"349b7ed7e3793ae2b66001a61c4ce4e3556b45a826421871c6207e6926c66433\": container with ID starting with 349b7ed7e3793ae2b66001a61c4ce4e3556b45a826421871c6207e6926c66433 not found: ID does not exist" containerID="349b7ed7e3793ae2b66001a61c4ce4e3556b45a826421871c6207e6926c66433"
Apr 16 20:34:33.076961 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:33.076923 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"349b7ed7e3793ae2b66001a61c4ce4e3556b45a826421871c6207e6926c66433"} err="failed to get container status \"349b7ed7e3793ae2b66001a61c4ce4e3556b45a826421871c6207e6926c66433\": rpc error: code = NotFound desc = could not find container \"349b7ed7e3793ae2b66001a61c4ce4e3556b45a826421871c6207e6926c66433\": container with ID starting with 349b7ed7e3793ae2b66001a61c4ce4e3556b45a826421871c6207e6926c66433 not found: ID does not exist"
Apr 16 20:34:33.076961 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:33.076948 2572 scope.go:117] "RemoveContainer" containerID="207fa09738ddd1211aa3858c3b606e4243d1320c34f1901f4e07887ef6183fcd"
Apr 16 20:34:33.077204 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:34:33.077189 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"207fa09738ddd1211aa3858c3b606e4243d1320c34f1901f4e07887ef6183fcd\": container with ID starting with 207fa09738ddd1211aa3858c3b606e4243d1320c34f1901f4e07887ef6183fcd not found: ID does not exist" containerID="207fa09738ddd1211aa3858c3b606e4243d1320c34f1901f4e07887ef6183fcd"
Apr 16 20:34:33.077252 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:33.077209 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"207fa09738ddd1211aa3858c3b606e4243d1320c34f1901f4e07887ef6183fcd"} err="failed to get container status \"207fa09738ddd1211aa3858c3b606e4243d1320c34f1901f4e07887ef6183fcd\": rpc error: code = NotFound desc = could not find container \"207fa09738ddd1211aa3858c3b606e4243d1320c34f1901f4e07887ef6183fcd\": container with ID starting with 207fa09738ddd1211aa3858c3b606e4243d1320c34f1901f4e07887ef6183fcd not found: ID does not exist"
Apr 16 20:34:33.077252 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:33.077227 2572 scope.go:117] "RemoveContainer" containerID="71f9f4dbd0fdae94c96c829650c25d7600bb29d5879e4dfc5b22e719d347e898"
Apr 16 20:34:33.077433 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:34:33.077418 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71f9f4dbd0fdae94c96c829650c25d7600bb29d5879e4dfc5b22e719d347e898\": container with ID starting with 71f9f4dbd0fdae94c96c829650c25d7600bb29d5879e4dfc5b22e719d347e898 not found: ID does not exist" containerID="71f9f4dbd0fdae94c96c829650c25d7600bb29d5879e4dfc5b22e719d347e898"
Apr 16 20:34:33.077484 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:33.077435 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71f9f4dbd0fdae94c96c829650c25d7600bb29d5879e4dfc5b22e719d347e898"} err="failed to get container status \"71f9f4dbd0fdae94c96c829650c25d7600bb29d5879e4dfc5b22e719d347e898\": rpc error: code = NotFound desc = could not find container \"71f9f4dbd0fdae94c96c829650c25d7600bb29d5879e4dfc5b22e719d347e898\": container with ID starting with 71f9f4dbd0fdae94c96c829650c25d7600bb29d5879e4dfc5b22e719d347e898 not found: ID does not exist"
Apr 16 20:34:33.077484 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:33.077447 2572 scope.go:117] "RemoveContainer" containerID="349b7ed7e3793ae2b66001a61c4ce4e3556b45a826421871c6207e6926c66433"
Apr 16 20:34:33.077665 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:33.077640 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"349b7ed7e3793ae2b66001a61c4ce4e3556b45a826421871c6207e6926c66433"} err="failed to get container status \"349b7ed7e3793ae2b66001a61c4ce4e3556b45a826421871c6207e6926c66433\": rpc error: code = NotFound desc = could not find container \"349b7ed7e3793ae2b66001a61c4ce4e3556b45a826421871c6207e6926c66433\": container with ID starting with 349b7ed7e3793ae2b66001a61c4ce4e3556b45a826421871c6207e6926c66433 not found: ID does not exist"
Apr 16 20:34:33.077737 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:33.077668 2572 scope.go:117] "RemoveContainer" containerID="207fa09738ddd1211aa3858c3b606e4243d1320c34f1901f4e07887ef6183fcd"
Apr 16 20:34:33.077904 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:33.077855 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"207fa09738ddd1211aa3858c3b606e4243d1320c34f1901f4e07887ef6183fcd"} err="failed to get container status \"207fa09738ddd1211aa3858c3b606e4243d1320c34f1901f4e07887ef6183fcd\": rpc error: code = NotFound desc = could not find container \"207fa09738ddd1211aa3858c3b606e4243d1320c34f1901f4e07887ef6183fcd\": container with ID starting with 207fa09738ddd1211aa3858c3b606e4243d1320c34f1901f4e07887ef6183fcd not found: ID does not exist"
Apr 16 20:34:33.077904 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:33.077904 2572 scope.go:117] "RemoveContainer" containerID="71f9f4dbd0fdae94c96c829650c25d7600bb29d5879e4dfc5b22e719d347e898"
Apr 16 20:34:33.078148 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:33.078133 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71f9f4dbd0fdae94c96c829650c25d7600bb29d5879e4dfc5b22e719d347e898"} err="failed to get container status \"71f9f4dbd0fdae94c96c829650c25d7600bb29d5879e4dfc5b22e719d347e898\": rpc error: code = NotFound desc = could not find container \"71f9f4dbd0fdae94c96c829650c25d7600bb29d5879e4dfc5b22e719d347e898\": container with ID starting with 71f9f4dbd0fdae94c96c829650c25d7600bb29d5879e4dfc5b22e719d347e898 not found: ID does not exist"
Apr 16 20:34:33.149521 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:33.149477 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cda11da4-d358-4808-bde5-06bf2a8d7ad5" path="/var/lib/kubelet/pods/cda11da4-d358-4808-bde5-06bf2a8d7ad5/volumes"
Apr 16 20:34:35.238640 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:35.238603 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 20:34:35.240129 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:35.240099 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="e60b960c-2279-4ca9-b3fb-f22f2ec6f30c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8000/health\": dial tcp 10.134.0.45:8000: connect: connection refused"
Apr 16 20:34:45.239150 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:45.239102 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="e60b960c-2279-4ca9-b3fb-f22f2ec6f30c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8000/health\": dial tcp 10.134.0.45:8000: connect: connection refused"
Apr 16 20:34:50.854491 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:50.854423 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n_1c238870-e428-4fc7-9548-69965d5c1f5c/main/0.log"
Apr 16 20:34:50.854865 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:50.854847 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n"
Apr 16 20:34:50.964375 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:50.964341 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1c238870-e428-4fc7-9548-69965d5c1f5c-model-cache\") pod \"1c238870-e428-4fc7-9548-69965d5c1f5c\" (UID: \"1c238870-e428-4fc7-9548-69965d5c1f5c\") "
Apr 16 20:34:50.964571 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:50.964392 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1c238870-e428-4fc7-9548-69965d5c1f5c-kserve-provision-location\") pod \"1c238870-e428-4fc7-9548-69965d5c1f5c\" (UID: \"1c238870-e428-4fc7-9548-69965d5c1f5c\") "
Apr 16 20:34:50.964571 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:50.964441 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1c238870-e428-4fc7-9548-69965d5c1f5c-tls-certs\") pod \"1c238870-e428-4fc7-9548-69965d5c1f5c\" (UID: \"1c238870-e428-4fc7-9548-69965d5c1f5c\") "
Apr 16 20:34:50.964571 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:50.964469 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1c238870-e428-4fc7-9548-69965d5c1f5c-home\") pod \"1c238870-e428-4fc7-9548-69965d5c1f5c\" (UID: \"1c238870-e428-4fc7-9548-69965d5c1f5c\") "
Apr 16 20:34:50.964571 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:50.964540 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1c238870-e428-4fc7-9548-69965d5c1f5c-dshm\") pod \"1c238870-e428-4fc7-9548-69965d5c1f5c\" (UID: \"1c238870-e428-4fc7-9548-69965d5c1f5c\") "
Apr 16 20:34:50.964804 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:50.964597 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gt4j\" (UniqueName: \"kubernetes.io/projected/1c238870-e428-4fc7-9548-69965d5c1f5c-kube-api-access-5gt4j\") pod \"1c238870-e428-4fc7-9548-69965d5c1f5c\" (UID: \"1c238870-e428-4fc7-9548-69965d5c1f5c\") "
Apr 16 20:34:50.964804 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:50.964690 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c238870-e428-4fc7-9548-69965d5c1f5c-model-cache" (OuterVolumeSpecName: "model-cache") pod "1c238870-e428-4fc7-9548-69965d5c1f5c" (UID: "1c238870-e428-4fc7-9548-69965d5c1f5c"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:34:50.964977 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:50.964945 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c238870-e428-4fc7-9548-69965d5c1f5c-home" (OuterVolumeSpecName: "home") pod "1c238870-e428-4fc7-9548-69965d5c1f5c" (UID: "1c238870-e428-4fc7-9548-69965d5c1f5c"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:34:50.965103 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:50.964949 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1c238870-e428-4fc7-9548-69965d5c1f5c-model-cache\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:34:50.966792 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:50.966760 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c238870-e428-4fc7-9548-69965d5c1f5c-kube-api-access-5gt4j" (OuterVolumeSpecName: "kube-api-access-5gt4j") pod "1c238870-e428-4fc7-9548-69965d5c1f5c" (UID: "1c238870-e428-4fc7-9548-69965d5c1f5c"). InnerVolumeSpecName "kube-api-access-5gt4j". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:34:50.967154 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:50.967123 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c238870-e428-4fc7-9548-69965d5c1f5c-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "1c238870-e428-4fc7-9548-69965d5c1f5c" (UID: "1c238870-e428-4fc7-9548-69965d5c1f5c"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:34:50.967245 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:50.967140 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c238870-e428-4fc7-9548-69965d5c1f5c-dshm" (OuterVolumeSpecName: "dshm") pod "1c238870-e428-4fc7-9548-69965d5c1f5c" (UID: "1c238870-e428-4fc7-9548-69965d5c1f5c"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:34:51.019411 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:51.019373 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c238870-e428-4fc7-9548-69965d5c1f5c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1c238870-e428-4fc7-9548-69965d5c1f5c" (UID: "1c238870-e428-4fc7-9548-69965d5c1f5c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:34:51.046949 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:51.046925 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n_1c238870-e428-4fc7-9548-69965d5c1f5c/main/0.log"
Apr 16 20:34:51.047350 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:51.047319 2572 generic.go:358] "Generic (PLEG): container finished" podID="1c238870-e428-4fc7-9548-69965d5c1f5c" containerID="5a049282b69ef2da71d3f9a5793846f8963dda740949237241acdaca7109e107" exitCode=137
Apr 16 20:34:51.047424 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:51.047395 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n"
Apr 16 20:34:51.047424 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:51.047402 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" event={"ID":"1c238870-e428-4fc7-9548-69965d5c1f5c","Type":"ContainerDied","Data":"5a049282b69ef2da71d3f9a5793846f8963dda740949237241acdaca7109e107"}
Apr 16 20:34:51.047534 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:51.047450 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n" event={"ID":"1c238870-e428-4fc7-9548-69965d5c1f5c","Type":"ContainerDied","Data":"65abac2ca431dc85bf918289cc53c2441058106b86e53afbe63f6a2a28a67a6a"}
Apr 16 20:34:51.047534 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:51.047473 2572 scope.go:117] "RemoveContainer" containerID="5a049282b69ef2da71d3f9a5793846f8963dda740949237241acdaca7109e107"
Apr 16 20:34:51.066113 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:51.066094 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1c238870-e428-4fc7-9548-69965d5c1f5c-dshm\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:34:51.066203 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:51.066116 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5gt4j\" (UniqueName: \"kubernetes.io/projected/1c238870-e428-4fc7-9548-69965d5c1f5c-kube-api-access-5gt4j\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:34:51.066203 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:51.066126 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1c238870-e428-4fc7-9548-69965d5c1f5c-kserve-provision-location\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:34:51.066203 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:51.066137 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1c238870-e428-4fc7-9548-69965d5c1f5c-tls-certs\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:34:51.066203 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:51.066146 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1c238870-e428-4fc7-9548-69965d5c1f5c-home\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:34:51.071990 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:51.071974 2572 scope.go:117] "RemoveContainer" containerID="4c14f2edf35c0f7b2e10ba58ef3e6f268de31dd8a9c22c92c865d635c607f5df"
Apr 16 20:34:51.084598 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:51.084563 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n"]
Apr 16 20:34:51.086609 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:51.086587 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6c7fc495bczt47n"]
Apr 16 20:34:51.138360 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:51.138335 2572 scope.go:117] "RemoveContainer" containerID="5a049282b69ef2da71d3f9a5793846f8963dda740949237241acdaca7109e107"
Apr 16 20:34:51.138700 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:34:51.138671 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a049282b69ef2da71d3f9a5793846f8963dda740949237241acdaca7109e107\": container with ID starting with 5a049282b69ef2da71d3f9a5793846f8963dda740949237241acdaca7109e107 not found: ID does not exist" containerID="5a049282b69ef2da71d3f9a5793846f8963dda740949237241acdaca7109e107"
Apr 16 20:34:51.138743 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:51.138713 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a049282b69ef2da71d3f9a5793846f8963dda740949237241acdaca7109e107"} err="failed to get container status \"5a049282b69ef2da71d3f9a5793846f8963dda740949237241acdaca7109e107\": rpc error: code = NotFound desc = could not find container \"5a049282b69ef2da71d3f9a5793846f8963dda740949237241acdaca7109e107\": container with ID starting with 5a049282b69ef2da71d3f9a5793846f8963dda740949237241acdaca7109e107 not found: ID does not exist"
Apr 16 20:34:51.138743 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:51.138734 2572 scope.go:117] "RemoveContainer" containerID="4c14f2edf35c0f7b2e10ba58ef3e6f268de31dd8a9c22c92c865d635c607f5df"
Apr 16 20:34:51.139069 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:34:51.139048 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c14f2edf35c0f7b2e10ba58ef3e6f268de31dd8a9c22c92c865d635c607f5df\": container with ID starting with 4c14f2edf35c0f7b2e10ba58ef3e6f268de31dd8a9c22c92c865d635c607f5df not found: ID does not exist" containerID="4c14f2edf35c0f7b2e10ba58ef3e6f268de31dd8a9c22c92c865d635c607f5df"
Apr 16 20:34:51.139155 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:51.139077 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c14f2edf35c0f7b2e10ba58ef3e6f268de31dd8a9c22c92c865d635c607f5df"} err="failed to get container status \"4c14f2edf35c0f7b2e10ba58ef3e6f268de31dd8a9c22c92c865d635c607f5df\": rpc error: code = NotFound desc = could not find container \"4c14f2edf35c0f7b2e10ba58ef3e6f268de31dd8a9c22c92c865d635c607f5df\": container with ID starting with 4c14f2edf35c0f7b2e10ba58ef3e6f268de31dd8a9c22c92c865d635c607f5df not found: ID does not exist"
Apr 16 20:34:51.148989 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:51.148964 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c238870-e428-4fc7-9548-69965d5c1f5c" path="/var/lib/kubelet/pods/1c238870-e428-4fc7-9548-69965d5c1f5c/volumes"
Apr 16 20:34:55.238211 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:55.238167 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 20:34:55.238621 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:34:55.238493 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="e60b960c-2279-4ca9-b3fb-f22f2ec6f30c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8000/health\": dial tcp 10.134.0.45:8000: connect: connection refused"
Apr 16 20:35:05.238363 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:35:05.238319 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="e60b960c-2279-4ca9-b3fb-f22f2ec6f30c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8000/health\": dial tcp 10.134.0.45:8000: connect: connection refused"
Apr 16 20:35:15.238355 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:35:15.238312 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="e60b960c-2279-4ca9-b3fb-f22f2ec6f30c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8000/health\": dial tcp 10.134.0.45:8000: connect: connection refused"
Apr 16 20:35:25.238313 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:35:25.238208 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="e60b960c-2279-4ca9-b3fb-f22f2ec6f30c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8000/health\": dial tcp 10.134.0.45:8000: connect: connection refused"
Apr 16 20:35:35.238537 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:35:35.238434 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="e60b960c-2279-4ca9-b3fb-f22f2ec6f30c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8000/health\": dial tcp 10.134.0.45:8000: connect: connection refused"
Apr 16 20:35:45.238597 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:35:45.238550 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="e60b960c-2279-4ca9-b3fb-f22f2ec6f30c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8000/health\": dial tcp 10.134.0.45:8000: connect: connection refused"
Apr 16 20:35:55.238427 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:35:55.238374 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="e60b960c-2279-4ca9-b3fb-f22f2ec6f30c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8000/health\": dial tcp 10.134.0.45:8000: connect: connection refused"
Apr 16 20:36:05.248404 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:05.248369 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 20:36:05.256045 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:05.256015 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 20:36:12.641912 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:12.641843 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 16 20:36:12.642371 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:12.642140 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="e60b960c-2279-4ca9-b3fb-f22f2ec6f30c" containerName="main" containerID="cri-o://57a0a6cfc1a7bf5934cca5435987ddbcf9e287b9eb4ee42c43c56e984fe3b411" gracePeriod=30
Apr 16 20:36:13.355048 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:13.355020 2572 generic.go:358] "Generic (PLEG): container finished" podID="e60b960c-2279-4ca9-b3fb-f22f2ec6f30c" containerID="57a0a6cfc1a7bf5934cca5435987ddbcf9e287b9eb4ee42c43c56e984fe3b411" exitCode=0
Apr 16 20:36:13.355177 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:13.355090 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"e60b960c-2279-4ca9-b3fb-f22f2ec6f30c","Type":"ContainerDied","Data":"57a0a6cfc1a7bf5934cca5435987ddbcf9e287b9eb4ee42c43c56e984fe3b411"}
Apr 16 20:36:13.487703 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:13.487683 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 20:36:13.620388 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:13.620313 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e60b960c-2279-4ca9-b3fb-f22f2ec6f30c-model-cache\") pod \"e60b960c-2279-4ca9-b3fb-f22f2ec6f30c\" (UID: \"e60b960c-2279-4ca9-b3fb-f22f2ec6f30c\") "
Apr 16 20:36:13.620388 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:13.620367 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e60b960c-2279-4ca9-b3fb-f22f2ec6f30c-tls-certs\") pod \"e60b960c-2279-4ca9-b3fb-f22f2ec6f30c\" (UID: \"e60b960c-2279-4ca9-b3fb-f22f2ec6f30c\") "
Apr 16 20:36:13.620611 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:13.620409 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e60b960c-2279-4ca9-b3fb-f22f2ec6f30c-dshm\") pod \"e60b960c-2279-4ca9-b3fb-f22f2ec6f30c\" (UID: \"e60b960c-2279-4ca9-b3fb-f22f2ec6f30c\") "
Apr 16 20:36:13.620611 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:13.620440 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e60b960c-2279-4ca9-b3fb-f22f2ec6f30c-kserve-provision-location\") pod \"e60b960c-2279-4ca9-b3fb-f22f2ec6f30c\" (UID: \"e60b960c-2279-4ca9-b3fb-f22f2ec6f30c\") "
Apr 16 20:36:13.620611 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:13.620459 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6n5c\" (UniqueName: \"kubernetes.io/projected/e60b960c-2279-4ca9-b3fb-f22f2ec6f30c-kube-api-access-t6n5c\") pod \"e60b960c-2279-4ca9-b3fb-f22f2ec6f30c\" (UID: \"e60b960c-2279-4ca9-b3fb-f22f2ec6f30c\") "
Apr 16 20:36:13.620611 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:13.620490 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e60b960c-2279-4ca9-b3fb-f22f2ec6f30c-home\") pod \"e60b960c-2279-4ca9-b3fb-f22f2ec6f30c\" (UID: \"e60b960c-2279-4ca9-b3fb-f22f2ec6f30c\") "
Apr 16 20:36:13.620611 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:13.620591 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e60b960c-2279-4ca9-b3fb-f22f2ec6f30c-model-cache" (OuterVolumeSpecName: "model-cache") pod "e60b960c-2279-4ca9-b3fb-f22f2ec6f30c" (UID: "e60b960c-2279-4ca9-b3fb-f22f2ec6f30c"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:36:13.620912 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:13.620822 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e60b960c-2279-4ca9-b3fb-f22f2ec6f30c-model-cache\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:36:13.620999 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:13.620963 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e60b960c-2279-4ca9-b3fb-f22f2ec6f30c-home" (OuterVolumeSpecName: "home") pod "e60b960c-2279-4ca9-b3fb-f22f2ec6f30c" (UID: "e60b960c-2279-4ca9-b3fb-f22f2ec6f30c"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:36:13.622742 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:13.622697 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e60b960c-2279-4ca9-b3fb-f22f2ec6f30c-dshm" (OuterVolumeSpecName: "dshm") pod "e60b960c-2279-4ca9-b3fb-f22f2ec6f30c" (UID: "e60b960c-2279-4ca9-b3fb-f22f2ec6f30c"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:36:13.622742 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:13.622703 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e60b960c-2279-4ca9-b3fb-f22f2ec6f30c-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e60b960c-2279-4ca9-b3fb-f22f2ec6f30c" (UID: "e60b960c-2279-4ca9-b3fb-f22f2ec6f30c"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:36:13.622893 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:13.622809 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e60b960c-2279-4ca9-b3fb-f22f2ec6f30c-kube-api-access-t6n5c" (OuterVolumeSpecName: "kube-api-access-t6n5c") pod "e60b960c-2279-4ca9-b3fb-f22f2ec6f30c" (UID: "e60b960c-2279-4ca9-b3fb-f22f2ec6f30c"). InnerVolumeSpecName "kube-api-access-t6n5c". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:36:13.676292 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:13.676229 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e60b960c-2279-4ca9-b3fb-f22f2ec6f30c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e60b960c-2279-4ca9-b3fb-f22f2ec6f30c" (UID: "e60b960c-2279-4ca9-b3fb-f22f2ec6f30c"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:36:13.721401 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:13.721370 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e60b960c-2279-4ca9-b3fb-f22f2ec6f30c-kserve-provision-location\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:36:13.721401 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:13.721396 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t6n5c\" (UniqueName: \"kubernetes.io/projected/e60b960c-2279-4ca9-b3fb-f22f2ec6f30c-kube-api-access-t6n5c\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:36:13.721585 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:13.721408 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e60b960c-2279-4ca9-b3fb-f22f2ec6f30c-home\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:36:13.721585 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:13.721418 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e60b960c-2279-4ca9-b3fb-f22f2ec6f30c-tls-certs\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:36:13.721585 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:13.721427 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e60b960c-2279-4ca9-b3fb-f22f2ec6f30c-dshm\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:36:14.362817 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:14.362787 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:36:14.363017 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:14.362785 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"e60b960c-2279-4ca9-b3fb-f22f2ec6f30c","Type":"ContainerDied","Data":"05d9671c08ca3f2624e432d1ae5a5c696dcf778cbb0dea0cc058ac87946560c8"} Apr 16 20:36:14.363017 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:14.362926 2572 scope.go:117] "RemoveContainer" containerID="57a0a6cfc1a7bf5934cca5435987ddbcf9e287b9eb4ee42c43c56e984fe3b411" Apr 16 20:36:14.384201 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:14.384175 2572 scope.go:117] "RemoveContainer" containerID="80e2458232e32785b982dde33f2700a533ee59b0cf53d4fc943cb88988be456a" Apr 16 20:36:14.384533 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:14.384511 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 16 20:36:14.391068 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:14.391047 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 16 20:36:15.149402 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:15.149369 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e60b960c-2279-4ca9-b3fb-f22f2ec6f30c" path="/var/lib/kubelet/pods/e60b960c-2279-4ca9-b3fb-f22f2ec6f30c/volumes" Apr 16 20:36:23.833395 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:23.833332 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7669d774b6-k9t8t"] Apr 16 20:36:23.833754 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:23.833737 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cda11da4-d358-4808-bde5-06bf2a8d7ad5" 
containerName="storage-initializer" Apr 16 20:36:23.833808 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:23.833756 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="cda11da4-d358-4808-bde5-06bf2a8d7ad5" containerName="storage-initializer" Apr 16 20:36:23.833808 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:23.833765 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1c238870-e428-4fc7-9548-69965d5c1f5c" containerName="storage-initializer" Apr 16 20:36:23.833808 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:23.833770 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c238870-e428-4fc7-9548-69965d5c1f5c" containerName="storage-initializer" Apr 16 20:36:23.833808 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:23.833776 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cda11da4-d358-4808-bde5-06bf2a8d7ad5" containerName="main" Apr 16 20:36:23.833808 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:23.833782 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="cda11da4-d358-4808-bde5-06bf2a8d7ad5" containerName="main" Apr 16 20:36:23.833808 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:23.833799 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1c238870-e428-4fc7-9548-69965d5c1f5c" containerName="main" Apr 16 20:36:23.833808 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:23.833804 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c238870-e428-4fc7-9548-69965d5c1f5c" containerName="main" Apr 16 20:36:23.834039 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:23.833811 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cda11da4-d358-4808-bde5-06bf2a8d7ad5" containerName="llm-d-routing-sidecar" Apr 16 20:36:23.834039 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:23.833816 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="cda11da4-d358-4808-bde5-06bf2a8d7ad5" 
containerName="llm-d-routing-sidecar" Apr 16 20:36:23.834039 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:23.833822 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e60b960c-2279-4ca9-b3fb-f22f2ec6f30c" containerName="storage-initializer" Apr 16 20:36:23.834039 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:23.833827 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="e60b960c-2279-4ca9-b3fb-f22f2ec6f30c" containerName="storage-initializer" Apr 16 20:36:23.834039 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:23.833832 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e60b960c-2279-4ca9-b3fb-f22f2ec6f30c" containerName="main" Apr 16 20:36:23.834039 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:23.833838 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="e60b960c-2279-4ca9-b3fb-f22f2ec6f30c" containerName="main" Apr 16 20:36:23.834039 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:23.833902 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="1c238870-e428-4fc7-9548-69965d5c1f5c" containerName="main" Apr 16 20:36:23.834039 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:23.833911 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="cda11da4-d358-4808-bde5-06bf2a8d7ad5" containerName="llm-d-routing-sidecar" Apr 16 20:36:23.834039 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:23.833920 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="e60b960c-2279-4ca9-b3fb-f22f2ec6f30c" containerName="main" Apr 16 20:36:23.834039 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:23.833928 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="cda11da4-d358-4808-bde5-06bf2a8d7ad5" containerName="main" Apr 16 20:36:23.838537 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:23.838519 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7669d774b6-k9t8t" Apr 16 20:36:23.841541 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:23.841519 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-s7skk\"" Apr 16 20:36:23.841541 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:23.841531 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\"" Apr 16 20:36:23.841733 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:23.841576 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 20:36:23.841733 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:23.841627 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 20:36:23.848339 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:23.848310 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7669d774b6-k9t8t"] Apr 16 20:36:24.017957 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.017920 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxkx8\" (UniqueName: \"kubernetes.io/projected/bfde196e-c8b7-4112-b493-2b077ad9c317-kube-api-access-dxkx8\") pod \"scheduler-inline-config-test-kserve-7669d774b6-k9t8t\" (UID: \"bfde196e-c8b7-4112-b493-2b077ad9c317\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7669d774b6-k9t8t" Apr 16 20:36:24.018137 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.017992 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bfde196e-c8b7-4112-b493-2b077ad9c317-dshm\") pod 
\"scheduler-inline-config-test-kserve-7669d774b6-k9t8t\" (UID: \"bfde196e-c8b7-4112-b493-2b077ad9c317\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7669d774b6-k9t8t" Apr 16 20:36:24.018137 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.018069 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bfde196e-c8b7-4112-b493-2b077ad9c317-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-7669d774b6-k9t8t\" (UID: \"bfde196e-c8b7-4112-b493-2b077ad9c317\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7669d774b6-k9t8t" Apr 16 20:36:24.018137 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.018114 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bfde196e-c8b7-4112-b493-2b077ad9c317-home\") pod \"scheduler-inline-config-test-kserve-7669d774b6-k9t8t\" (UID: \"bfde196e-c8b7-4112-b493-2b077ad9c317\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7669d774b6-k9t8t" Apr 16 20:36:24.018249 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.018137 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bfde196e-c8b7-4112-b493-2b077ad9c317-model-cache\") pod \"scheduler-inline-config-test-kserve-7669d774b6-k9t8t\" (UID: \"bfde196e-c8b7-4112-b493-2b077ad9c317\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7669d774b6-k9t8t" Apr 16 20:36:24.018249 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.018220 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bfde196e-c8b7-4112-b493-2b077ad9c317-tls-certs\") pod \"scheduler-inline-config-test-kserve-7669d774b6-k9t8t\" (UID: 
\"bfde196e-c8b7-4112-b493-2b077ad9c317\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7669d774b6-k9t8t" Apr 16 20:36:24.094535 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.094450 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk"] Apr 16 20:36:24.097994 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.097973 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk" Apr 16 20:36:24.100983 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.100964 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-epp-sa-dockercfg-w984k\"" Apr 16 20:36:24.110002 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.109977 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk"] Apr 16 20:36:24.119456 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.119434 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bfde196e-c8b7-4112-b493-2b077ad9c317-dshm\") pod \"scheduler-inline-config-test-kserve-7669d774b6-k9t8t\" (UID: \"bfde196e-c8b7-4112-b493-2b077ad9c317\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7669d774b6-k9t8t" Apr 16 20:36:24.119538 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.119475 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bfde196e-c8b7-4112-b493-2b077ad9c317-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-7669d774b6-k9t8t\" (UID: \"bfde196e-c8b7-4112-b493-2b077ad9c317\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7669d774b6-k9t8t" Apr 16 
20:36:24.119538 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.119520 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bfde196e-c8b7-4112-b493-2b077ad9c317-home\") pod \"scheduler-inline-config-test-kserve-7669d774b6-k9t8t\" (UID: \"bfde196e-c8b7-4112-b493-2b077ad9c317\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7669d774b6-k9t8t" Apr 16 20:36:24.119654 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.119548 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bfde196e-c8b7-4112-b493-2b077ad9c317-model-cache\") pod \"scheduler-inline-config-test-kserve-7669d774b6-k9t8t\" (UID: \"bfde196e-c8b7-4112-b493-2b077ad9c317\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7669d774b6-k9t8t" Apr 16 20:36:24.119707 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.119685 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bfde196e-c8b7-4112-b493-2b077ad9c317-tls-certs\") pod \"scheduler-inline-config-test-kserve-7669d774b6-k9t8t\" (UID: \"bfde196e-c8b7-4112-b493-2b077ad9c317\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7669d774b6-k9t8t" Apr 16 20:36:24.119782 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.119762 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dxkx8\" (UniqueName: \"kubernetes.io/projected/bfde196e-c8b7-4112-b493-2b077ad9c317-kube-api-access-dxkx8\") pod \"scheduler-inline-config-test-kserve-7669d774b6-k9t8t\" (UID: \"bfde196e-c8b7-4112-b493-2b077ad9c317\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7669d774b6-k9t8t" Apr 16 20:36:24.119942 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.119921 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" 
(UniqueName: \"kubernetes.io/empty-dir/bfde196e-c8b7-4112-b493-2b077ad9c317-model-cache\") pod \"scheduler-inline-config-test-kserve-7669d774b6-k9t8t\" (UID: \"bfde196e-c8b7-4112-b493-2b077ad9c317\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7669d774b6-k9t8t" Apr 16 20:36:24.119942 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.119933 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bfde196e-c8b7-4112-b493-2b077ad9c317-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-7669d774b6-k9t8t\" (UID: \"bfde196e-c8b7-4112-b493-2b077ad9c317\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7669d774b6-k9t8t" Apr 16 20:36:24.120374 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.120348 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bfde196e-c8b7-4112-b493-2b077ad9c317-home\") pod \"scheduler-inline-config-test-kserve-7669d774b6-k9t8t\" (UID: \"bfde196e-c8b7-4112-b493-2b077ad9c317\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7669d774b6-k9t8t" Apr 16 20:36:24.121829 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.121778 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bfde196e-c8b7-4112-b493-2b077ad9c317-dshm\") pod \"scheduler-inline-config-test-kserve-7669d774b6-k9t8t\" (UID: \"bfde196e-c8b7-4112-b493-2b077ad9c317\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7669d774b6-k9t8t" Apr 16 20:36:24.122327 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.122305 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bfde196e-c8b7-4112-b493-2b077ad9c317-tls-certs\") pod \"scheduler-inline-config-test-kserve-7669d774b6-k9t8t\" (UID: \"bfde196e-c8b7-4112-b493-2b077ad9c317\") " 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7669d774b6-k9t8t" Apr 16 20:36:24.128497 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.128473 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxkx8\" (UniqueName: \"kubernetes.io/projected/bfde196e-c8b7-4112-b493-2b077ad9c317-kube-api-access-dxkx8\") pod \"scheduler-inline-config-test-kserve-7669d774b6-k9t8t\" (UID: \"bfde196e-c8b7-4112-b493-2b077ad9c317\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7669d774b6-k9t8t" Apr 16 20:36:24.149694 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.149662 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7669d774b6-k9t8t" Apr 16 20:36:24.220754 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.220718 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/945cc35d-bae8-4073-9278-3992c62bec85-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk\" (UID: \"945cc35d-bae8-4073-9278-3992c62bec85\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk" Apr 16 20:36:24.220943 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.220785 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/945cc35d-bae8-4073-9278-3992c62bec85-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk\" (UID: \"945cc35d-bae8-4073-9278-3992c62bec85\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk" Apr 16 20:36:24.220943 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.220813 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdt68\" (UniqueName: 
\"kubernetes.io/projected/945cc35d-bae8-4073-9278-3992c62bec85-kube-api-access-xdt68\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk\" (UID: \"945cc35d-bae8-4073-9278-3992c62bec85\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk" Apr 16 20:36:24.220943 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.220913 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/945cc35d-bae8-4073-9278-3992c62bec85-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk\" (UID: \"945cc35d-bae8-4073-9278-3992c62bec85\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk" Apr 16 20:36:24.221085 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.220953 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/945cc35d-bae8-4073-9278-3992c62bec85-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk\" (UID: \"945cc35d-bae8-4073-9278-3992c62bec85\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk" Apr 16 20:36:24.221085 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.221048 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/945cc35d-bae8-4073-9278-3992c62bec85-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk\" (UID: \"945cc35d-bae8-4073-9278-3992c62bec85\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk" Apr 16 20:36:24.271408 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.271382 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7669d774b6-k9t8t"] Apr 16 20:36:24.273498 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:36:24.273469 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfde196e_c8b7_4112_b493_2b077ad9c317.slice/crio-0789b1d22d75d0136cb15745a80642ee90c112df709bbd1fda8f720ab5e42b03 WatchSource:0}: Error finding container 0789b1d22d75d0136cb15745a80642ee90c112df709bbd1fda8f720ab5e42b03: Status 404 returned error can't find the container with id 0789b1d22d75d0136cb15745a80642ee90c112df709bbd1fda8f720ab5e42b03 Apr 16 20:36:24.275404 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.275386 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:36:24.322230 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.322203 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/945cc35d-bae8-4073-9278-3992c62bec85-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk\" (UID: \"945cc35d-bae8-4073-9278-3992c62bec85\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk" Apr 16 20:36:24.322346 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.322238 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xdt68\" (UniqueName: \"kubernetes.io/projected/945cc35d-bae8-4073-9278-3992c62bec85-kube-api-access-xdt68\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk\" (UID: \"945cc35d-bae8-4073-9278-3992c62bec85\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk" Apr 16 20:36:24.322346 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.322319 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/945cc35d-bae8-4073-9278-3992c62bec85-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk\" (UID: \"945cc35d-bae8-4073-9278-3992c62bec85\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk" Apr 16 20:36:24.322469 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.322363 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/945cc35d-bae8-4073-9278-3992c62bec85-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk\" (UID: \"945cc35d-bae8-4073-9278-3992c62bec85\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk" Apr 16 20:36:24.322469 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.322418 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/945cc35d-bae8-4073-9278-3992c62bec85-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk\" (UID: \"945cc35d-bae8-4073-9278-3992c62bec85\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk" Apr 16 20:36:24.322571 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.322504 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/945cc35d-bae8-4073-9278-3992c62bec85-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk\" (UID: \"945cc35d-bae8-4073-9278-3992c62bec85\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk" Apr 16 20:36:24.322826 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.322804 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/945cc35d-bae8-4073-9278-3992c62bec85-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk\" (UID: \"945cc35d-bae8-4073-9278-3992c62bec85\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk"
Apr 16 20:36:24.322945 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.322911 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/945cc35d-bae8-4073-9278-3992c62bec85-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk\" (UID: \"945cc35d-bae8-4073-9278-3992c62bec85\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk"
Apr 16 20:36:24.322945 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.322933 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/945cc35d-bae8-4073-9278-3992c62bec85-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk\" (UID: \"945cc35d-bae8-4073-9278-3992c62bec85\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk"
Apr 16 20:36:24.323062 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.323040 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/945cc35d-bae8-4073-9278-3992c62bec85-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk\" (UID: \"945cc35d-bae8-4073-9278-3992c62bec85\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk"
Apr 16 20:36:24.324530 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.324510 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/945cc35d-bae8-4073-9278-3992c62bec85-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk\" (UID: \"945cc35d-bae8-4073-9278-3992c62bec85\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk"
Apr 16 20:36:24.329808 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.329788 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdt68\" (UniqueName: \"kubernetes.io/projected/945cc35d-bae8-4073-9278-3992c62bec85-kube-api-access-xdt68\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk\" (UID: \"945cc35d-bae8-4073-9278-3992c62bec85\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk"
Apr 16 20:36:24.398623 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.398541 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7669d774b6-k9t8t" event={"ID":"bfde196e-c8b7-4112-b493-2b077ad9c317","Type":"ContainerStarted","Data":"263fe29d1089621f7ef3bcbc03234073073d9f91cba4ebf20076f672b45a512c"}
Apr 16 20:36:24.398623 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.398576 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7669d774b6-k9t8t" event={"ID":"bfde196e-c8b7-4112-b493-2b077ad9c317","Type":"ContainerStarted","Data":"0789b1d22d75d0136cb15745a80642ee90c112df709bbd1fda8f720ab5e42b03"}
Apr 16 20:36:24.409667 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.409635 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk"
Apr 16 20:36:24.746293 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:24.746261 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk"]
Apr 16 20:36:24.749341 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:36:24.749310 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod945cc35d_bae8_4073_9278_3992c62bec85.slice/crio-f9c653f2c3b0ea9fc32e2f24cf7d778e0a637c6d6513c1a510d9c821f98dfc9c WatchSource:0}: Error finding container f9c653f2c3b0ea9fc32e2f24cf7d778e0a637c6d6513c1a510d9c821f98dfc9c: Status 404 returned error can't find the container with id f9c653f2c3b0ea9fc32e2f24cf7d778e0a637c6d6513c1a510d9c821f98dfc9c
Apr 16 20:36:25.405175 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:25.405135 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk" event={"ID":"945cc35d-bae8-4073-9278-3992c62bec85","Type":"ContainerStarted","Data":"391f3a8e62232e4f074567ca12264079f72601b432483f7a98130a51280fd8d6"}
Apr 16 20:36:25.405720 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:25.405186 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk" event={"ID":"945cc35d-bae8-4073-9278-3992c62bec85","Type":"ContainerStarted","Data":"f9c653f2c3b0ea9fc32e2f24cf7d778e0a637c6d6513c1a510d9c821f98dfc9c"}
Apr 16 20:36:26.410493 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:26.410455 2572 generic.go:358] "Generic (PLEG): container finished" podID="945cc35d-bae8-4073-9278-3992c62bec85" containerID="391f3a8e62232e4f074567ca12264079f72601b432483f7a98130a51280fd8d6" exitCode=0
Apr 16 20:36:26.410966 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:26.410510 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk" event={"ID":"945cc35d-bae8-4073-9278-3992c62bec85","Type":"ContainerDied","Data":"391f3a8e62232e4f074567ca12264079f72601b432483f7a98130a51280fd8d6"}
Apr 16 20:36:27.416635 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:27.416597 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk" event={"ID":"945cc35d-bae8-4073-9278-3992c62bec85","Type":"ContainerStarted","Data":"092a9aeb11d69a4aff9b98ef6984751a76a65fd30317163e2292f03978b5604d"}
Apr 16 20:36:27.416635 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:27.416642 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk" event={"ID":"945cc35d-bae8-4073-9278-3992c62bec85","Type":"ContainerStarted","Data":"476398eb2031b0af8b933c25f082ffeb52186ef25d18a59c68795472b3d89764"}
Apr 16 20:36:27.417058 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:27.416789 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk"
Apr 16 20:36:27.436854 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:27.436800 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk" podStartSLOduration=3.436786793 podStartE2EDuration="3.436786793s" podCreationTimestamp="2026-04-16 20:36:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:36:27.435579736 +0000 UTC m=+1478.872568243" watchObservedRunningTime="2026-04-16 20:36:27.436786793 +0000 UTC m=+1478.873775301"
Apr 16 20:36:29.424938 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:29.424899 2572 generic.go:358] "Generic (PLEG): container finished" podID="bfde196e-c8b7-4112-b493-2b077ad9c317" containerID="263fe29d1089621f7ef3bcbc03234073073d9f91cba4ebf20076f672b45a512c" exitCode=0
Apr 16 20:36:29.425331 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:29.424962 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7669d774b6-k9t8t" event={"ID":"bfde196e-c8b7-4112-b493-2b077ad9c317","Type":"ContainerDied","Data":"263fe29d1089621f7ef3bcbc03234073073d9f91cba4ebf20076f672b45a512c"}
Apr 16 20:36:30.429954 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:30.429921 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7669d774b6-k9t8t" event={"ID":"bfde196e-c8b7-4112-b493-2b077ad9c317","Type":"ContainerStarted","Data":"351137d1ab613b942d3ef9498d1f1eb313f773eef22c7ae25d2c93c4f2337276"}
Apr 16 20:36:30.448465 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:30.448410 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7669d774b6-k9t8t" podStartSLOduration=7.4483965770000005 podStartE2EDuration="7.448396577s" podCreationTimestamp="2026-04-16 20:36:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:36:30.446404423 +0000 UTC m=+1481.883392933" watchObservedRunningTime="2026-04-16 20:36:30.448396577 +0000 UTC m=+1481.885385084"
Apr 16 20:36:34.150686 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:34.150648 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7669d774b6-k9t8t"
Apr 16 20:36:34.150686 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:34.150693 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7669d774b6-k9t8t"
Apr 16 20:36:34.163054 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:34.163028 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7669d774b6-k9t8t"
Apr 16 20:36:34.410171 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:34.410078 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk"
Apr 16 20:36:34.410171 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:34.410127 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk"
Apr 16 20:36:34.412810 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:34.412787 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk"
Apr 16 20:36:34.446532 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:34.446504 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk"
Apr 16 20:36:34.456632 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:34.456605 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7669d774b6-k9t8t"
Apr 16 20:36:49.194295 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:49.194193 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9gf47_10232a70-9b5f-414a-8efd-b5cff05a4f12/ovn-acl-logging/0.log"
Apr 16 20:36:49.203262 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:36:49.203239 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9gf47_10232a70-9b5f-414a-8efd-b5cff05a4f12/ovn-acl-logging/0.log"
Apr 16 20:37:05.450646 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:05.450569 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk"
Apr 16 20:37:06.991094 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:06.991059 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk"]
Apr 16 20:37:06.991474 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:06.991425 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk" podUID="945cc35d-bae8-4073-9278-3992c62bec85" containerName="main" containerID="cri-o://476398eb2031b0af8b933c25f082ffeb52186ef25d18a59c68795472b3d89764" gracePeriod=30
Apr 16 20:37:06.991589 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:06.991493 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk" podUID="945cc35d-bae8-4073-9278-3992c62bec85" containerName="tokenizer" containerID="cri-o://092a9aeb11d69a4aff9b98ef6984751a76a65fd30317163e2292f03978b5604d" gracePeriod=30
Apr 16 20:37:06.995042 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:06.995016 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7669d774b6-k9t8t"]
Apr 16 20:37:06.995695 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:06.995662 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7669d774b6-k9t8t" podUID="bfde196e-c8b7-4112-b493-2b077ad9c317" containerName="main" containerID="cri-o://351137d1ab613b942d3ef9498d1f1eb313f773eef22c7ae25d2c93c4f2337276" gracePeriod=30
Apr 16 20:37:07.236593 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:07.236569 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7669d774b6-k9t8t"
Apr 16 20:37:07.413311 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:07.413275 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxkx8\" (UniqueName: \"kubernetes.io/projected/bfde196e-c8b7-4112-b493-2b077ad9c317-kube-api-access-dxkx8\") pod \"bfde196e-c8b7-4112-b493-2b077ad9c317\" (UID: \"bfde196e-c8b7-4112-b493-2b077ad9c317\") "
Apr 16 20:37:07.413311 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:07.413317 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bfde196e-c8b7-4112-b493-2b077ad9c317-tls-certs\") pod \"bfde196e-c8b7-4112-b493-2b077ad9c317\" (UID: \"bfde196e-c8b7-4112-b493-2b077ad9c317\") "
Apr 16 20:37:07.413572 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:07.413339 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bfde196e-c8b7-4112-b493-2b077ad9c317-model-cache\") pod \"bfde196e-c8b7-4112-b493-2b077ad9c317\" (UID: \"bfde196e-c8b7-4112-b493-2b077ad9c317\") "
Apr 16 20:37:07.413572 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:07.413371 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bfde196e-c8b7-4112-b493-2b077ad9c317-home\") pod \"bfde196e-c8b7-4112-b493-2b077ad9c317\" (UID: \"bfde196e-c8b7-4112-b493-2b077ad9c317\") "
Apr 16 20:37:07.413572 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:07.413395 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bfde196e-c8b7-4112-b493-2b077ad9c317-dshm\") pod \"bfde196e-c8b7-4112-b493-2b077ad9c317\" (UID: \"bfde196e-c8b7-4112-b493-2b077ad9c317\") "
Apr 16 20:37:07.413572 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:07.413422 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bfde196e-c8b7-4112-b493-2b077ad9c317-kserve-provision-location\") pod \"bfde196e-c8b7-4112-b493-2b077ad9c317\" (UID: \"bfde196e-c8b7-4112-b493-2b077ad9c317\") "
Apr 16 20:37:07.413792 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:07.413588 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfde196e-c8b7-4112-b493-2b077ad9c317-model-cache" (OuterVolumeSpecName: "model-cache") pod "bfde196e-c8b7-4112-b493-2b077ad9c317" (UID: "bfde196e-c8b7-4112-b493-2b077ad9c317"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:37:07.413792 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:07.413615 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfde196e-c8b7-4112-b493-2b077ad9c317-home" (OuterVolumeSpecName: "home") pod "bfde196e-c8b7-4112-b493-2b077ad9c317" (UID: "bfde196e-c8b7-4112-b493-2b077ad9c317"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:37:07.413792 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:07.413736 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bfde196e-c8b7-4112-b493-2b077ad9c317-model-cache\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:37:07.413792 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:07.413754 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bfde196e-c8b7-4112-b493-2b077ad9c317-home\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:37:07.416127 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:07.416092 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfde196e-c8b7-4112-b493-2b077ad9c317-kube-api-access-dxkx8" (OuterVolumeSpecName: "kube-api-access-dxkx8") pod "bfde196e-c8b7-4112-b493-2b077ad9c317" (UID: "bfde196e-c8b7-4112-b493-2b077ad9c317"). InnerVolumeSpecName "kube-api-access-dxkx8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:37:07.416127 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:07.416112 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfde196e-c8b7-4112-b493-2b077ad9c317-dshm" (OuterVolumeSpecName: "dshm") pod "bfde196e-c8b7-4112-b493-2b077ad9c317" (UID: "bfde196e-c8b7-4112-b493-2b077ad9c317"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:37:07.416315 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:07.416187 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfde196e-c8b7-4112-b493-2b077ad9c317-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "bfde196e-c8b7-4112-b493-2b077ad9c317" (UID: "bfde196e-c8b7-4112-b493-2b077ad9c317"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:37:07.480424 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:07.480370 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfde196e-c8b7-4112-b493-2b077ad9c317-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bfde196e-c8b7-4112-b493-2b077ad9c317" (UID: "bfde196e-c8b7-4112-b493-2b077ad9c317"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:37:07.515072 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:07.515031 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dxkx8\" (UniqueName: \"kubernetes.io/projected/bfde196e-c8b7-4112-b493-2b077ad9c317-kube-api-access-dxkx8\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:37:07.515072 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:07.515060 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bfde196e-c8b7-4112-b493-2b077ad9c317-tls-certs\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:37:07.515072 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:07.515072 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bfde196e-c8b7-4112-b493-2b077ad9c317-dshm\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:37:07.515072 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:07.515083 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bfde196e-c8b7-4112-b493-2b077ad9c317-kserve-provision-location\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:37:07.558632 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:07.558594 2572 generic.go:358] "Generic (PLEG): container finished" podID="bfde196e-c8b7-4112-b493-2b077ad9c317" containerID="351137d1ab613b942d3ef9498d1f1eb313f773eef22c7ae25d2c93c4f2337276" exitCode=0
Apr 16 20:37:07.558822 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:07.558671 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7669d774b6-k9t8t" event={"ID":"bfde196e-c8b7-4112-b493-2b077ad9c317","Type":"ContainerDied","Data":"351137d1ab613b942d3ef9498d1f1eb313f773eef22c7ae25d2c93c4f2337276"}
Apr 16 20:37:07.558822 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:07.558688 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7669d774b6-k9t8t"
Apr 16 20:37:07.558822 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:07.558716 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7669d774b6-k9t8t" event={"ID":"bfde196e-c8b7-4112-b493-2b077ad9c317","Type":"ContainerDied","Data":"0789b1d22d75d0136cb15745a80642ee90c112df709bbd1fda8f720ab5e42b03"}
Apr 16 20:37:07.558822 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:07.558738 2572 scope.go:117] "RemoveContainer" containerID="351137d1ab613b942d3ef9498d1f1eb313f773eef22c7ae25d2c93c4f2337276"
Apr 16 20:37:07.560985 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:07.560961 2572 generic.go:358] "Generic (PLEG): container finished" podID="945cc35d-bae8-4073-9278-3992c62bec85" containerID="476398eb2031b0af8b933c25f082ffeb52186ef25d18a59c68795472b3d89764" exitCode=0
Apr 16 20:37:07.561116 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:07.560987 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk" event={"ID":"945cc35d-bae8-4073-9278-3992c62bec85","Type":"ContainerDied","Data":"476398eb2031b0af8b933c25f082ffeb52186ef25d18a59c68795472b3d89764"}
Apr 16 20:37:07.568427 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:07.568413 2572 scope.go:117] "RemoveContainer" containerID="263fe29d1089621f7ef3bcbc03234073073d9f91cba4ebf20076f672b45a512c"
Apr 16 20:37:07.584755 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:07.584724 2572 scope.go:117] "RemoveContainer" containerID="351137d1ab613b942d3ef9498d1f1eb313f773eef22c7ae25d2c93c4f2337276"
Apr 16 20:37:07.585036 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:37:07.585019 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"351137d1ab613b942d3ef9498d1f1eb313f773eef22c7ae25d2c93c4f2337276\": container with ID starting with 351137d1ab613b942d3ef9498d1f1eb313f773eef22c7ae25d2c93c4f2337276 not found: ID does not exist" containerID="351137d1ab613b942d3ef9498d1f1eb313f773eef22c7ae25d2c93c4f2337276"
Apr 16 20:37:07.585109 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:07.585043 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"351137d1ab613b942d3ef9498d1f1eb313f773eef22c7ae25d2c93c4f2337276"} err="failed to get container status \"351137d1ab613b942d3ef9498d1f1eb313f773eef22c7ae25d2c93c4f2337276\": rpc error: code = NotFound desc = could not find container \"351137d1ab613b942d3ef9498d1f1eb313f773eef22c7ae25d2c93c4f2337276\": container with ID starting with 351137d1ab613b942d3ef9498d1f1eb313f773eef22c7ae25d2c93c4f2337276 not found: ID does not exist"
Apr 16 20:37:07.585109 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:07.585064 2572 scope.go:117] "RemoveContainer" containerID="263fe29d1089621f7ef3bcbc03234073073d9f91cba4ebf20076f672b45a512c"
Apr 16 20:37:07.585344 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:37:07.585325 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"263fe29d1089621f7ef3bcbc03234073073d9f91cba4ebf20076f672b45a512c\": container with ID starting with 263fe29d1089621f7ef3bcbc03234073073d9f91cba4ebf20076f672b45a512c not found: ID does not exist" containerID="263fe29d1089621f7ef3bcbc03234073073d9f91cba4ebf20076f672b45a512c"
Apr 16 20:37:07.585396 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:07.585350 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"263fe29d1089621f7ef3bcbc03234073073d9f91cba4ebf20076f672b45a512c"} err="failed to get container status \"263fe29d1089621f7ef3bcbc03234073073d9f91cba4ebf20076f672b45a512c\": rpc error: code = NotFound desc = could not find container \"263fe29d1089621f7ef3bcbc03234073073d9f91cba4ebf20076f672b45a512c\": container with ID starting with 263fe29d1089621f7ef3bcbc03234073073d9f91cba4ebf20076f672b45a512c not found: ID does not exist"
Apr 16 20:37:07.589332 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:07.589309 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7669d774b6-k9t8t"]
Apr 16 20:37:07.593446 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:07.593424 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7669d774b6-k9t8t"]
Apr 16 20:37:08.245932 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:08.245909 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk"
Apr 16 20:37:08.424070 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:08.423989 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/945cc35d-bae8-4073-9278-3992c62bec85-tokenizer-tmp\") pod \"945cc35d-bae8-4073-9278-3992c62bec85\" (UID: \"945cc35d-bae8-4073-9278-3992c62bec85\") "
Apr 16 20:37:08.424070 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:08.424035 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdt68\" (UniqueName: \"kubernetes.io/projected/945cc35d-bae8-4073-9278-3992c62bec85-kube-api-access-xdt68\") pod \"945cc35d-bae8-4073-9278-3992c62bec85\" (UID: \"945cc35d-bae8-4073-9278-3992c62bec85\") "
Apr 16 20:37:08.424070 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:08.424059 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/945cc35d-bae8-4073-9278-3992c62bec85-tokenizer-cache\") pod \"945cc35d-bae8-4073-9278-3992c62bec85\" (UID: \"945cc35d-bae8-4073-9278-3992c62bec85\") "
Apr 16 20:37:08.424333 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:08.424091 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/945cc35d-bae8-4073-9278-3992c62bec85-tokenizer-uds\") pod \"945cc35d-bae8-4073-9278-3992c62bec85\" (UID: \"945cc35d-bae8-4073-9278-3992c62bec85\") "
Apr 16 20:37:08.424333 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:08.424117 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/945cc35d-bae8-4073-9278-3992c62bec85-kserve-provision-location\") pod \"945cc35d-bae8-4073-9278-3992c62bec85\" (UID: \"945cc35d-bae8-4073-9278-3992c62bec85\") "
Apr 16 20:37:08.424333 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:08.424144 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/945cc35d-bae8-4073-9278-3992c62bec85-tls-certs\") pod \"945cc35d-bae8-4073-9278-3992c62bec85\" (UID: \"945cc35d-bae8-4073-9278-3992c62bec85\") "
Apr 16 20:37:08.424482 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:08.424389 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/945cc35d-bae8-4073-9278-3992c62bec85-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "945cc35d-bae8-4073-9278-3992c62bec85" (UID: "945cc35d-bae8-4073-9278-3992c62bec85"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:37:08.424482 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:08.424427 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/945cc35d-bae8-4073-9278-3992c62bec85-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "945cc35d-bae8-4073-9278-3992c62bec85" (UID: "945cc35d-bae8-4073-9278-3992c62bec85"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:37:08.424482 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:08.424436 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/945cc35d-bae8-4073-9278-3992c62bec85-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "945cc35d-bae8-4073-9278-3992c62bec85" (UID: "945cc35d-bae8-4073-9278-3992c62bec85"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:37:08.424645 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:08.424487 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/945cc35d-bae8-4073-9278-3992c62bec85-tokenizer-cache\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:37:08.424960 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:08.424935 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/945cc35d-bae8-4073-9278-3992c62bec85-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "945cc35d-bae8-4073-9278-3992c62bec85" (UID: "945cc35d-bae8-4073-9278-3992c62bec85"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:37:08.426162 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:08.426134 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/945cc35d-bae8-4073-9278-3992c62bec85-kube-api-access-xdt68" (OuterVolumeSpecName: "kube-api-access-xdt68") pod "945cc35d-bae8-4073-9278-3992c62bec85" (UID: "945cc35d-bae8-4073-9278-3992c62bec85"). InnerVolumeSpecName "kube-api-access-xdt68". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:37:08.426393 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:08.426371 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/945cc35d-bae8-4073-9278-3992c62bec85-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "945cc35d-bae8-4073-9278-3992c62bec85" (UID: "945cc35d-bae8-4073-9278-3992c62bec85"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:37:08.525811 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:08.525787 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/945cc35d-bae8-4073-9278-3992c62bec85-tokenizer-tmp\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:37:08.525811 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:08.525810 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xdt68\" (UniqueName: \"kubernetes.io/projected/945cc35d-bae8-4073-9278-3992c62bec85-kube-api-access-xdt68\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:37:08.525972 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:08.525821 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/945cc35d-bae8-4073-9278-3992c62bec85-tokenizer-uds\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:37:08.525972 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:08.525830 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/945cc35d-bae8-4073-9278-3992c62bec85-kserve-provision-location\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:37:08.525972 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:08.525840 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/945cc35d-bae8-4073-9278-3992c62bec85-tls-certs\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\""
Apr 16 20:37:08.566978 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:08.566952 2572 generic.go:358] "Generic (PLEG): container finished" podID="945cc35d-bae8-4073-9278-3992c62bec85" containerID="092a9aeb11d69a4aff9b98ef6984751a76a65fd30317163e2292f03978b5604d" exitCode=0
Apr 16 20:37:08.567088 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:08.567030 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk"
Apr 16 20:37:08.567088 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:08.567031 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk" event={"ID":"945cc35d-bae8-4073-9278-3992c62bec85","Type":"ContainerDied","Data":"092a9aeb11d69a4aff9b98ef6984751a76a65fd30317163e2292f03978b5604d"}
Apr 16 20:37:08.567088 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:08.567068 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk" event={"ID":"945cc35d-bae8-4073-9278-3992c62bec85","Type":"ContainerDied","Data":"f9c653f2c3b0ea9fc32e2f24cf7d778e0a637c6d6513c1a510d9c821f98dfc9c"}
Apr 16 20:37:08.567088 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:08.567085 2572 scope.go:117] "RemoveContainer" containerID="092a9aeb11d69a4aff9b98ef6984751a76a65fd30317163e2292f03978b5604d"
Apr 16 20:37:08.575812 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:08.575795 2572 scope.go:117] "RemoveContainer" containerID="476398eb2031b0af8b933c25f082ffeb52186ef25d18a59c68795472b3d89764"
Apr 16 20:37:08.582914 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:08.582897 2572 scope.go:117] "RemoveContainer" containerID="391f3a8e62232e4f074567ca12264079f72601b432483f7a98130a51280fd8d6"
Apr 16 20:37:08.590222 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:08.590201 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk"]
Apr 16 20:37:08.590358 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:08.590336 2572 scope.go:117] "RemoveContainer" containerID="092a9aeb11d69a4aff9b98ef6984751a76a65fd30317163e2292f03978b5604d"
Apr 16 20:37:08.590594 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:37:08.590570 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"092a9aeb11d69a4aff9b98ef6984751a76a65fd30317163e2292f03978b5604d\": container with ID starting with 092a9aeb11d69a4aff9b98ef6984751a76a65fd30317163e2292f03978b5604d not found: ID does not exist" containerID="092a9aeb11d69a4aff9b98ef6984751a76a65fd30317163e2292f03978b5604d"
Apr 16 20:37:08.590655 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:08.590603 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"092a9aeb11d69a4aff9b98ef6984751a76a65fd30317163e2292f03978b5604d"} err="failed to get container status \"092a9aeb11d69a4aff9b98ef6984751a76a65fd30317163e2292f03978b5604d\": rpc error: code = NotFound desc = could not find container \"092a9aeb11d69a4aff9b98ef6984751a76a65fd30317163e2292f03978b5604d\": container with ID starting with 092a9aeb11d69a4aff9b98ef6984751a76a65fd30317163e2292f03978b5604d not found: ID does not exist"
Apr 16 20:37:08.590655 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:08.590626 2572 scope.go:117] "RemoveContainer" containerID="476398eb2031b0af8b933c25f082ffeb52186ef25d18a59c68795472b3d89764"
Apr 16 20:37:08.590936 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:37:08.590916 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"476398eb2031b0af8b933c25f082ffeb52186ef25d18a59c68795472b3d89764\": container with ID starting with 476398eb2031b0af8b933c25f082ffeb52186ef25d18a59c68795472b3d89764 not found: ID does not exist" containerID="476398eb2031b0af8b933c25f082ffeb52186ef25d18a59c68795472b3d89764"
Apr 16 20:37:08.590985 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:08.590942 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"476398eb2031b0af8b933c25f082ffeb52186ef25d18a59c68795472b3d89764"} err="failed to get container status \"476398eb2031b0af8b933c25f082ffeb52186ef25d18a59c68795472b3d89764\": rpc error: code = NotFound desc = could not find container \"476398eb2031b0af8b933c25f082ffeb52186ef25d18a59c68795472b3d89764\": container with ID starting with 476398eb2031b0af8b933c25f082ffeb52186ef25d18a59c68795472b3d89764 not found: ID does not exist"
Apr 16 20:37:08.590985 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:08.590957 2572 scope.go:117] "RemoveContainer" containerID="391f3a8e62232e4f074567ca12264079f72601b432483f7a98130a51280fd8d6"
Apr 16 20:37:08.591193 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:37:08.591172 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"391f3a8e62232e4f074567ca12264079f72601b432483f7a98130a51280fd8d6\": container with ID starting with 391f3a8e62232e4f074567ca12264079f72601b432483f7a98130a51280fd8d6 not found: ID does not exist" containerID="391f3a8e62232e4f074567ca12264079f72601b432483f7a98130a51280fd8d6"
Apr 16 20:37:08.591246 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:08.591198 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"391f3a8e62232e4f074567ca12264079f72601b432483f7a98130a51280fd8d6"} err="failed to get container status \"391f3a8e62232e4f074567ca12264079f72601b432483f7a98130a51280fd8d6\": rpc error: code = NotFound desc = could not find container \"391f3a8e62232e4f074567ca12264079f72601b432483f7a98130a51280fd8d6\": container with ID starting with 391f3a8e62232e4f074567ca12264079f72601b432483f7a98130a51280fd8d6 not found: ID does not exist"
Apr 16 20:37:08.597581 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:08.597560 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ddc4n7tvk"]
Apr 16 20:37:09.149997 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:09.149959 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="945cc35d-bae8-4073-9278-3992c62bec85" path="/var/lib/kubelet/pods/945cc35d-bae8-4073-9278-3992c62bec85/volumes"
Apr 16 20:37:09.150459 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:09.150446 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfde196e-c8b7-4112-b493-2b077ad9c317" path="/var/lib/kubelet/pods/bfde196e-c8b7-4112-b493-2b077ad9c317/volumes"
Apr 16 20:37:20.888511 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:20.888476 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wl5qf"]
Apr 16 20:37:20.889159 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:20.888973 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="945cc35d-bae8-4073-9278-3992c62bec85" containerName="tokenizer"
Apr 16 20:37:20.889159 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:20.888994 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="945cc35d-bae8-4073-9278-3992c62bec85" containerName="tokenizer"
Apr 16 20:37:20.889159 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:20.889010 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bfde196e-c8b7-4112-b493-2b077ad9c317" containerName="storage-initializer"
Apr 16 20:37:20.889159 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:20.889020 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfde196e-c8b7-4112-b493-2b077ad9c317" containerName="storage-initializer"
Apr 16 20:37:20.889159 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:20.889031 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="945cc35d-bae8-4073-9278-3992c62bec85" containerName="storage-initializer"
Apr 16 20:37:20.889159 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:20.889037 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="945cc35d-bae8-4073-9278-3992c62bec85" containerName="storage-initializer"
Apr 16 20:37:20.889159 ip-10-0-138-62
kubenswrapper[2572]: I0416 20:37:20.889050 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="945cc35d-bae8-4073-9278-3992c62bec85" containerName="main" Apr 16 20:37:20.889159 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:20.889058 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="945cc35d-bae8-4073-9278-3992c62bec85" containerName="main" Apr 16 20:37:20.889159 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:20.889093 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bfde196e-c8b7-4112-b493-2b077ad9c317" containerName="main" Apr 16 20:37:20.889159 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:20.889101 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfde196e-c8b7-4112-b493-2b077ad9c317" containerName="main" Apr 16 20:37:20.889553 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:20.889177 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="945cc35d-bae8-4073-9278-3992c62bec85" containerName="tokenizer" Apr 16 20:37:20.889553 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:20.889190 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="bfde196e-c8b7-4112-b493-2b077ad9c317" containerName="main" Apr 16 20:37:20.889553 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:20.889201 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="945cc35d-bae8-4073-9278-3992c62bec85" containerName="main" Apr 16 20:37:20.892290 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:20.892266 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wl5qf" Apr 16 20:37:20.895691 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:20.895666 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\"" Apr 16 20:37:20.895831 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:20.895666 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-2-openshift-default-dockercfg-qkhsk\"" Apr 16 20:37:20.895831 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:20.895734 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 20:37:20.896177 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:20.896161 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 20:37:20.926891 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:20.925080 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wl5qf"] Apr 16 20:37:20.930445 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:20.930411 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/2b96d7b9-1051-445b-b781-74ad3279c03a-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-wl5qf\" (UID: \"2b96d7b9-1051-445b-b781-74ad3279c03a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wl5qf" Apr 16 20:37:20.930568 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:20.930465 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/2b96d7b9-1051-445b-b781-74ad3279c03a-istio-podinfo\") pod 
\"router-gateway-2-openshift-default-6866b85949-wl5qf\" (UID: \"2b96d7b9-1051-445b-b781-74ad3279c03a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wl5qf" Apr 16 20:37:20.930568 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:20.930517 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/2b96d7b9-1051-445b-b781-74ad3279c03a-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-wl5qf\" (UID: \"2b96d7b9-1051-445b-b781-74ad3279c03a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wl5qf" Apr 16 20:37:20.930568 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:20.930552 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/2b96d7b9-1051-445b-b781-74ad3279c03a-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-wl5qf\" (UID: \"2b96d7b9-1051-445b-b781-74ad3279c03a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wl5qf" Apr 16 20:37:20.930724 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:20.930576 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/2b96d7b9-1051-445b-b781-74ad3279c03a-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-wl5qf\" (UID: \"2b96d7b9-1051-445b-b781-74ad3279c03a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wl5qf" Apr 16 20:37:20.930724 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:20.930604 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/2b96d7b9-1051-445b-b781-74ad3279c03a-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-wl5qf\" (UID: 
\"2b96d7b9-1051-445b-b781-74ad3279c03a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wl5qf" Apr 16 20:37:20.930794 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:20.930714 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/2b96d7b9-1051-445b-b781-74ad3279c03a-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-wl5qf\" (UID: \"2b96d7b9-1051-445b-b781-74ad3279c03a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wl5qf" Apr 16 20:37:20.930794 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:20.930780 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/2b96d7b9-1051-445b-b781-74ad3279c03a-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-wl5qf\" (UID: \"2b96d7b9-1051-445b-b781-74ad3279c03a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wl5qf" Apr 16 20:37:20.930866 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:20.930812 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97w2c\" (UniqueName: \"kubernetes.io/projected/2b96d7b9-1051-445b-b781-74ad3279c03a-kube-api-access-97w2c\") pod \"router-gateway-2-openshift-default-6866b85949-wl5qf\" (UID: \"2b96d7b9-1051-445b-b781-74ad3279c03a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wl5qf" Apr 16 20:37:21.032179 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:21.032143 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/2b96d7b9-1051-445b-b781-74ad3279c03a-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-wl5qf\" (UID: \"2b96d7b9-1051-445b-b781-74ad3279c03a\") " 
pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wl5qf" Apr 16 20:37:21.032179 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:21.032182 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-97w2c\" (UniqueName: \"kubernetes.io/projected/2b96d7b9-1051-445b-b781-74ad3279c03a-kube-api-access-97w2c\") pod \"router-gateway-2-openshift-default-6866b85949-wl5qf\" (UID: \"2b96d7b9-1051-445b-b781-74ad3279c03a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wl5qf" Apr 16 20:37:21.032436 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:21.032212 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/2b96d7b9-1051-445b-b781-74ad3279c03a-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-wl5qf\" (UID: \"2b96d7b9-1051-445b-b781-74ad3279c03a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wl5qf" Apr 16 20:37:21.032436 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:21.032247 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/2b96d7b9-1051-445b-b781-74ad3279c03a-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-wl5qf\" (UID: \"2b96d7b9-1051-445b-b781-74ad3279c03a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wl5qf" Apr 16 20:37:21.032436 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:21.032294 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/2b96d7b9-1051-445b-b781-74ad3279c03a-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-wl5qf\" (UID: \"2b96d7b9-1051-445b-b781-74ad3279c03a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wl5qf" Apr 16 20:37:21.032436 ip-10-0-138-62 kubenswrapper[2572]: 
I0416 20:37:21.032322 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/2b96d7b9-1051-445b-b781-74ad3279c03a-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-wl5qf\" (UID: \"2b96d7b9-1051-445b-b781-74ad3279c03a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wl5qf" Apr 16 20:37:21.032436 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:21.032393 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/2b96d7b9-1051-445b-b781-74ad3279c03a-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-wl5qf\" (UID: \"2b96d7b9-1051-445b-b781-74ad3279c03a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wl5qf" Apr 16 20:37:21.032706 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:21.032444 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/2b96d7b9-1051-445b-b781-74ad3279c03a-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-wl5qf\" (UID: \"2b96d7b9-1051-445b-b781-74ad3279c03a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wl5qf" Apr 16 20:37:21.032706 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:21.032521 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/2b96d7b9-1051-445b-b781-74ad3279c03a-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-wl5qf\" (UID: \"2b96d7b9-1051-445b-b781-74ad3279c03a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wl5qf" Apr 16 20:37:21.032706 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:21.032596 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/2b96d7b9-1051-445b-b781-74ad3279c03a-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-wl5qf\" (UID: \"2b96d7b9-1051-445b-b781-74ad3279c03a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wl5qf" Apr 16 20:37:21.032706 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:21.032610 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/2b96d7b9-1051-445b-b781-74ad3279c03a-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-wl5qf\" (UID: \"2b96d7b9-1051-445b-b781-74ad3279c03a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wl5qf" Apr 16 20:37:21.032845 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:21.032730 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/2b96d7b9-1051-445b-b781-74ad3279c03a-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-wl5qf\" (UID: \"2b96d7b9-1051-445b-b781-74ad3279c03a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wl5qf" Apr 16 20:37:21.032930 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:21.032853 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/2b96d7b9-1051-445b-b781-74ad3279c03a-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-wl5qf\" (UID: \"2b96d7b9-1051-445b-b781-74ad3279c03a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wl5qf" Apr 16 20:37:21.033174 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:21.033157 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/2b96d7b9-1051-445b-b781-74ad3279c03a-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-wl5qf\" (UID: 
\"2b96d7b9-1051-445b-b781-74ad3279c03a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wl5qf" Apr 16 20:37:21.034707 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:21.034681 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/2b96d7b9-1051-445b-b781-74ad3279c03a-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-wl5qf\" (UID: \"2b96d7b9-1051-445b-b781-74ad3279c03a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wl5qf" Apr 16 20:37:21.034810 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:21.034692 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/2b96d7b9-1051-445b-b781-74ad3279c03a-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-wl5qf\" (UID: \"2b96d7b9-1051-445b-b781-74ad3279c03a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wl5qf" Apr 16 20:37:21.040049 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:21.040017 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/2b96d7b9-1051-445b-b781-74ad3279c03a-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-wl5qf\" (UID: \"2b96d7b9-1051-445b-b781-74ad3279c03a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wl5qf" Apr 16 20:37:21.040221 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:21.040205 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-97w2c\" (UniqueName: \"kubernetes.io/projected/2b96d7b9-1051-445b-b781-74ad3279c03a-kube-api-access-97w2c\") pod \"router-gateway-2-openshift-default-6866b85949-wl5qf\" (UID: \"2b96d7b9-1051-445b-b781-74ad3279c03a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wl5qf" Apr 16 20:37:21.221449 ip-10-0-138-62 kubenswrapper[2572]: 
I0416 20:37:21.221358 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wl5qf" Apr 16 20:37:21.557390 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:21.557363 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wl5qf"] Apr 16 20:37:21.559830 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:37:21.559794 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b96d7b9_1051_445b_b781_74ad3279c03a.slice/crio-a9f59a355f274bf37c0e26c0f9e49c1d6d59701df5b2371dee17ed908a37e4c2 WatchSource:0}: Error finding container a9f59a355f274bf37c0e26c0f9e49c1d6d59701df5b2371dee17ed908a37e4c2: Status 404 returned error can't find the container with id a9f59a355f274bf37c0e26c0f9e49c1d6d59701df5b2371dee17ed908a37e4c2 Apr 16 20:37:21.611308 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:21.611272 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wl5qf" event={"ID":"2b96d7b9-1051-445b-b781-74ad3279c03a","Type":"ContainerStarted","Data":"a9f59a355f274bf37c0e26c0f9e49c1d6d59701df5b2371dee17ed908a37e4c2"} Apr 16 20:37:24.722983 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:24.722932 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 20:37:24.723268 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:24.723012 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 20:37:24.723268 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:24.723063 2572 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 20:37:25.629750 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:25.629716 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wl5qf" event={"ID":"2b96d7b9-1051-445b-b781-74ad3279c03a","Type":"ContainerStarted","Data":"223c7ee587abfe600797c54f707667c2f6f9bc63f019c7482d4cf1600ef37f38"} Apr 16 20:37:25.649656 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:25.648915 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wl5qf" podStartSLOduration=2.487887412 podStartE2EDuration="5.648898035s" podCreationTimestamp="2026-04-16 20:37:20 +0000 UTC" firstStartedPulling="2026-04-16 20:37:21.561658152 +0000 UTC m=+1532.998646637" lastFinishedPulling="2026-04-16 20:37:24.722668772 +0000 UTC m=+1536.159657260" observedRunningTime="2026-04-16 20:37:25.646525986 +0000 UTC m=+1537.083514494" watchObservedRunningTime="2026-04-16 20:37:25.648898035 +0000 UTC m=+1537.085886543" Apr 16 20:37:26.222072 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:26.222034 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wl5qf" Apr 16 20:37:26.223647 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:26.223615 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wl5qf" podUID="2b96d7b9-1051-445b-b781-74ad3279c03a" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.134.0.48:15021/healthz/ready\": dial tcp 10.134.0.48:15021: connect: connection refused" Apr 16 20:37:27.221796 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:27.221752 2572 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wl5qf" podUID="2b96d7b9-1051-445b-b781-74ad3279c03a" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.134.0.48:15021/healthz/ready\": dial tcp 10.134.0.48:15021: connect: connection refused" Apr 16 20:37:28.226335 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:28.226300 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wl5qf" Apr 16 20:37:28.638514 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:28.638483 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wl5qf" Apr 16 20:37:28.639352 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:28.639327 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wl5qf" Apr 16 20:37:42.649292 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.649253 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb"] Apr 16 20:37:42.653064 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.653042 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" Apr 16 20:37:42.657112 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.657085 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-dockercfg-cqtdx\"" Apr 16 20:37:42.657112 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.657099 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 16 20:37:42.657339 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.657168 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-s7skk\"" Apr 16 20:37:42.658643 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.658604 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h"] Apr 16 20:37:42.662114 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.662098 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" Apr 16 20:37:42.667287 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.667264 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb"] Apr 16 20:37:42.672794 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.672767 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h"] Apr 16 20:37:42.732005 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.731972 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/87f0afae-0f58-43cc-ae75-2f48461c5521-home\") pod \"router-with-refs-pd-test-kserve-64dd4fd657-4vfjb\" (UID: \"87f0afae-0f58-43cc-ae75-2f48461c5521\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" Apr 16 20:37:42.732005 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.732016 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87f0afae-0f58-43cc-ae75-2f48461c5521-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-64dd4fd657-4vfjb\" (UID: \"87f0afae-0f58-43cc-ae75-2f48461c5521\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" Apr 16 20:37:42.732247 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.732040 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/87f0afae-0f58-43cc-ae75-2f48461c5521-model-cache\") pod \"router-with-refs-pd-test-kserve-64dd4fd657-4vfjb\" (UID: \"87f0afae-0f58-43cc-ae75-2f48461c5521\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" Apr 16 20:37:42.732247 ip-10-0-138-62 
kubenswrapper[2572]: I0416 20:37:42.732160 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/02b9605c-659b-4584-bb72-da8846082a5e-home\") pod \"router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h\" (UID: \"02b9605c-659b-4584-bb72-da8846082a5e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" Apr 16 20:37:42.732247 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.732205 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/02b9605c-659b-4584-bb72-da8846082a5e-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h\" (UID: \"02b9605c-659b-4584-bb72-da8846082a5e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" Apr 16 20:37:42.732247 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.732233 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/87f0afae-0f58-43cc-ae75-2f48461c5521-tls-certs\") pod \"router-with-refs-pd-test-kserve-64dd4fd657-4vfjb\" (UID: \"87f0afae-0f58-43cc-ae75-2f48461c5521\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" Apr 16 20:37:42.732440 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.732269 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/02b9605c-659b-4584-bb72-da8846082a5e-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h\" (UID: \"02b9605c-659b-4584-bb72-da8846082a5e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" Apr 16 20:37:42.732440 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.732337 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/02b9605c-659b-4584-bb72-da8846082a5e-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h\" (UID: \"02b9605c-659b-4584-bb72-da8846082a5e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" Apr 16 20:37:42.732440 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.732362 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/87f0afae-0f58-43cc-ae75-2f48461c5521-dshm\") pod \"router-with-refs-pd-test-kserve-64dd4fd657-4vfjb\" (UID: \"87f0afae-0f58-43cc-ae75-2f48461c5521\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" Apr 16 20:37:42.732440 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.732406 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/02b9605c-659b-4584-bb72-da8846082a5e-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h\" (UID: \"02b9605c-659b-4584-bb72-da8846082a5e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" Apr 16 20:37:42.732587 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.732438 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz82d\" (UniqueName: \"kubernetes.io/projected/87f0afae-0f58-43cc-ae75-2f48461c5521-kube-api-access-kz82d\") pod \"router-with-refs-pd-test-kserve-64dd4fd657-4vfjb\" (UID: \"87f0afae-0f58-43cc-ae75-2f48461c5521\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" Apr 16 20:37:42.732587 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.732464 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-v9pwh\" (UniqueName: \"kubernetes.io/projected/02b9605c-659b-4584-bb72-da8846082a5e-kube-api-access-v9pwh\") pod \"router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h\" (UID: \"02b9605c-659b-4584-bb72-da8846082a5e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" Apr 16 20:37:42.833026 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.832990 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/02b9605c-659b-4584-bb72-da8846082a5e-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h\" (UID: \"02b9605c-659b-4584-bb72-da8846082a5e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" Apr 16 20:37:42.833026 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.833038 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kz82d\" (UniqueName: \"kubernetes.io/projected/87f0afae-0f58-43cc-ae75-2f48461c5521-kube-api-access-kz82d\") pod \"router-with-refs-pd-test-kserve-64dd4fd657-4vfjb\" (UID: \"87f0afae-0f58-43cc-ae75-2f48461c5521\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" Apr 16 20:37:42.833277 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.833144 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v9pwh\" (UniqueName: \"kubernetes.io/projected/02b9605c-659b-4584-bb72-da8846082a5e-kube-api-access-v9pwh\") pod \"router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h\" (UID: \"02b9605c-659b-4584-bb72-da8846082a5e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" Apr 16 20:37:42.833277 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.833215 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/87f0afae-0f58-43cc-ae75-2f48461c5521-home\") pod \"router-with-refs-pd-test-kserve-64dd4fd657-4vfjb\" (UID: \"87f0afae-0f58-43cc-ae75-2f48461c5521\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" Apr 16 20:37:42.833277 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.833255 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87f0afae-0f58-43cc-ae75-2f48461c5521-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-64dd4fd657-4vfjb\" (UID: \"87f0afae-0f58-43cc-ae75-2f48461c5521\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" Apr 16 20:37:42.833468 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.833294 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/87f0afae-0f58-43cc-ae75-2f48461c5521-model-cache\") pod \"router-with-refs-pd-test-kserve-64dd4fd657-4vfjb\" (UID: \"87f0afae-0f58-43cc-ae75-2f48461c5521\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" Apr 16 20:37:42.833536 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.833500 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/02b9605c-659b-4584-bb72-da8846082a5e-home\") pod \"router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h\" (UID: \"02b9605c-659b-4584-bb72-da8846082a5e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" Apr 16 20:37:42.833619 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.833564 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/02b9605c-659b-4584-bb72-da8846082a5e-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h\" (UID: \"02b9605c-659b-4584-bb72-da8846082a5e\") 
" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" Apr 16 20:37:42.833619 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.833603 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/87f0afae-0f58-43cc-ae75-2f48461c5521-tls-certs\") pod \"router-with-refs-pd-test-kserve-64dd4fd657-4vfjb\" (UID: \"87f0afae-0f58-43cc-ae75-2f48461c5521\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" Apr 16 20:37:42.833754 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.833641 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87f0afae-0f58-43cc-ae75-2f48461c5521-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-64dd4fd657-4vfjb\" (UID: \"87f0afae-0f58-43cc-ae75-2f48461c5521\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" Apr 16 20:37:42.833754 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.833665 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/02b9605c-659b-4584-bb72-da8846082a5e-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h\" (UID: \"02b9605c-659b-4584-bb72-da8846082a5e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" Apr 16 20:37:42.833754 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.833705 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/02b9605c-659b-4584-bb72-da8846082a5e-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h\" (UID: \"02b9605c-659b-4584-bb72-da8846082a5e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" Apr 16 20:37:42.833754 ip-10-0-138-62 
kubenswrapper[2572]: I0416 20:37:42.833726 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/87f0afae-0f58-43cc-ae75-2f48461c5521-home\") pod \"router-with-refs-pd-test-kserve-64dd4fd657-4vfjb\" (UID: \"87f0afae-0f58-43cc-ae75-2f48461c5521\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" Apr 16 20:37:42.834030 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.833729 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/87f0afae-0f58-43cc-ae75-2f48461c5521-dshm\") pod \"router-with-refs-pd-test-kserve-64dd4fd657-4vfjb\" (UID: \"87f0afae-0f58-43cc-ae75-2f48461c5521\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" Apr 16 20:37:42.834030 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.833766 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/87f0afae-0f58-43cc-ae75-2f48461c5521-model-cache\") pod \"router-with-refs-pd-test-kserve-64dd4fd657-4vfjb\" (UID: \"87f0afae-0f58-43cc-ae75-2f48461c5521\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" Apr 16 20:37:42.834030 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.833920 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/02b9605c-659b-4584-bb72-da8846082a5e-home\") pod \"router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h\" (UID: \"02b9605c-659b-4584-bb72-da8846082a5e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" Apr 16 20:37:42.834194 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.834130 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/02b9605c-659b-4584-bb72-da8846082a5e-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h\" (UID: \"02b9605c-659b-4584-bb72-da8846082a5e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" Apr 16 20:37:42.834194 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.834184 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/02b9605c-659b-4584-bb72-da8846082a5e-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h\" (UID: \"02b9605c-659b-4584-bb72-da8846082a5e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" Apr 16 20:37:42.836230 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.836204 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/87f0afae-0f58-43cc-ae75-2f48461c5521-dshm\") pod \"router-with-refs-pd-test-kserve-64dd4fd657-4vfjb\" (UID: \"87f0afae-0f58-43cc-ae75-2f48461c5521\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" Apr 16 20:37:42.836375 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.836215 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/02b9605c-659b-4584-bb72-da8846082a5e-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h\" (UID: \"02b9605c-659b-4584-bb72-da8846082a5e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" Apr 16 20:37:42.836375 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.836318 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/02b9605c-659b-4584-bb72-da8846082a5e-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h\" (UID: \"02b9605c-659b-4584-bb72-da8846082a5e\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" Apr 16 20:37:42.836572 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.836450 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/87f0afae-0f58-43cc-ae75-2f48461c5521-tls-certs\") pod \"router-with-refs-pd-test-kserve-64dd4fd657-4vfjb\" (UID: \"87f0afae-0f58-43cc-ae75-2f48461c5521\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" Apr 16 20:37:42.842444 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.842422 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9pwh\" (UniqueName: \"kubernetes.io/projected/02b9605c-659b-4584-bb72-da8846082a5e-kube-api-access-v9pwh\") pod \"router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h\" (UID: \"02b9605c-659b-4584-bb72-da8846082a5e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" Apr 16 20:37:42.842577 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.842558 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz82d\" (UniqueName: \"kubernetes.io/projected/87f0afae-0f58-43cc-ae75-2f48461c5521-kube-api-access-kz82d\") pod \"router-with-refs-pd-test-kserve-64dd4fd657-4vfjb\" (UID: \"87f0afae-0f58-43cc-ae75-2f48461c5521\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" Apr 16 20:37:42.962937 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.962829 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" Apr 16 20:37:42.974977 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:42.974951 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" Apr 16 20:37:43.106139 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:43.106084 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb"] Apr 16 20:37:43.108294 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:37:43.108264 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87f0afae_0f58_43cc_ae75_2f48461c5521.slice/crio-5f4139eaa8fd6eec4e39111e3d57ebe3b08e43ca0633839cbec4ed049fb495f2 WatchSource:0}: Error finding container 5f4139eaa8fd6eec4e39111e3d57ebe3b08e43ca0633839cbec4ed049fb495f2: Status 404 returned error can't find the container with id 5f4139eaa8fd6eec4e39111e3d57ebe3b08e43ca0633839cbec4ed049fb495f2 Apr 16 20:37:43.133301 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:43.129106 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h"] Apr 16 20:37:43.688937 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:43.688901 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" event={"ID":"87f0afae-0f58-43cc-ae75-2f48461c5521","Type":"ContainerStarted","Data":"b9b4a3c3512fb5a4c1cbb1bbdd93ebe33b2d2e24514ff5e8f9e07594cd43e0c8"} Apr 16 20:37:43.689394 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:43.688944 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" event={"ID":"87f0afae-0f58-43cc-ae75-2f48461c5521","Type":"ContainerStarted","Data":"5f4139eaa8fd6eec4e39111e3d57ebe3b08e43ca0633839cbec4ed049fb495f2"} Apr 16 20:37:43.689394 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:43.689012 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" Apr 16 20:37:43.690488 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:43.690461 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" event={"ID":"02b9605c-659b-4584-bb72-da8846082a5e","Type":"ContainerStarted","Data":"0f4799e9ecd8ed9b637c5d343b6cb5bfeb46ba4050733cb246ee9ef156817446"} Apr 16 20:37:43.690583 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:43.690490 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" event={"ID":"02b9605c-659b-4584-bb72-da8846082a5e","Type":"ContainerStarted","Data":"d3e9e7358679430cae0c5972a34cbfd398451dfeb465a92d28fedd31c2b26fea"} Apr 16 20:37:44.699626 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:44.699587 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" event={"ID":"87f0afae-0f58-43cc-ae75-2f48461c5521","Type":"ContainerStarted","Data":"64d7b4d2991b068ff2db5c0a6abb42e65ae379ec7df94af515158fdb84891af4"} Apr 16 20:37:47.712630 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:47.712595 2572 generic.go:358] "Generic (PLEG): container finished" podID="02b9605c-659b-4584-bb72-da8846082a5e" containerID="0f4799e9ecd8ed9b637c5d343b6cb5bfeb46ba4050733cb246ee9ef156817446" exitCode=0 Apr 16 20:37:47.713015 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:47.712641 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" event={"ID":"02b9605c-659b-4584-bb72-da8846082a5e","Type":"ContainerDied","Data":"0f4799e9ecd8ed9b637c5d343b6cb5bfeb46ba4050733cb246ee9ef156817446"} Apr 16 20:37:48.717844 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:48.717805 2572 generic.go:358] "Generic (PLEG): container finished" 
podID="87f0afae-0f58-43cc-ae75-2f48461c5521" containerID="64d7b4d2991b068ff2db5c0a6abb42e65ae379ec7df94af515158fdb84891af4" exitCode=0 Apr 16 20:37:48.718428 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:48.717888 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" event={"ID":"87f0afae-0f58-43cc-ae75-2f48461c5521","Type":"ContainerDied","Data":"64d7b4d2991b068ff2db5c0a6abb42e65ae379ec7df94af515158fdb84891af4"} Apr 16 20:37:48.719801 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:48.719771 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" event={"ID":"02b9605c-659b-4584-bb72-da8846082a5e","Type":"ContainerStarted","Data":"cc5aaa67cf2e651faadbc89b1532ac4c6e598dfe74c892dcc3cf914d6e8d06c2"} Apr 16 20:37:48.755890 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:48.755813 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" podStartSLOduration=6.755793242 podStartE2EDuration="6.755793242s" podCreationTimestamp="2026-04-16 20:37:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:37:48.754696069 +0000 UTC m=+1560.191684578" watchObservedRunningTime="2026-04-16 20:37:48.755793242 +0000 UTC m=+1560.192781751" Apr 16 20:37:49.726379 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:49.726343 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" event={"ID":"87f0afae-0f58-43cc-ae75-2f48461c5521","Type":"ContainerStarted","Data":"9c569144e7a9eb3b3a65d96ec8cb724b4fb1cbbb7d38b31e1e86a9f6e1b6845c"} Apr 16 20:37:49.751421 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:49.751356 2572 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" podStartSLOduration=7.751336793 podStartE2EDuration="7.751336793s" podCreationTimestamp="2026-04-16 20:37:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:37:49.747583116 +0000 UTC m=+1561.184571623" watchObservedRunningTime="2026-04-16 20:37:49.751336793 +0000 UTC m=+1561.188325302" Apr 16 20:37:52.963587 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:52.963549 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" Apr 16 20:37:52.964059 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:52.963604 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" Apr 16 20:37:52.965246 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:52.965212 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" podUID="87f0afae-0f58-43cc-ae75-2f48461c5521" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8001/health\": dial tcp 10.134.0.49:8001: connect: connection refused" Apr 16 20:37:52.975434 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:52.975411 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" Apr 16 20:37:52.975531 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:52.975454 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" Apr 16 20:37:52.976780 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:37:52.976751 2572 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" podUID="02b9605c-659b-4584-bb72-da8846082a5e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 16 20:38:02.963835 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:38:02.963786 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" podUID="87f0afae-0f58-43cc-ae75-2f48461c5521" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8001/health\": dial tcp 10.134.0.49:8001: connect: connection refused" Apr 16 20:38:02.976142 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:38:02.976097 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" podUID="02b9605c-659b-4584-bb72-da8846082a5e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 16 20:38:02.981223 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:38:02.981201 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" Apr 16 20:38:12.963775 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:38:12.963718 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" podUID="87f0afae-0f58-43cc-ae75-2f48461c5521" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8001/health\": dial tcp 10.134.0.49:8001: connect: connection refused" Apr 16 20:38:12.975382 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:38:12.975345 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" podUID="02b9605c-659b-4584-bb72-da8846082a5e" 
containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 16 20:38:22.963465 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:38:22.963418 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" podUID="87f0afae-0f58-43cc-ae75-2f48461c5521" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8001/health\": dial tcp 10.134.0.49:8001: connect: connection refused" Apr 16 20:38:22.975280 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:38:22.975242 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" podUID="02b9605c-659b-4584-bb72-da8846082a5e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 16 20:38:32.963917 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:38:32.963796 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" podUID="87f0afae-0f58-43cc-ae75-2f48461c5521" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8001/health\": dial tcp 10.134.0.49:8001: connect: connection refused" Apr 16 20:38:32.976026 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:38:32.975983 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" podUID="02b9605c-659b-4584-bb72-da8846082a5e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 16 20:38:42.963820 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:38:42.963763 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" 
podUID="87f0afae-0f58-43cc-ae75-2f48461c5521" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8001/health\": dial tcp 10.134.0.49:8001: connect: connection refused" Apr 16 20:38:42.976137 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:38:42.976093 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" podUID="02b9605c-659b-4584-bb72-da8846082a5e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 16 20:38:52.963755 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:38:52.963694 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" podUID="87f0afae-0f58-43cc-ae75-2f48461c5521" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8001/health\": dial tcp 10.134.0.49:8001: connect: connection refused" Apr 16 20:38:52.975420 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:38:52.975378 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" podUID="02b9605c-659b-4584-bb72-da8846082a5e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 16 20:39:02.963889 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:39:02.963832 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" podUID="87f0afae-0f58-43cc-ae75-2f48461c5521" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8001/health\": dial tcp 10.134.0.49:8001: connect: connection refused" Apr 16 20:39:02.975270 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:39:02.975239 2572 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" podUID="02b9605c-659b-4584-bb72-da8846082a5e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 16 20:39:12.963721 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:39:12.963667 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" podUID="87f0afae-0f58-43cc-ae75-2f48461c5521" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8001/health\": dial tcp 10.134.0.49:8001: connect: connection refused" Apr 16 20:39:12.976206 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:39:12.976157 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" podUID="02b9605c-659b-4584-bb72-da8846082a5e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 16 20:39:22.963689 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:39:22.963638 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" podUID="87f0afae-0f58-43cc-ae75-2f48461c5521" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8001/health\": dial tcp 10.134.0.49:8001: connect: connection refused" Apr 16 20:39:22.976008 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:39:22.975967 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" podUID="02b9605c-659b-4584-bb72-da8846082a5e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 16 20:39:32.964024 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:39:32.963976 2572 
prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" podUID="87f0afae-0f58-43cc-ae75-2f48461c5521" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8001/health\": dial tcp 10.134.0.49:8001: connect: connection refused" Apr 16 20:39:32.975926 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:39:32.975865 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" podUID="02b9605c-659b-4584-bb72-da8846082a5e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 16 20:39:42.963639 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:39:42.963581 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" podUID="87f0afae-0f58-43cc-ae75-2f48461c5521" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8001/health\": dial tcp 10.134.0.49:8001: connect: connection refused" Apr 16 20:39:42.976538 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:39:42.976477 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" podUID="02b9605c-659b-4584-bb72-da8846082a5e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 16 20:39:52.964163 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:39:52.964101 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" podUID="87f0afae-0f58-43cc-ae75-2f48461c5521" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8001/health\": dial tcp 10.134.0.49:8001: connect: connection refused" Apr 16 20:39:52.975456 ip-10-0-138-62 
kubenswrapper[2572]: I0416 20:39:52.975417 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" podUID="02b9605c-659b-4584-bb72-da8846082a5e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 16 20:40:02.976948 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:40:02.976854 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" Apr 16 20:40:02.985709 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:40:02.985683 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" Apr 16 20:40:02.989599 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:40:02.989579 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" Apr 16 20:40:02.993807 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:40:02.993789 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" Apr 16 20:41:49.220529 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:41:49.220427 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9gf47_10232a70-9b5f-414a-8efd-b5cff05a4f12/ovn-acl-logging/0.log" Apr 16 20:41:49.230790 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:41:49.230768 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9gf47_10232a70-9b5f-414a-8efd-b5cff05a4f12/ovn-acl-logging/0.log" Apr 16 20:44:18.332290 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:18.332193 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h"] Apr 16 20:44:18.332941 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:18.332654 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" podUID="02b9605c-659b-4584-bb72-da8846082a5e" containerName="main" containerID="cri-o://cc5aaa67cf2e651faadbc89b1532ac4c6e598dfe74c892dcc3cf914d6e8d06c2" gracePeriod=30 Apr 16 20:44:18.336116 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:18.336089 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb"] Apr 16 20:44:18.336515 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:18.336478 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" podUID="87f0afae-0f58-43cc-ae75-2f48461c5521" containerName="main" containerID="cri-o://9c569144e7a9eb3b3a65d96ec8cb724b4fb1cbbb7d38b31e1e86a9f6e1b6845c" gracePeriod=30 Apr 16 20:44:33.586353 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:33.586318 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-wl5qf_2b96d7b9-1051-445b-b781-74ad3279c03a/istio-proxy/0.log" Apr 16 20:44:33.624860 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:33.624828 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/main/0.log" Apr 16 20:44:33.634539 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:33.634511 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/llm-d-routing-sidecar/0.log" Apr 16 20:44:33.644516 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:33.644487 
2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/storage-initializer/0.log" Apr 16 20:44:33.666739 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:33.666709 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h_02b9605c-659b-4584-bb72-da8846082a5e/main/0.log" Apr 16 20:44:33.674599 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:33.674570 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h_02b9605c-659b-4584-bb72-da8846082a5e/storage-initializer/0.log" Apr 16 20:44:34.593229 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:34.593199 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-wl5qf_2b96d7b9-1051-445b-b781-74ad3279c03a/istio-proxy/0.log" Apr 16 20:44:34.618278 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:34.618232 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/main/0.log" Apr 16 20:44:34.628826 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:34.628798 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/llm-d-routing-sidecar/0.log" Apr 16 20:44:34.638133 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:34.638112 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/storage-initializer/0.log" Apr 16 20:44:34.659017 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:34.658983 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h_02b9605c-659b-4584-bb72-da8846082a5e/main/0.log" Apr 16 20:44:34.666277 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:34.666251 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h_02b9605c-659b-4584-bb72-da8846082a5e/storage-initializer/0.log" Apr 16 20:44:35.624265 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:35.624231 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-wl5qf_2b96d7b9-1051-445b-b781-74ad3279c03a/istio-proxy/0.log" Apr 16 20:44:35.651043 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:35.651011 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/main/0.log" Apr 16 20:44:35.658736 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:35.658708 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/llm-d-routing-sidecar/0.log" Apr 16 20:44:35.667174 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:35.667153 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/storage-initializer/0.log" Apr 16 20:44:35.686731 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:35.686698 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h_02b9605c-659b-4584-bb72-da8846082a5e/main/0.log" Apr 16 20:44:35.693030 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:35.693007 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h_02b9605c-659b-4584-bb72-da8846082a5e/storage-initializer/0.log" Apr 16 20:44:36.607413 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:36.607385 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-wl5qf_2b96d7b9-1051-445b-b781-74ad3279c03a/istio-proxy/0.log" Apr 16 20:44:36.634251 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:36.634219 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/main/0.log" Apr 16 20:44:36.642157 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:36.642133 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/llm-d-routing-sidecar/0.log" Apr 16 20:44:36.651517 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:36.651496 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/storage-initializer/0.log" Apr 16 20:44:36.671505 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:36.671473 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h_02b9605c-659b-4584-bb72-da8846082a5e/main/0.log" Apr 16 20:44:36.678037 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:36.678017 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h_02b9605c-659b-4584-bb72-da8846082a5e/storage-initializer/0.log" Apr 16 20:44:37.599827 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:37.599784 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-wl5qf_2b96d7b9-1051-445b-b781-74ad3279c03a/istio-proxy/0.log" Apr 16 20:44:37.628019 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:37.627986 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/main/0.log" Apr 16 20:44:37.636043 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:37.636014 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/llm-d-routing-sidecar/0.log" Apr 16 20:44:37.645284 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:37.645251 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/storage-initializer/0.log" Apr 16 20:44:37.665616 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:37.665589 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h_02b9605c-659b-4584-bb72-da8846082a5e/main/0.log" Apr 16 20:44:37.673148 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:37.673117 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h_02b9605c-659b-4584-bb72-da8846082a5e/storage-initializer/0.log" Apr 16 20:44:38.593030 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:38.592997 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-wl5qf_2b96d7b9-1051-445b-b781-74ad3279c03a/istio-proxy/0.log" Apr 16 20:44:38.618843 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:38.618819 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/main/0.log" Apr 16 20:44:38.627038 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:38.627015 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/llm-d-routing-sidecar/0.log" Apr 16 20:44:38.637321 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:38.637299 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/storage-initializer/0.log" Apr 16 20:44:38.658414 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:38.658379 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h_02b9605c-659b-4584-bb72-da8846082a5e/main/0.log" Apr 16 20:44:38.666161 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:38.666142 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h_02b9605c-659b-4584-bb72-da8846082a5e/storage-initializer/0.log" Apr 16 20:44:39.594846 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:39.594808 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-wl5qf_2b96d7b9-1051-445b-b781-74ad3279c03a/istio-proxy/0.log" Apr 16 20:44:39.620616 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:39.620586 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/main/0.log" Apr 16 20:44:39.628804 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:39.628775 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/llm-d-routing-sidecar/0.log" Apr 16 20:44:39.637284 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:39.637256 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/storage-initializer/0.log" Apr 16 20:44:39.658143 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:39.658113 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h_02b9605c-659b-4584-bb72-da8846082a5e/main/0.log" Apr 16 20:44:39.664196 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:39.664175 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h_02b9605c-659b-4584-bb72-da8846082a5e/storage-initializer/0.log" Apr 16 20:44:40.617783 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:40.617741 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-wl5qf_2b96d7b9-1051-445b-b781-74ad3279c03a/istio-proxy/0.log" Apr 16 20:44:40.646668 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:40.646637 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/main/0.log" Apr 16 20:44:40.654604 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:40.654572 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/llm-d-routing-sidecar/0.log" Apr 16 20:44:40.664425 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:40.664402 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/storage-initializer/0.log" Apr 16 20:44:40.685507 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:40.685482 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h_02b9605c-659b-4584-bb72-da8846082a5e/main/0.log" Apr 16 20:44:40.696843 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:40.696818 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h_02b9605c-659b-4584-bb72-da8846082a5e/storage-initializer/0.log" Apr 16 20:44:41.716682 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:41.716649 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-wl5qf_2b96d7b9-1051-445b-b781-74ad3279c03a/istio-proxy/0.log" Apr 16 20:44:41.741459 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:41.741426 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/main/0.log" Apr 16 20:44:41.750375 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:41.750349 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/llm-d-routing-sidecar/0.log" Apr 16 20:44:41.759861 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:41.759838 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/storage-initializer/0.log" Apr 16 20:44:41.784523 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:41.784500 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h_02b9605c-659b-4584-bb72-da8846082a5e/main/0.log" Apr 16 20:44:41.793679 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:41.793657 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h_02b9605c-659b-4584-bb72-da8846082a5e/storage-initializer/0.log" Apr 16 20:44:42.766536 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:42.766483 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-wl5qf_2b96d7b9-1051-445b-b781-74ad3279c03a/istio-proxy/0.log" Apr 16 20:44:42.791672 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:42.791641 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/main/0.log" Apr 16 20:44:42.799974 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:42.799943 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/llm-d-routing-sidecar/0.log" Apr 16 20:44:42.808941 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:42.808917 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/storage-initializer/0.log" Apr 16 20:44:42.829813 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:42.829790 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h_02b9605c-659b-4584-bb72-da8846082a5e/main/0.log" Apr 16 20:44:42.836315 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:42.836299 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h_02b9605c-659b-4584-bb72-da8846082a5e/storage-initializer/0.log" Apr 16 20:44:43.766995 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:43.766963 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-wl5qf_2b96d7b9-1051-445b-b781-74ad3279c03a/istio-proxy/0.log" Apr 16 20:44:43.792524 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:43.792495 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/main/0.log" Apr 16 20:44:43.800605 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:43.800574 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/llm-d-routing-sidecar/0.log" Apr 16 20:44:43.809954 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:43.809897 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/storage-initializer/0.log" Apr 16 20:44:43.830464 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:43.830437 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h_02b9605c-659b-4584-bb72-da8846082a5e/main/0.log" Apr 16 20:44:43.836539 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:43.836518 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h_02b9605c-659b-4584-bb72-da8846082a5e/storage-initializer/0.log" Apr 16 20:44:44.800251 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:44.800217 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-wl5qf_2b96d7b9-1051-445b-b781-74ad3279c03a/istio-proxy/0.log" Apr 16 20:44:44.824298 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:44.824273 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/main/0.log" Apr 16 20:44:44.832291 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:44.832265 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/llm-d-routing-sidecar/0.log" Apr 16 20:44:44.841180 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:44.841159 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/storage-initializer/0.log" Apr 16 20:44:44.860824 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:44.860797 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h_02b9605c-659b-4584-bb72-da8846082a5e/main/0.log" Apr 16 20:44:44.868034 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:44.868015 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h_02b9605c-659b-4584-bb72-da8846082a5e/storage-initializer/0.log" Apr 16 20:44:45.823693 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:45.823664 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-wl5qf_2b96d7b9-1051-445b-b781-74ad3279c03a/istio-proxy/0.log" Apr 16 20:44:45.848318 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:45.848289 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/main/0.log" Apr 16 20:44:45.857510 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:45.857483 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/llm-d-routing-sidecar/0.log" Apr 16 20:44:45.865912 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:45.865867 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/storage-initializer/0.log" Apr 16 20:44:45.897930 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:45.897897 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h_02b9605c-659b-4584-bb72-da8846082a5e/main/0.log" Apr 16 20:44:45.905215 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:45.905191 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h_02b9605c-659b-4584-bb72-da8846082a5e/storage-initializer/0.log" Apr 16 20:44:46.899636 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:46.899608 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-wl5qf_2b96d7b9-1051-445b-b781-74ad3279c03a/istio-proxy/0.log" Apr 16 20:44:46.927492 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:46.927456 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/main/0.log" Apr 16 20:44:46.935794 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:46.935770 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/llm-d-routing-sidecar/0.log" Apr 16 20:44:46.946072 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:46.946048 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/storage-initializer/0.log" Apr 16 20:44:46.967388 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:46.967365 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h_02b9605c-659b-4584-bb72-da8846082a5e/main/0.log" Apr 16 20:44:46.974470 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:46.974439 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h_02b9605c-659b-4584-bb72-da8846082a5e/storage-initializer/0.log" Apr 16 20:44:48.336809 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:48.336765 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" podUID="87f0afae-0f58-43cc-ae75-2f48461c5521" containerName="llm-d-routing-sidecar" containerID="cri-o://b9b4a3c3512fb5a4c1cbb1bbdd93ebe33b2d2e24514ff5e8f9e07594cd43e0c8" gracePeriod=2 Apr 16 20:44:48.636569 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:48.636549 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/main/0.log" Apr 16 20:44:48.637297 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:48.637279 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" Apr 16 20:44:48.640204 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:48.640187 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" Apr 16 20:44:48.754505 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:48.754471 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/87f0afae-0f58-43cc-ae75-2f48461c5521-tls-certs\") pod \"87f0afae-0f58-43cc-ae75-2f48461c5521\" (UID: \"87f0afae-0f58-43cc-ae75-2f48461c5521\") " Apr 16 20:44:48.754698 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:48.754520 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/02b9605c-659b-4584-bb72-da8846082a5e-kserve-provision-location\") pod \"02b9605c-659b-4584-bb72-da8846082a5e\" (UID: \"02b9605c-659b-4584-bb72-da8846082a5e\") " Apr 16 20:44:48.754698 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:48.754541 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/87f0afae-0f58-43cc-ae75-2f48461c5521-dshm\") pod \"87f0afae-0f58-43cc-ae75-2f48461c5521\" (UID: \"87f0afae-0f58-43cc-ae75-2f48461c5521\") " Apr 16 20:44:48.754698 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:48.754565 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9pwh\" (UniqueName: \"kubernetes.io/projected/02b9605c-659b-4584-bb72-da8846082a5e-kube-api-access-v9pwh\") pod \"02b9605c-659b-4584-bb72-da8846082a5e\" (UID: \"02b9605c-659b-4584-bb72-da8846082a5e\") " Apr 16 20:44:48.754698 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:48.754595 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"home\" (UniqueName: \"kubernetes.io/empty-dir/02b9605c-659b-4584-bb72-da8846082a5e-home\") pod \"02b9605c-659b-4584-bb72-da8846082a5e\" (UID: \"02b9605c-659b-4584-bb72-da8846082a5e\") " Apr 16 20:44:48.754698 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:48.754632 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/87f0afae-0f58-43cc-ae75-2f48461c5521-home\") pod \"87f0afae-0f58-43cc-ae75-2f48461c5521\" (UID: \"87f0afae-0f58-43cc-ae75-2f48461c5521\") " Apr 16 20:44:48.754698 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:48.754646 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kz82d\" (UniqueName: \"kubernetes.io/projected/87f0afae-0f58-43cc-ae75-2f48461c5521-kube-api-access-kz82d\") pod \"87f0afae-0f58-43cc-ae75-2f48461c5521\" (UID: \"87f0afae-0f58-43cc-ae75-2f48461c5521\") " Apr 16 20:44:48.754698 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:48.754679 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/02b9605c-659b-4584-bb72-da8846082a5e-model-cache\") pod \"02b9605c-659b-4584-bb72-da8846082a5e\" (UID: \"02b9605c-659b-4584-bb72-da8846082a5e\") " Apr 16 20:44:48.755111 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:48.754709 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/87f0afae-0f58-43cc-ae75-2f48461c5521-model-cache\") pod \"87f0afae-0f58-43cc-ae75-2f48461c5521\" (UID: \"87f0afae-0f58-43cc-ae75-2f48461c5521\") " Apr 16 20:44:48.755111 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:48.754735 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/02b9605c-659b-4584-bb72-da8846082a5e-tls-certs\") pod \"02b9605c-659b-4584-bb72-da8846082a5e\" (UID: 
\"02b9605c-659b-4584-bb72-da8846082a5e\") " Apr 16 20:44:48.755111 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:48.754781 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87f0afae-0f58-43cc-ae75-2f48461c5521-kserve-provision-location\") pod \"87f0afae-0f58-43cc-ae75-2f48461c5521\" (UID: \"87f0afae-0f58-43cc-ae75-2f48461c5521\") " Apr 16 20:44:48.755111 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:48.754806 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/02b9605c-659b-4584-bb72-da8846082a5e-dshm\") pod \"02b9605c-659b-4584-bb72-da8846082a5e\" (UID: \"02b9605c-659b-4584-bb72-da8846082a5e\") " Apr 16 20:44:48.755111 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:48.755022 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02b9605c-659b-4584-bb72-da8846082a5e-home" (OuterVolumeSpecName: "home") pod "02b9605c-659b-4584-bb72-da8846082a5e" (UID: "02b9605c-659b-4584-bb72-da8846082a5e"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:44:48.755111 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:48.755023 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87f0afae-0f58-43cc-ae75-2f48461c5521-home" (OuterVolumeSpecName: "home") pod "87f0afae-0f58-43cc-ae75-2f48461c5521" (UID: "87f0afae-0f58-43cc-ae75-2f48461c5521"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:44:48.755416 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:48.755165 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87f0afae-0f58-43cc-ae75-2f48461c5521-model-cache" (OuterVolumeSpecName: "model-cache") pod "87f0afae-0f58-43cc-ae75-2f48461c5521" (UID: "87f0afae-0f58-43cc-ae75-2f48461c5521"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:44:48.755416 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:48.755200 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/02b9605c-659b-4584-bb72-da8846082a5e-home\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:44:48.755416 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:48.755216 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/87f0afae-0f58-43cc-ae75-2f48461c5521-home\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:44:48.755860 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:48.755828 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02b9605c-659b-4584-bb72-da8846082a5e-model-cache" (OuterVolumeSpecName: "model-cache") pod "02b9605c-659b-4584-bb72-da8846082a5e" (UID: "02b9605c-659b-4584-bb72-da8846082a5e"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:44:48.757378 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:48.757349 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87f0afae-0f58-43cc-ae75-2f48461c5521-dshm" (OuterVolumeSpecName: "dshm") pod "87f0afae-0f58-43cc-ae75-2f48461c5521" (UID: "87f0afae-0f58-43cc-ae75-2f48461c5521"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:44:48.757510 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:48.757472 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02b9605c-659b-4584-bb72-da8846082a5e-kube-api-access-v9pwh" (OuterVolumeSpecName: "kube-api-access-v9pwh") pod "02b9605c-659b-4584-bb72-da8846082a5e" (UID: "02b9605c-659b-4584-bb72-da8846082a5e"). InnerVolumeSpecName "kube-api-access-v9pwh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:44:48.757585 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:48.757560 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87f0afae-0f58-43cc-ae75-2f48461c5521-kube-api-access-kz82d" (OuterVolumeSpecName: "kube-api-access-kz82d") pod "87f0afae-0f58-43cc-ae75-2f48461c5521" (UID: "87f0afae-0f58-43cc-ae75-2f48461c5521"). InnerVolumeSpecName "kube-api-access-kz82d". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:44:48.757697 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:48.757671 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87f0afae-0f58-43cc-ae75-2f48461c5521-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "87f0afae-0f58-43cc-ae75-2f48461c5521" (UID: "87f0afae-0f58-43cc-ae75-2f48461c5521"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:44:48.757838 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:48.757811 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02b9605c-659b-4584-bb72-da8846082a5e-dshm" (OuterVolumeSpecName: "dshm") pod "02b9605c-659b-4584-bb72-da8846082a5e" (UID: "02b9605c-659b-4584-bb72-da8846082a5e"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:44:48.758740 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:48.758721 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02b9605c-659b-4584-bb72-da8846082a5e-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "02b9605c-659b-4584-bb72-da8846082a5e" (UID: "02b9605c-659b-4584-bb72-da8846082a5e"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:44:48.816835 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:48.816777 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87f0afae-0f58-43cc-ae75-2f48461c5521-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "87f0afae-0f58-43cc-ae75-2f48461c5521" (UID: "87f0afae-0f58-43cc-ae75-2f48461c5521"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:44:48.817441 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:48.817416 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02b9605c-659b-4584-bb72-da8846082a5e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "02b9605c-659b-4584-bb72-da8846082a5e" (UID: "02b9605c-659b-4584-bb72-da8846082a5e"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:44:48.856591 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:48.856560 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kz82d\" (UniqueName: \"kubernetes.io/projected/87f0afae-0f58-43cc-ae75-2f48461c5521-kube-api-access-kz82d\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:44:48.856591 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:48.856589 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/02b9605c-659b-4584-bb72-da8846082a5e-model-cache\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:44:48.856591 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:48.856599 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/87f0afae-0f58-43cc-ae75-2f48461c5521-model-cache\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:44:48.856903 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:48.856608 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/02b9605c-659b-4584-bb72-da8846082a5e-tls-certs\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:44:48.856903 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:48.856617 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87f0afae-0f58-43cc-ae75-2f48461c5521-kserve-provision-location\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:44:48.856903 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:48.856625 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/02b9605c-659b-4584-bb72-da8846082a5e-dshm\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:44:48.856903 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:48.856635 2572 
reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/87f0afae-0f58-43cc-ae75-2f48461c5521-tls-certs\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:44:48.856903 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:48.856645 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/02b9605c-659b-4584-bb72-da8846082a5e-kserve-provision-location\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:44:48.856903 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:48.856653 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/87f0afae-0f58-43cc-ae75-2f48461c5521-dshm\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:44:48.856903 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:48.856661 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v9pwh\" (UniqueName: \"kubernetes.io/projected/02b9605c-659b-4584-bb72-da8846082a5e-kube-api-access-v9pwh\") on node \"ip-10-0-138-62.ec2.internal\" DevicePath \"\"" Apr 16 20:44:49.234672 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.234641 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_87f0afae-0f58-43cc-ae75-2f48461c5521/main/0.log" Apr 16 20:44:49.235262 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.235237 2572 generic.go:358] "Generic (PLEG): container finished" podID="87f0afae-0f58-43cc-ae75-2f48461c5521" containerID="9c569144e7a9eb3b3a65d96ec8cb724b4fb1cbbb7d38b31e1e86a9f6e1b6845c" exitCode=137 Apr 16 20:44:49.235262 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.235261 2572 generic.go:358] "Generic (PLEG): container finished" podID="87f0afae-0f58-43cc-ae75-2f48461c5521" containerID="b9b4a3c3512fb5a4c1cbb1bbdd93ebe33b2d2e24514ff5e8f9e07594cd43e0c8" exitCode=0 Apr 16 20:44:49.235442 
ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.235272 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" event={"ID":"87f0afae-0f58-43cc-ae75-2f48461c5521","Type":"ContainerDied","Data":"9c569144e7a9eb3b3a65d96ec8cb724b4fb1cbbb7d38b31e1e86a9f6e1b6845c"} Apr 16 20:44:49.235442 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.235316 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" event={"ID":"87f0afae-0f58-43cc-ae75-2f48461c5521","Type":"ContainerDied","Data":"b9b4a3c3512fb5a4c1cbb1bbdd93ebe33b2d2e24514ff5e8f9e07594cd43e0c8"} Apr 16 20:44:49.235442 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.235332 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" event={"ID":"87f0afae-0f58-43cc-ae75-2f48461c5521","Type":"ContainerDied","Data":"5f4139eaa8fd6eec4e39111e3d57ebe3b08e43ca0633839cbec4ed049fb495f2"} Apr 16 20:44:49.235442 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.235345 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb" Apr 16 20:44:49.235442 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.235352 2572 scope.go:117] "RemoveContainer" containerID="9c569144e7a9eb3b3a65d96ec8cb724b4fb1cbbb7d38b31e1e86a9f6e1b6845c" Apr 16 20:44:49.237026 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.237003 2572 generic.go:358] "Generic (PLEG): container finished" podID="02b9605c-659b-4584-bb72-da8846082a5e" containerID="cc5aaa67cf2e651faadbc89b1532ac4c6e598dfe74c892dcc3cf914d6e8d06c2" exitCode=137 Apr 16 20:44:49.237136 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.237093 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" event={"ID":"02b9605c-659b-4584-bb72-da8846082a5e","Type":"ContainerDied","Data":"cc5aaa67cf2e651faadbc89b1532ac4c6e598dfe74c892dcc3cf914d6e8d06c2"} Apr 16 20:44:49.237136 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.237109 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" Apr 16 20:44:49.237367 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.237113 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h" event={"ID":"02b9605c-659b-4584-bb72-da8846082a5e","Type":"ContainerDied","Data":"d3e9e7358679430cae0c5972a34cbfd398451dfeb465a92d28fedd31c2b26fea"} Apr 16 20:44:49.256571 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.256551 2572 scope.go:117] "RemoveContainer" containerID="64d7b4d2991b068ff2db5c0a6abb42e65ae379ec7df94af515158fdb84891af4" Apr 16 20:44:49.256793 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.256771 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb"] Apr 16 20:44:49.259807 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.259781 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64dd4fd657-4vfjb"] Apr 16 20:44:49.265613 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.265593 2572 scope.go:117] "RemoveContainer" containerID="64d7b4d2991b068ff2db5c0a6abb42e65ae379ec7df94af515158fdb84891af4" Apr 16 20:44:49.269187 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.269163 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h"] Apr 16 20:44:49.274951 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.274927 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h"] Apr 16 20:44:49.321944 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.321921 2572 scope.go:117] "RemoveContainer" containerID="b9b4a3c3512fb5a4c1cbb1bbdd93ebe33b2d2e24514ff5e8f9e07594cd43e0c8" Apr 16 20:44:49.322040 ip-10-0-138-62 kubenswrapper[2572]: E0416 
20:44:49.321909 2572 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_kserve-ci-e2e-test_87f0afae-0f58-43cc-ae75-2f48461c5521_0 in pod sandbox 5f4139eaa8fd6eec4e39111e3d57ebe3b08e43ca0633839cbec4ed049fb495f2 from index: no such id: '64d7b4d2991b068ff2db5c0a6abb42e65ae379ec7df94af515158fdb84891af4'" containerID="64d7b4d2991b068ff2db5c0a6abb42e65ae379ec7df94af515158fdb84891af4" Apr 16 20:44:49.322040 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:44:49.322021 2572 kuberuntime_gc.go:150] "Failed to remove container" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_router-with-refs-pd-test-kserve-64dd4fd657-4vfjb_kserve-ci-e2e-test_87f0afae-0f58-43cc-ae75-2f48461c5521_0 in pod sandbox 5f4139eaa8fd6eec4e39111e3d57ebe3b08e43ca0633839cbec4ed049fb495f2 from index: no such id: '64d7b4d2991b068ff2db5c0a6abb42e65ae379ec7df94af515158fdb84891af4'" containerID="64d7b4d2991b068ff2db5c0a6abb42e65ae379ec7df94af515158fdb84891af4" Apr 16 20:44:49.322152 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.322048 2572 scope.go:117] "RemoveContainer" containerID="cc5aaa67cf2e651faadbc89b1532ac4c6e598dfe74c892dcc3cf914d6e8d06c2" Apr 16 20:44:49.329743 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.329723 2572 scope.go:117] "RemoveContainer" containerID="9c569144e7a9eb3b3a65d96ec8cb724b4fb1cbbb7d38b31e1e86a9f6e1b6845c" Apr 16 20:44:49.330040 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:44:49.330020 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c569144e7a9eb3b3a65d96ec8cb724b4fb1cbbb7d38b31e1e86a9f6e1b6845c\": container with ID starting with 9c569144e7a9eb3b3a65d96ec8cb724b4fb1cbbb7d38b31e1e86a9f6e1b6845c not found: ID does not exist" containerID="9c569144e7a9eb3b3a65d96ec8cb724b4fb1cbbb7d38b31e1e86a9f6e1b6845c" Apr 16 
20:44:49.330099 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.330050 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c569144e7a9eb3b3a65d96ec8cb724b4fb1cbbb7d38b31e1e86a9f6e1b6845c"} err="failed to get container status \"9c569144e7a9eb3b3a65d96ec8cb724b4fb1cbbb7d38b31e1e86a9f6e1b6845c\": rpc error: code = NotFound desc = could not find container \"9c569144e7a9eb3b3a65d96ec8cb724b4fb1cbbb7d38b31e1e86a9f6e1b6845c\": container with ID starting with 9c569144e7a9eb3b3a65d96ec8cb724b4fb1cbbb7d38b31e1e86a9f6e1b6845c not found: ID does not exist" Apr 16 20:44:49.330099 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.330068 2572 scope.go:117] "RemoveContainer" containerID="64d7b4d2991b068ff2db5c0a6abb42e65ae379ec7df94af515158fdb84891af4" Apr 16 20:44:49.330336 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:44:49.330319 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64d7b4d2991b068ff2db5c0a6abb42e65ae379ec7df94af515158fdb84891af4\": container with ID starting with 64d7b4d2991b068ff2db5c0a6abb42e65ae379ec7df94af515158fdb84891af4 not found: ID does not exist" containerID="64d7b4d2991b068ff2db5c0a6abb42e65ae379ec7df94af515158fdb84891af4" Apr 16 20:44:49.330388 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.330339 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64d7b4d2991b068ff2db5c0a6abb42e65ae379ec7df94af515158fdb84891af4"} err="failed to get container status \"64d7b4d2991b068ff2db5c0a6abb42e65ae379ec7df94af515158fdb84891af4\": rpc error: code = NotFound desc = could not find container \"64d7b4d2991b068ff2db5c0a6abb42e65ae379ec7df94af515158fdb84891af4\": container with ID starting with 64d7b4d2991b068ff2db5c0a6abb42e65ae379ec7df94af515158fdb84891af4 not found: ID does not exist" Apr 16 20:44:49.330388 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.330352 2572 scope.go:117] 
"RemoveContainer" containerID="b9b4a3c3512fb5a4c1cbb1bbdd93ebe33b2d2e24514ff5e8f9e07594cd43e0c8" Apr 16 20:44:49.330598 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:44:49.330580 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9b4a3c3512fb5a4c1cbb1bbdd93ebe33b2d2e24514ff5e8f9e07594cd43e0c8\": container with ID starting with b9b4a3c3512fb5a4c1cbb1bbdd93ebe33b2d2e24514ff5e8f9e07594cd43e0c8 not found: ID does not exist" containerID="b9b4a3c3512fb5a4c1cbb1bbdd93ebe33b2d2e24514ff5e8f9e07594cd43e0c8" Apr 16 20:44:49.330646 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.330603 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9b4a3c3512fb5a4c1cbb1bbdd93ebe33b2d2e24514ff5e8f9e07594cd43e0c8"} err="failed to get container status \"b9b4a3c3512fb5a4c1cbb1bbdd93ebe33b2d2e24514ff5e8f9e07594cd43e0c8\": rpc error: code = NotFound desc = could not find container \"b9b4a3c3512fb5a4c1cbb1bbdd93ebe33b2d2e24514ff5e8f9e07594cd43e0c8\": container with ID starting with b9b4a3c3512fb5a4c1cbb1bbdd93ebe33b2d2e24514ff5e8f9e07594cd43e0c8 not found: ID does not exist" Apr 16 20:44:49.330646 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.330624 2572 scope.go:117] "RemoveContainer" containerID="9c569144e7a9eb3b3a65d96ec8cb724b4fb1cbbb7d38b31e1e86a9f6e1b6845c" Apr 16 20:44:49.330887 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.330856 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c569144e7a9eb3b3a65d96ec8cb724b4fb1cbbb7d38b31e1e86a9f6e1b6845c"} err="failed to get container status \"9c569144e7a9eb3b3a65d96ec8cb724b4fb1cbbb7d38b31e1e86a9f6e1b6845c\": rpc error: code = NotFound desc = could not find container \"9c569144e7a9eb3b3a65d96ec8cb724b4fb1cbbb7d38b31e1e86a9f6e1b6845c\": container with ID starting with 9c569144e7a9eb3b3a65d96ec8cb724b4fb1cbbb7d38b31e1e86a9f6e1b6845c not found: ID does not exist" 
Apr 16 20:44:49.330939 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.330886 2572 scope.go:117] "RemoveContainer" containerID="64d7b4d2991b068ff2db5c0a6abb42e65ae379ec7df94af515158fdb84891af4" Apr 16 20:44:49.331105 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.331088 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64d7b4d2991b068ff2db5c0a6abb42e65ae379ec7df94af515158fdb84891af4"} err="failed to get container status \"64d7b4d2991b068ff2db5c0a6abb42e65ae379ec7df94af515158fdb84891af4\": rpc error: code = NotFound desc = could not find container \"64d7b4d2991b068ff2db5c0a6abb42e65ae379ec7df94af515158fdb84891af4\": container with ID starting with 64d7b4d2991b068ff2db5c0a6abb42e65ae379ec7df94af515158fdb84891af4 not found: ID does not exist" Apr 16 20:44:49.331160 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.331105 2572 scope.go:117] "RemoveContainer" containerID="b9b4a3c3512fb5a4c1cbb1bbdd93ebe33b2d2e24514ff5e8f9e07594cd43e0c8" Apr 16 20:44:49.331320 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.331301 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9b4a3c3512fb5a4c1cbb1bbdd93ebe33b2d2e24514ff5e8f9e07594cd43e0c8"} err="failed to get container status \"b9b4a3c3512fb5a4c1cbb1bbdd93ebe33b2d2e24514ff5e8f9e07594cd43e0c8\": rpc error: code = NotFound desc = could not find container \"b9b4a3c3512fb5a4c1cbb1bbdd93ebe33b2d2e24514ff5e8f9e07594cd43e0c8\": container with ID starting with b9b4a3c3512fb5a4c1cbb1bbdd93ebe33b2d2e24514ff5e8f9e07594cd43e0c8 not found: ID does not exist" Apr 16 20:44:49.331320 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.331318 2572 scope.go:117] "RemoveContainer" containerID="cc5aaa67cf2e651faadbc89b1532ac4c6e598dfe74c892dcc3cf914d6e8d06c2" Apr 16 20:44:49.343201 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.343009 2572 scope.go:117] "RemoveContainer" 
containerID="0f4799e9ecd8ed9b637c5d343b6cb5bfeb46ba4050733cb246ee9ef156817446" Apr 16 20:44:49.343201 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:44:49.343095 2572 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_main_router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h_kserve-ci-e2e-test_02b9605c-659b-4584-bb72-da8846082a5e_0 in pod sandbox d3e9e7358679430cae0c5972a34cbfd398451dfeb465a92d28fedd31c2b26fea from index: no such id: 'cc5aaa67cf2e651faadbc89b1532ac4c6e598dfe74c892dcc3cf914d6e8d06c2'" containerID="cc5aaa67cf2e651faadbc89b1532ac4c6e598dfe74c892dcc3cf914d6e8d06c2" Apr 16 20:44:49.343201 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.343128 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc5aaa67cf2e651faadbc89b1532ac4c6e598dfe74c892dcc3cf914d6e8d06c2"} err="rpc error: code = Unknown desc = failed to delete container k8s_main_router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h_kserve-ci-e2e-test_02b9605c-659b-4584-bb72-da8846082a5e_0 in pod sandbox d3e9e7358679430cae0c5972a34cbfd398451dfeb465a92d28fedd31c2b26fea from index: no such id: 'cc5aaa67cf2e651faadbc89b1532ac4c6e598dfe74c892dcc3cf914d6e8d06c2'" Apr 16 20:44:49.343201 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.343151 2572 scope.go:117] "RemoveContainer" containerID="0f4799e9ecd8ed9b637c5d343b6cb5bfeb46ba4050733cb246ee9ef156817446" Apr 16 20:44:49.415115 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.415073 2572 scope.go:117] "RemoveContainer" containerID="b9b4a3c3512fb5a4c1cbb1bbdd93ebe33b2d2e24514ff5e8f9e07594cd43e0c8" Apr 16 20:44:49.415189 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:44:49.415060 2572 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container 
k8s_storage-initializer_router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h_kserve-ci-e2e-test_02b9605c-659b-4584-bb72-da8846082a5e_0 in pod sandbox d3e9e7358679430cae0c5972a34cbfd398451dfeb465a92d28fedd31c2b26fea from index: no such id: '0f4799e9ecd8ed9b637c5d343b6cb5bfeb46ba4050733cb246ee9ef156817446'" containerID="0f4799e9ecd8ed9b637c5d343b6cb5bfeb46ba4050733cb246ee9ef156817446" Apr 16 20:44:49.415189 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.415172 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f4799e9ecd8ed9b637c5d343b6cb5bfeb46ba4050733cb246ee9ef156817446"} err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_router-with-refs-pd-test-kserve-prefill-5868f67c78-5ss6h_kserve-ci-e2e-test_02b9605c-659b-4584-bb72-da8846082a5e_0 in pod sandbox d3e9e7358679430cae0c5972a34cbfd398451dfeb465a92d28fedd31c2b26fea from index: no such id: '0f4799e9ecd8ed9b637c5d343b6cb5bfeb46ba4050733cb246ee9ef156817446'" Apr 16 20:44:49.415277 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.415192 2572 scope.go:117] "RemoveContainer" containerID="cc5aaa67cf2e651faadbc89b1532ac4c6e598dfe74c892dcc3cf914d6e8d06c2" Apr 16 20:44:49.415445 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:44:49.415416 2572 kuberuntime_gc.go:150] "Failed to remove container" err="failed to get container status \"b9b4a3c3512fb5a4c1cbb1bbdd93ebe33b2d2e24514ff5e8f9e07594cd43e0c8\": rpc error: code = NotFound desc = could not find container \"b9b4a3c3512fb5a4c1cbb1bbdd93ebe33b2d2e24514ff5e8f9e07594cd43e0c8\": container with ID starting with b9b4a3c3512fb5a4c1cbb1bbdd93ebe33b2d2e24514ff5e8f9e07594cd43e0c8 not found: ID does not exist" containerID="b9b4a3c3512fb5a4c1cbb1bbdd93ebe33b2d2e24514ff5e8f9e07594cd43e0c8" Apr 16 20:44:49.415754 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:44:49.415731 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"cc5aaa67cf2e651faadbc89b1532ac4c6e598dfe74c892dcc3cf914d6e8d06c2\": container with ID starting with cc5aaa67cf2e651faadbc89b1532ac4c6e598dfe74c892dcc3cf914d6e8d06c2 not found: ID does not exist" containerID="cc5aaa67cf2e651faadbc89b1532ac4c6e598dfe74c892dcc3cf914d6e8d06c2" Apr 16 20:44:49.415807 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.415763 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc5aaa67cf2e651faadbc89b1532ac4c6e598dfe74c892dcc3cf914d6e8d06c2"} err="failed to get container status \"cc5aaa67cf2e651faadbc89b1532ac4c6e598dfe74c892dcc3cf914d6e8d06c2\": rpc error: code = NotFound desc = could not find container \"cc5aaa67cf2e651faadbc89b1532ac4c6e598dfe74c892dcc3cf914d6e8d06c2\": container with ID starting with cc5aaa67cf2e651faadbc89b1532ac4c6e598dfe74c892dcc3cf914d6e8d06c2 not found: ID does not exist" Apr 16 20:44:49.415807 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.415779 2572 scope.go:117] "RemoveContainer" containerID="0f4799e9ecd8ed9b637c5d343b6cb5bfeb46ba4050733cb246ee9ef156817446" Apr 16 20:44:49.416052 ip-10-0-138-62 kubenswrapper[2572]: E0416 20:44:49.416026 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f4799e9ecd8ed9b637c5d343b6cb5bfeb46ba4050733cb246ee9ef156817446\": container with ID starting with 0f4799e9ecd8ed9b637c5d343b6cb5bfeb46ba4050733cb246ee9ef156817446 not found: ID does not exist" containerID="0f4799e9ecd8ed9b637c5d343b6cb5bfeb46ba4050733cb246ee9ef156817446" Apr 16 20:44:49.416100 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.416061 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f4799e9ecd8ed9b637c5d343b6cb5bfeb46ba4050733cb246ee9ef156817446"} err="failed to get container status \"0f4799e9ecd8ed9b637c5d343b6cb5bfeb46ba4050733cb246ee9ef156817446\": rpc error: code = NotFound desc = could not find container 
\"0f4799e9ecd8ed9b637c5d343b6cb5bfeb46ba4050733cb246ee9ef156817446\": container with ID starting with 0f4799e9ecd8ed9b637c5d343b6cb5bfeb46ba4050733cb246ee9ef156817446 not found: ID does not exist" Apr 16 20:44:49.762363 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:49.762329 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-xjsmb_080624af-c2d1-4ce7-9bd4-ed7d3c4d61bf/manager/0.log" Apr 16 20:44:51.149861 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:51.149828 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02b9605c-659b-4584-bb72-da8846082a5e" path="/var/lib/kubelet/pods/02b9605c-659b-4584-bb72-da8846082a5e/volumes" Apr 16 20:44:51.150287 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:51.150272 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87f0afae-0f58-43cc-ae75-2f48461c5521" path="/var/lib/kubelet/pods/87f0afae-0f58-43cc-ae75-2f48461c5521/volumes" Apr 16 20:44:52.113930 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:52.113891 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d58bg/must-gather-2z47l"] Apr 16 20:44:52.114279 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:52.114266 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="02b9605c-659b-4584-bb72-da8846082a5e" containerName="main" Apr 16 20:44:52.114324 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:52.114281 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="02b9605c-659b-4584-bb72-da8846082a5e" containerName="main" Apr 16 20:44:52.114324 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:52.114301 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87f0afae-0f58-43cc-ae75-2f48461c5521" containerName="main" Apr 16 20:44:52.114324 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:52.114306 2572 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="87f0afae-0f58-43cc-ae75-2f48461c5521" containerName="main" Apr 16 20:44:52.114324 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:52.114318 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="02b9605c-659b-4584-bb72-da8846082a5e" containerName="storage-initializer" Apr 16 20:44:52.114447 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:52.114327 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="02b9605c-659b-4584-bb72-da8846082a5e" containerName="storage-initializer" Apr 16 20:44:52.114447 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:52.114343 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87f0afae-0f58-43cc-ae75-2f48461c5521" containerName="llm-d-routing-sidecar" Apr 16 20:44:52.114447 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:52.114348 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="87f0afae-0f58-43cc-ae75-2f48461c5521" containerName="llm-d-routing-sidecar" Apr 16 20:44:52.114447 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:52.114356 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87f0afae-0f58-43cc-ae75-2f48461c5521" containerName="storage-initializer" Apr 16 20:44:52.114447 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:52.114361 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="87f0afae-0f58-43cc-ae75-2f48461c5521" containerName="storage-initializer" Apr 16 20:44:52.114447 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:52.114417 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="02b9605c-659b-4584-bb72-da8846082a5e" containerName="main" Apr 16 20:44:52.114447 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:52.114427 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="87f0afae-0f58-43cc-ae75-2f48461c5521" containerName="main" Apr 16 20:44:52.114447 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:52.114439 2572 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="87f0afae-0f58-43cc-ae75-2f48461c5521" containerName="llm-d-routing-sidecar" Apr 16 20:44:52.121507 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:52.121484 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d58bg/must-gather-2z47l" Apr 16 20:44:52.124196 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:52.124166 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-d58bg\"/\"default-dockercfg-fs8m8\"" Apr 16 20:44:52.124337 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:52.124224 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-d58bg\"/\"kube-root-ca.crt\"" Apr 16 20:44:52.124337 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:52.124241 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-d58bg\"/\"openshift-service-ca.crt\"" Apr 16 20:44:52.124743 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:52.124721 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d58bg/must-gather-2z47l"] Apr 16 20:44:52.184420 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:52.184385 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz6nn\" (UniqueName: \"kubernetes.io/projected/3eae60f4-5ea4-46b0-b10d-fb94eeb89fe3-kube-api-access-zz6nn\") pod \"must-gather-2z47l\" (UID: \"3eae60f4-5ea4-46b0-b10d-fb94eeb89fe3\") " pod="openshift-must-gather-d58bg/must-gather-2z47l" Apr 16 20:44:52.184791 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:52.184447 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3eae60f4-5ea4-46b0-b10d-fb94eeb89fe3-must-gather-output\") pod \"must-gather-2z47l\" (UID: \"3eae60f4-5ea4-46b0-b10d-fb94eeb89fe3\") " 
pod="openshift-must-gather-d58bg/must-gather-2z47l" Apr 16 20:44:52.285849 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:52.285817 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3eae60f4-5ea4-46b0-b10d-fb94eeb89fe3-must-gather-output\") pod \"must-gather-2z47l\" (UID: \"3eae60f4-5ea4-46b0-b10d-fb94eeb89fe3\") " pod="openshift-must-gather-d58bg/must-gather-2z47l" Apr 16 20:44:52.286017 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:52.285911 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zz6nn\" (UniqueName: \"kubernetes.io/projected/3eae60f4-5ea4-46b0-b10d-fb94eeb89fe3-kube-api-access-zz6nn\") pod \"must-gather-2z47l\" (UID: \"3eae60f4-5ea4-46b0-b10d-fb94eeb89fe3\") " pod="openshift-must-gather-d58bg/must-gather-2z47l" Apr 16 20:44:52.286159 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:52.286138 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3eae60f4-5ea4-46b0-b10d-fb94eeb89fe3-must-gather-output\") pod \"must-gather-2z47l\" (UID: \"3eae60f4-5ea4-46b0-b10d-fb94eeb89fe3\") " pod="openshift-must-gather-d58bg/must-gather-2z47l" Apr 16 20:44:52.293803 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:52.293768 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz6nn\" (UniqueName: \"kubernetes.io/projected/3eae60f4-5ea4-46b0-b10d-fb94eeb89fe3-kube-api-access-zz6nn\") pod \"must-gather-2z47l\" (UID: \"3eae60f4-5ea4-46b0-b10d-fb94eeb89fe3\") " pod="openshift-must-gather-d58bg/must-gather-2z47l" Apr 16 20:44:52.431394 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:52.431316 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d58bg/must-gather-2z47l" Apr 16 20:44:52.554130 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:52.554101 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d58bg/must-gather-2z47l"] Apr 16 20:44:52.557621 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:44:52.557595 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3eae60f4_5ea4_46b0_b10d_fb94eeb89fe3.slice/crio-1ec00d81d37f74a2464ae75b4f97bf6e75d6ca15e3adb98cc84df9c81c91734f WatchSource:0}: Error finding container 1ec00d81d37f74a2464ae75b4f97bf6e75d6ca15e3adb98cc84df9c81c91734f: Status 404 returned error can't find the container with id 1ec00d81d37f74a2464ae75b4f97bf6e75d6ca15e3adb98cc84df9c81c91734f Apr 16 20:44:52.559549 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:52.559531 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:44:53.251810 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:53.251777 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d58bg/must-gather-2z47l" event={"ID":"3eae60f4-5ea4-46b0-b10d-fb94eeb89fe3","Type":"ContainerStarted","Data":"1ec00d81d37f74a2464ae75b4f97bf6e75d6ca15e3adb98cc84df9c81c91734f"} Apr 16 20:44:54.260791 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:54.260753 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d58bg/must-gather-2z47l" event={"ID":"3eae60f4-5ea4-46b0-b10d-fb94eeb89fe3","Type":"ContainerStarted","Data":"3d6b30c0c702cfbdb307d35858689f5f663db371640b97bf498f3d7200c13114"} Apr 16 20:44:54.261188 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:54.260798 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d58bg/must-gather-2z47l" 
event={"ID":"3eae60f4-5ea4-46b0-b10d-fb94eeb89fe3","Type":"ContainerStarted","Data":"2516a9c3cf020a08179f6ecce7185af5b8f4ab32aea74ec015c33733bd693d7c"}
Apr 16 20:44:54.275036 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:54.274971 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-d58bg/must-gather-2z47l" podStartSLOduration=1.251376021 podStartE2EDuration="2.274954563s" podCreationTimestamp="2026-04-16 20:44:52 +0000 UTC" firstStartedPulling="2026-04-16 20:44:52.559685403 +0000 UTC m=+1983.996673889" lastFinishedPulling="2026-04-16 20:44:53.583263942 +0000 UTC m=+1985.020252431" observedRunningTime="2026-04-16 20:44:54.274629574 +0000 UTC m=+1985.711618082" watchObservedRunningTime="2026-04-16 20:44:54.274954563 +0000 UTC m=+1985.711943071"
Apr 16 20:44:55.228650 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:55.228616 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-h5zvt_4ee8f3a2-dd63-4782-a67d-759803bc1b0d/global-pull-secret-syncer/0.log"
Apr 16 20:44:55.326552 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:55.326514 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-fw6bf_b897f3d6-5177-401c-abab-c0301641c018/konnectivity-agent/0.log"
Apr 16 20:44:55.375393 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:55.375365 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-62.ec2.internal_025d9da832e6f15c9967affc15b6b9e5/haproxy/0.log"
Apr 16 20:44:59.340061 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:44:59.335781 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-xjsmb_080624af-c2d1-4ce7-9bd4-ed7d3c4d61bf/manager/0.log"
Apr 16 20:45:00.454190 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:00.454110 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3dc3a448-0a69-45c2-9725-08e1794e02d3/alertmanager/0.log"
Apr 16 20:45:00.496282 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:00.496251 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3dc3a448-0a69-45c2-9725-08e1794e02d3/config-reloader/0.log"
Apr 16 20:45:00.542133 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:00.542095 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3dc3a448-0a69-45c2-9725-08e1794e02d3/kube-rbac-proxy-web/0.log"
Apr 16 20:45:00.601851 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:00.601818 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3dc3a448-0a69-45c2-9725-08e1794e02d3/kube-rbac-proxy/0.log"
Apr 16 20:45:00.638539 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:00.638462 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3dc3a448-0a69-45c2-9725-08e1794e02d3/kube-rbac-proxy-metric/0.log"
Apr 16 20:45:00.669176 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:00.669142 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3dc3a448-0a69-45c2-9725-08e1794e02d3/prom-label-proxy/0.log"
Apr 16 20:45:00.693642 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:00.693609 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3dc3a448-0a69-45c2-9725-08e1794e02d3/init-config-reloader/0.log"
Apr 16 20:45:00.734314 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:00.734232 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-tc597_611adbad-0f1f-4c66-8f17-17f5a789a903/cluster-monitoring-operator/0.log"
Apr 16 20:45:00.833360 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:00.833324 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-6b8899fd8-cjcz2_39ed6675-59d4-42e2-ab00-f407f8c1db04/metrics-server/0.log"
Apr 16 20:45:00.892625 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:00.892592 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-g5wtx_87150cb8-5f8b-431e-9a3e-04dc45ef494c/node-exporter/0.log"
Apr 16 20:45:00.913972 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:00.913931 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-g5wtx_87150cb8-5f8b-431e-9a3e-04dc45ef494c/kube-rbac-proxy/0.log"
Apr 16 20:45:00.933670 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:00.933590 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-g5wtx_87150cb8-5f8b-431e-9a3e-04dc45ef494c/init-textfile/0.log"
Apr 16 20:45:01.227179 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:01.227146 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_9422c34e-0f4b-499a-bf52-61c9b32c315a/prometheus/0.log"
Apr 16 20:45:01.247386 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:01.247359 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_9422c34e-0f4b-499a-bf52-61c9b32c315a/config-reloader/0.log"
Apr 16 20:45:01.268790 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:01.268764 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_9422c34e-0f4b-499a-bf52-61c9b32c315a/thanos-sidecar/0.log"
Apr 16 20:45:01.293178 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:01.293150 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_9422c34e-0f4b-499a-bf52-61c9b32c315a/kube-rbac-proxy-web/0.log"
Apr 16 20:45:01.317494 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:01.317463 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_9422c34e-0f4b-499a-bf52-61c9b32c315a/kube-rbac-proxy/0.log"
Apr 16 20:45:01.342488 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:01.342413 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_9422c34e-0f4b-499a-bf52-61c9b32c315a/kube-rbac-proxy-thanos/0.log"
Apr 16 20:45:01.366279 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:01.366246 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_9422c34e-0f4b-499a-bf52-61c9b32c315a/init-config-reloader/0.log"
Apr 16 20:45:01.403849 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:01.403813 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-v98ll_37613547-0ef2-4819-bd32-36b4865cf714/prometheus-operator/0.log"
Apr 16 20:45:01.426746 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:01.426721 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-v98ll_37613547-0ef2-4819-bd32-36b4865cf714/kube-rbac-proxy/0.log"
Apr 16 20:45:01.455506 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:01.455474 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-6rfzr_3958a2b6-30b2-4633-b471-4e059b8de73a/prometheus-operator-admission-webhook/0.log"
Apr 16 20:45:02.915613 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:02.915525 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-9gbgs_8358b0b1-4c45-4118-bd03-a851a409b99e/networking-console-plugin/0.log"
Apr 16 20:45:03.894117 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:03.894075 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d58bg/perf-node-gather-daemonset-594lb"]
Apr 16 20:45:03.897796 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:03.897775 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-594lb"
Apr 16 20:45:03.904472 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:03.904441 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d58bg/perf-node-gather-daemonset-594lb"]
Apr 16 20:45:04.006661 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:04.006620 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/42a54b0d-f125-4b9b-80cf-2864c519c43c-sys\") pod \"perf-node-gather-daemonset-594lb\" (UID: \"42a54b0d-f125-4b9b-80cf-2864c519c43c\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-594lb"
Apr 16 20:45:04.007170 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:04.006777 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjgnt\" (UniqueName: \"kubernetes.io/projected/42a54b0d-f125-4b9b-80cf-2864c519c43c-kube-api-access-mjgnt\") pod \"perf-node-gather-daemonset-594lb\" (UID: \"42a54b0d-f125-4b9b-80cf-2864c519c43c\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-594lb"
Apr 16 20:45:04.007170 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:04.006847 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/42a54b0d-f125-4b9b-80cf-2864c519c43c-lib-modules\") pod \"perf-node-gather-daemonset-594lb\" (UID: \"42a54b0d-f125-4b9b-80cf-2864c519c43c\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-594lb"
Apr 16 20:45:04.007170 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:04.006959 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/42a54b0d-f125-4b9b-80cf-2864c519c43c-podres\") pod \"perf-node-gather-daemonset-594lb\" (UID: \"42a54b0d-f125-4b9b-80cf-2864c519c43c\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-594lb"
Apr 16 20:45:04.007170 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:04.007021 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/42a54b0d-f125-4b9b-80cf-2864c519c43c-proc\") pod \"perf-node-gather-daemonset-594lb\" (UID: \"42a54b0d-f125-4b9b-80cf-2864c519c43c\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-594lb"
Apr 16 20:45:04.107837 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:04.107795 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/42a54b0d-f125-4b9b-80cf-2864c519c43c-sys\") pod \"perf-node-gather-daemonset-594lb\" (UID: \"42a54b0d-f125-4b9b-80cf-2864c519c43c\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-594lb"
Apr 16 20:45:04.107837 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:04.107843 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mjgnt\" (UniqueName: \"kubernetes.io/projected/42a54b0d-f125-4b9b-80cf-2864c519c43c-kube-api-access-mjgnt\") pod \"perf-node-gather-daemonset-594lb\" (UID: \"42a54b0d-f125-4b9b-80cf-2864c519c43c\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-594lb"
Apr 16 20:45:04.108116 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:04.107866 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/42a54b0d-f125-4b9b-80cf-2864c519c43c-lib-modules\") pod \"perf-node-gather-daemonset-594lb\" (UID: \"42a54b0d-f125-4b9b-80cf-2864c519c43c\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-594lb"
Apr 16 20:45:04.108116 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:04.107921 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/42a54b0d-f125-4b9b-80cf-2864c519c43c-podres\") pod \"perf-node-gather-daemonset-594lb\" (UID: \"42a54b0d-f125-4b9b-80cf-2864c519c43c\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-594lb"
Apr 16 20:45:04.108116 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:04.107952 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/42a54b0d-f125-4b9b-80cf-2864c519c43c-proc\") pod \"perf-node-gather-daemonset-594lb\" (UID: \"42a54b0d-f125-4b9b-80cf-2864c519c43c\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-594lb"
Apr 16 20:45:04.108116 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:04.107950 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/42a54b0d-f125-4b9b-80cf-2864c519c43c-sys\") pod \"perf-node-gather-daemonset-594lb\" (UID: \"42a54b0d-f125-4b9b-80cf-2864c519c43c\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-594lb"
Apr 16 20:45:04.108116 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:04.108025 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/42a54b0d-f125-4b9b-80cf-2864c519c43c-proc\") pod \"perf-node-gather-daemonset-594lb\" (UID: \"42a54b0d-f125-4b9b-80cf-2864c519c43c\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-594lb"
Apr 16 20:45:04.108116 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:04.108045 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/42a54b0d-f125-4b9b-80cf-2864c519c43c-lib-modules\") pod \"perf-node-gather-daemonset-594lb\" (UID: \"42a54b0d-f125-4b9b-80cf-2864c519c43c\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-594lb"
Apr 16 20:45:04.108116 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:04.108053 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/42a54b0d-f125-4b9b-80cf-2864c519c43c-podres\") pod \"perf-node-gather-daemonset-594lb\" (UID: \"42a54b0d-f125-4b9b-80cf-2864c519c43c\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-594lb"
Apr 16 20:45:04.115093 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:04.115068 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjgnt\" (UniqueName: \"kubernetes.io/projected/42a54b0d-f125-4b9b-80cf-2864c519c43c-kube-api-access-mjgnt\") pod \"perf-node-gather-daemonset-594lb\" (UID: \"42a54b0d-f125-4b9b-80cf-2864c519c43c\") " pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-594lb"
Apr 16 20:45:04.213177 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:04.210007 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-594lb"
Apr 16 20:45:04.363005 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:04.362969 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d58bg/perf-node-gather-daemonset-594lb"]
Apr 16 20:45:04.365498 ip-10-0-138-62 kubenswrapper[2572]: W0416 20:45:04.365464 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod42a54b0d_f125_4b9b_80cf_2864c519c43c.slice/crio-18821fb23ba933b9d8ece2e1d0fc02d7469bf1fac945644332563ff97225d1e4 WatchSource:0}: Error finding container 18821fb23ba933b9d8ece2e1d0fc02d7469bf1fac945644332563ff97225d1e4: Status 404 returned error can't find the container with id 18821fb23ba933b9d8ece2e1d0fc02d7469bf1fac945644332563ff97225d1e4
Apr 16 20:45:05.134111 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:05.134074 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-nwtd6_7621e257-90f3-4f74-a511-c5bfd075ff99/dns/0.log"
Apr 16 20:45:05.154557 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:05.154530 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-nwtd6_7621e257-90f3-4f74-a511-c5bfd075ff99/kube-rbac-proxy/0.log"
Apr 16 20:45:05.221842 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:05.221809 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-d277z_8072a527-0c85-4b4a-a30d-ee0ca50bec0a/dns-node-resolver/0.log"
Apr 16 20:45:05.335617 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:05.335577 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-594lb" event={"ID":"42a54b0d-f125-4b9b-80cf-2864c519c43c","Type":"ContainerStarted","Data":"4e0e2fe7c9f175b836ca4fd4a46bd453dc976c2a326f050f3b671f08dc4eea70"}
Apr 16 20:45:05.335617 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:05.335620 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-594lb" event={"ID":"42a54b0d-f125-4b9b-80cf-2864c519c43c","Type":"ContainerStarted","Data":"18821fb23ba933b9d8ece2e1d0fc02d7469bf1fac945644332563ff97225d1e4"}
Apr 16 20:45:05.335846 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:05.335760 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-594lb"
Apr 16 20:45:05.352174 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:05.352120 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-594lb" podStartSLOduration=2.352100278 podStartE2EDuration="2.352100278s" podCreationTimestamp="2026-04-16 20:45:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:45:05.351396282 +0000 UTC m=+1996.788384790" watchObservedRunningTime="2026-04-16 20:45:05.352100278 +0000 UTC m=+1996.789088788"
Apr 16 20:45:05.711623 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:05.711578 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-6676cf5bc-962mq_93ae2cda-12dd-4710-b8f7-478ee5f42cf1/registry/0.log"
Apr 16 20:45:05.732443 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:05.732402 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4fgxn_c470127f-e9ca-44ba-bcef-cc2cd68cdcdc/node-ca/0.log"
Apr 16 20:45:07.145588 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:07.145561 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-tngt6_2559e4d7-87c4-4654-a66a-cf29280da85b/serve-healthcheck-canary/0.log"
Apr 16 20:45:07.670086 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:07.670047 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-84qmv_b2a87320-5da2-4e40-93d7-fffcb5c0c165/kube-rbac-proxy/0.log"
Apr 16 20:45:07.691017 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:07.690993 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-84qmv_b2a87320-5da2-4e40-93d7-fffcb5c0c165/exporter/0.log"
Apr 16 20:45:07.713365 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:07.713339 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-84qmv_b2a87320-5da2-4e40-93d7-fffcb5c0c165/extractor/0.log"
Apr 16 20:45:10.304065 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:10.304037 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-986f4df7-wvgks_e68d2697-1a8a-450e-8bbd-a1d3bd0decdd/manager/0.log"
Apr 16 20:45:10.928865 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:10.928822 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-549c5f665b-98qgl_7e560579-1b0b-46ef-a7e6-1944f268ce2a/manager/0.log"
Apr 16 20:45:10.947593 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:10.947566 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-p4pqc_12141ef2-0d8f-4ea5-aa93-d6464fa4059f/server/0.log"
Apr 16 20:45:11.150855 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:11.150822 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-mfsr7_59ac456f-4f01-491a-8b3a-4a12c727c4df/s3-init/0.log"
Apr 16 20:45:11.180929 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:11.180843 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-s6jzs_1f254ec8-2398-4afe-a00a-67591c7a1251/seaweedfs/0.log"
Apr 16 20:45:11.349374 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:11.349346 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-d58bg/perf-node-gather-daemonset-594lb"
Apr 16 20:45:16.292098 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:16.292044 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-n994t_b6b38707-68a6-4045-bbe5-f614a88439b1/kube-storage-version-migrator-operator/1.log"
Apr 16 20:45:16.293864 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:16.293833 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-n994t_b6b38707-68a6-4045-bbe5-f614a88439b1/kube-storage-version-migrator-operator/0.log"
Apr 16 20:45:17.475057 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:17.475024 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-l75p2_39673bf6-83cf-45c6-9476-3700a9d91e35/kube-multus-additional-cni-plugins/0.log"
Apr 16 20:45:17.497269 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:17.497229 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-l75p2_39673bf6-83cf-45c6-9476-3700a9d91e35/egress-router-binary-copy/0.log"
Apr 16 20:45:17.517974 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:17.517938 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-l75p2_39673bf6-83cf-45c6-9476-3700a9d91e35/cni-plugins/0.log"
Apr 16 20:45:17.537799 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:17.537776 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-l75p2_39673bf6-83cf-45c6-9476-3700a9d91e35/bond-cni-plugin/0.log"
Apr 16 20:45:17.561584 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:17.561556 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-l75p2_39673bf6-83cf-45c6-9476-3700a9d91e35/routeoverride-cni/0.log"
Apr 16 20:45:17.582091 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:17.582065 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-l75p2_39673bf6-83cf-45c6-9476-3700a9d91e35/whereabouts-cni-bincopy/0.log"
Apr 16 20:45:17.605727 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:17.605693 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-l75p2_39673bf6-83cf-45c6-9476-3700a9d91e35/whereabouts-cni/0.log"
Apr 16 20:45:17.830019 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:17.829974 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lc4v5_b5c961b8-9f06-4e0b-9e96-8bbef36b8380/kube-multus/0.log"
Apr 16 20:45:17.855918 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:17.855891 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-5bd8l_8e680955-60f7-4aaf-9aeb-b5efc9759ed4/network-metrics-daemon/0.log"
Apr 16 20:45:17.873723 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:17.873693 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-5bd8l_8e680955-60f7-4aaf-9aeb-b5efc9759ed4/kube-rbac-proxy/0.log"
Apr 16 20:45:19.321477 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:19.321450 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9gf47_10232a70-9b5f-414a-8efd-b5cff05a4f12/ovn-controller/0.log"
Apr 16 20:45:19.338432 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:19.338405 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9gf47_10232a70-9b5f-414a-8efd-b5cff05a4f12/ovn-acl-logging/0.log"
Apr 16 20:45:19.357751 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:19.357722 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9gf47_10232a70-9b5f-414a-8efd-b5cff05a4f12/ovn-acl-logging/1.log"
Apr 16 20:45:19.383426 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:19.383398 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9gf47_10232a70-9b5f-414a-8efd-b5cff05a4f12/kube-rbac-proxy-node/0.log"
Apr 16 20:45:19.405910 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:19.405867 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9gf47_10232a70-9b5f-414a-8efd-b5cff05a4f12/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 20:45:19.423499 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:19.423467 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9gf47_10232a70-9b5f-414a-8efd-b5cff05a4f12/northd/0.log"
Apr 16 20:45:19.445203 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:19.445181 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9gf47_10232a70-9b5f-414a-8efd-b5cff05a4f12/nbdb/0.log"
Apr 16 20:45:19.467270 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:19.467244 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9gf47_10232a70-9b5f-414a-8efd-b5cff05a4f12/sbdb/0.log"
Apr 16 20:45:19.659566 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:19.659486 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9gf47_10232a70-9b5f-414a-8efd-b5cff05a4f12/ovnkube-controller/0.log"
Apr 16 20:45:20.707553 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:20.707519 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-vv4gp_9c2c7589-5734-4bc8-8907-ab28995f2fbc/check-endpoints/0.log"
Apr 16 20:45:20.752174 ip-10-0-138-62 kubenswrapper[2572]: I0416 20:45:20.752144 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-86kzx_1d09f25d-7edb-4aa8-a44b-bfe6b932ecf9/network-check-target-container/0.log"