Apr 16 18:28:41.245332 ip-10-0-140-154 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 18:28:41.245344 ip-10-0-140-154 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 18:28:41.245354 ip-10-0-140-154 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 18:28:41.245699 ip-10-0-140-154 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 18:28:51.309908 ip-10-0-140-154 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 18:28:51.309926 ip-10-0-140-154 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 67ce56cd1ea74d82a1516b6c6260a1b8 --
Apr 16 18:31:13.744107 ip-10-0-140-154 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 18:31:14.237958 ip-10-0-140-154 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:31:14.237958 ip-10-0-140-154 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 18:31:14.237958 ip-10-0-140-154 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:31:14.237958 ip-10-0-140-154 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 18:31:14.237958 ip-10-0-140-154 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:31:14.241548 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.241358 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 18:31:14.247779 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247753 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:31:14.247779 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247775 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:31:14.247779 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247779 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:31:14.247779 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247784 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:31:14.247922 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247787 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:31:14.247922 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247791 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:31:14.247922 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247794 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:31:14.247922 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247797 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:31:14.247922 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247799 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:31:14.247922 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247802 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:31:14.247922 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247805 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:31:14.247922 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247821 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:31:14.247922 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247824 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:31:14.247922 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247829 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:31:14.247922 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247833 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:31:14.247922 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247835 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:31:14.247922 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247839 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:31:14.247922 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247841 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:31:14.247922 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247844 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:31:14.247922 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247846 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:31:14.247922 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247849 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:31:14.247922 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247852 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:31:14.247922 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247855 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:31:14.247922 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247857 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:31:14.248399 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247860 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:31:14.248399 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247863 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:31:14.248399 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247866 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:31:14.248399 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247868 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:31:14.248399 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247871 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:31:14.248399 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247875 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:31:14.248399 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247878 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:31:14.248399 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247880 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:31:14.248399 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247883 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:31:14.248399 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247886 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:31:14.248399 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247888 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:31:14.248399 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247892 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:31:14.248399 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247895 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:31:14.248399 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247898 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:31:14.248399 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247901 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:31:14.248399 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247904 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:31:14.248399 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247907 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:31:14.248399 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247909 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:31:14.248399 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247912 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:31:14.248399 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247914 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:31:14.249199 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247923 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:31:14.249199 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247926 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:31:14.249199 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247929 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:31:14.249199 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247931 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:31:14.249199 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247934 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:31:14.249199 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247937 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:31:14.249199 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247940 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:31:14.249199 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247942 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:31:14.249199 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247945 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:31:14.249199 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247948 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:31:14.249199 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247950 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:31:14.249199 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247953 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:31:14.249199 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247955 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:31:14.249199 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247958 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:31:14.249199 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247960 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:31:14.249199 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247963 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:31:14.249199 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247966 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:31:14.249199 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247969 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:31:14.249199 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247973 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:31:14.249199 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247976 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:31:14.249842 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247978 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:31:14.249842 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247981 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:31:14.249842 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247984 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:31:14.249842 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247987 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:31:14.249842 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247989 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:31:14.249842 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247992 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:31:14.249842 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247995 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:31:14.249842 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.247997 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:31:14.249842 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.248000 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:31:14.249842 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.248004 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:31:14.249842 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.248008 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:31:14.249842 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.248010 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:31:14.249842 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.248013 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:31:14.249842 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.248023 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:31:14.249842 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.248026 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:31:14.249842 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.248029 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:31:14.249842 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.248032 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:31:14.249842 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.248034 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:31:14.249842 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.248037 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:31:14.250303 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.248039 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:31:14.250303 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.248042 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:31:14.250303 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.248045 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:31:14.250966 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.250941 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:31:14.250966 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.250964 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:31:14.250966 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.250968 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:31:14.251054 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.250972 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:31:14.251054 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.250975 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:31:14.251054 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.250978 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:31:14.251054 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.250981 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:31:14.251054 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.250983 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:31:14.251054 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.250986 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:31:14.251054 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.250989 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:31:14.251054 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.250992 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:31:14.251054 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.250994 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:31:14.251054 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.250997 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:31:14.251054 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.250999 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:31:14.251054 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251002 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:31:14.251054 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251004 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:31:14.251054 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251007 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:31:14.251054 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251011 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:31:14.251054 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251015 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:31:14.251054 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251018 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:31:14.251054 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251020 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:31:14.251054 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251023 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:31:14.251054 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251025 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:31:14.251533 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251028 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:31:14.251533 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251031 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:31:14.251533 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251034 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:31:14.251533 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251036 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:31:14.251533 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251039 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:31:14.251533 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251041 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:31:14.251533 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251044 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:31:14.251533 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251046 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:31:14.251533 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251049 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:31:14.251533 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251052 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:31:14.251533 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251054 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:31:14.251533 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251057 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:31:14.251533 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251059 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:31:14.251533 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251063 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:31:14.251533 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251065 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:31:14.251533 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251068 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:31:14.251533 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251071 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:31:14.251533 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251074 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:31:14.251533 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251076 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:31:14.251533 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251078 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:31:14.252053 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251081 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:31:14.252053 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251086 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:31:14.252053 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251091 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:31:14.252053 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251095 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:31:14.252053 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251097 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:31:14.252053 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251100 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:31:14.252053 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251103 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:31:14.252053 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251106 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:31:14.252053 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251109 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:31:14.252053 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251112 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:31:14.252053 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251115 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:31:14.252053 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251118 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:31:14.252053 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251121 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:31:14.252053 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251124 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:31:14.252053 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251127 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:31:14.252053 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251130 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:31:14.252053 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251133 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:31:14.252053 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251135 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:31:14.252053 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251138 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:31:14.252521 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251141 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:31:14.252521 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251143 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:31:14.252521 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251146 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:31:14.252521 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251148 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:31:14.252521 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251151 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:31:14.252521 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251154 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:31:14.252521 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251156 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:31:14.252521 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251159 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:31:14.252521 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251161 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:31:14.252521 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251164 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:31:14.252521 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251166 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:31:14.252521 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251170 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:31:14.252521 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251173 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:31:14.252521 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251175 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:31:14.252521 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251177 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:31:14.252521 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251180 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:31:14.252521 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251184 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:31:14.252521 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251186 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:31:14.252521 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251189 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:31:14.252521 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251192 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:31:14.253024 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251194 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:31:14.253024 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251197 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:31:14.253024 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251199 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:31:14.253024 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251202 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:31:14.253024 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251278 2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 18:31:14.253024 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251290 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 18:31:14.253024 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251301 2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 18:31:14.253024 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251308 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 18:31:14.253024 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251315 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 18:31:14.253024 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251318 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 18:31:14.253024 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251323 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 18:31:14.253024 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251328 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 18:31:14.253024 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251332 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 18:31:14.253024 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251335 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 18:31:14.253024 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251339 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 18:31:14.253024 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251342 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 18:31:14.253024 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251345 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 18:31:14.253024 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251348 2576 flags.go:64] FLAG: --cgroup-root=""
Apr 16 18:31:14.253024 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251351 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 18:31:14.253024 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251354 2576 flags.go:64] FLAG: --client-ca-file=""
Apr 16 18:31:14.253024 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251357 2576 flags.go:64] FLAG: --cloud-config=""
Apr 16 18:31:14.253024 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251360 2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 18:31:14.253024 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251363 2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 18:31:14.253024 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251367 2576 flags.go:64] FLAG: --cluster-domain=""
Apr 16 18:31:14.253781 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251370 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 18:31:14.253781 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251373 2576 flags.go:64] FLAG: --config-dir=""
Apr 16 18:31:14.253781 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251376 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 18:31:14.253781 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251380 2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 18:31:14.253781 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251383 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 18:31:14.253781 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251387 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 18:31:14.253781 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251390 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 18:31:14.253781 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251394 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 18:31:14.253781 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251397 2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 18:31:14.253781 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251400 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 18:31:14.253781 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251403 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 18:31:14.253781 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251406 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 18:31:14.253781 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251409 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 18:31:14.253781 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251413 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 18:31:14.253781 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251418 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 18:31:14.253781 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251421 2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 18:31:14.253781 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251424 2576 flags.go:64] FLAG:
--enable-load-reader="false" Apr 16 18:31:14.253781 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251427 2576 flags.go:64] FLAG: --enable-server="true" Apr 16 18:31:14.253781 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251430 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 16 18:31:14.253781 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251435 2576 flags.go:64] FLAG: --event-burst="100" Apr 16 18:31:14.253781 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251438 2576 flags.go:64] FLAG: --event-qps="50" Apr 16 18:31:14.253781 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251441 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 16 18:31:14.253781 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251445 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 16 18:31:14.253781 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251448 2576 flags.go:64] FLAG: --eviction-hard="" Apr 16 18:31:14.253781 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251452 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 16 18:31:14.254392 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251455 2576 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 16 18:31:14.254392 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251458 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 16 18:31:14.254392 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251461 2576 flags.go:64] FLAG: --eviction-soft="" Apr 16 18:31:14.254392 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251464 2576 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 16 18:31:14.254392 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251467 2576 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 16 18:31:14.254392 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251470 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 18:31:14.254392 ip-10-0-140-154 
kubenswrapper[2576]: I0416 18:31:14.251473 2576 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 18:31:14.254392 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251476 2576 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 18:31:14.254392 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251478 2576 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 18:31:14.254392 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251481 2576 flags.go:64] FLAG: --feature-gates="" Apr 16 18:31:14.254392 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251485 2576 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 18:31:14.254392 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251488 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 18:31:14.254392 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251491 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 18:31:14.254392 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251496 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 18:31:14.254392 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251499 2576 flags.go:64] FLAG: --healthz-port="10248" Apr 16 18:31:14.254392 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251502 2576 flags.go:64] FLAG: --help="false" Apr 16 18:31:14.254392 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251505 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-140-154.ec2.internal" Apr 16 18:31:14.254392 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251508 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 18:31:14.254392 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251511 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 18:31:14.254392 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251514 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 18:31:14.254392 ip-10-0-140-154 kubenswrapper[2576]: I0416 
18:31:14.251518 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 18:31:14.254392 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251522 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 18:31:14.254392 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251525 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 18:31:14.254392 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251528 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 18:31:14.254982 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251530 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 18:31:14.254982 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251533 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 18:31:14.254982 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251536 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 18:31:14.254982 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251540 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 18:31:14.254982 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251543 2576 flags.go:64] FLAG: --kube-reserved="" Apr 16 18:31:14.254982 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251546 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 18:31:14.254982 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251549 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 18:31:14.254982 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251552 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 18:31:14.254982 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251555 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 18:31:14.254982 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251557 2576 flags.go:64] FLAG: --lock-file="" Apr 16 18:31:14.254982 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251560 
2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 18:31:14.254982 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251563 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 18:31:14.254982 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251566 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 18:31:14.254982 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251572 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 18:31:14.254982 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251575 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 18:31:14.254982 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251578 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 18:31:14.254982 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251580 2576 flags.go:64] FLAG: --logging-format="text" Apr 16 18:31:14.254982 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251583 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 18:31:14.254982 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251587 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 18:31:14.254982 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251590 2576 flags.go:64] FLAG: --manifest-url="" Apr 16 18:31:14.254982 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251592 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 16 18:31:14.254982 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251598 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 18:31:14.254982 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251601 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 18:31:14.254982 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251605 2576 flags.go:64] FLAG: --max-pods="110" Apr 16 18:31:14.254982 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251608 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 18:31:14.255598 
ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251612 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 18:31:14.255598 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251615 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 18:31:14.255598 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251618 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 18:31:14.255598 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251621 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 18:31:14.255598 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251624 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 18:31:14.255598 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251627 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 18:31:14.255598 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251636 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 18:31:14.255598 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251639 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 18:31:14.255598 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251642 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 18:31:14.255598 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251646 2576 flags.go:64] FLAG: --pod-cidr="" Apr 16 18:31:14.255598 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251648 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 18:31:14.255598 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251654 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 18:31:14.255598 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251657 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 18:31:14.255598 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251660 2576 flags.go:64] 
FLAG: --pods-per-core="0" Apr 16 18:31:14.255598 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251663 2576 flags.go:64] FLAG: --port="10250" Apr 16 18:31:14.255598 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251666 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 18:31:14.255598 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251669 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-021c2333ad2aa8b6e" Apr 16 18:31:14.255598 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251672 2576 flags.go:64] FLAG: --qos-reserved="" Apr 16 18:31:14.255598 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251675 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 16 18:31:14.255598 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251678 2576 flags.go:64] FLAG: --register-node="true" Apr 16 18:31:14.255598 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251681 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 16 18:31:14.255598 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251684 2576 flags.go:64] FLAG: --register-with-taints="" Apr 16 18:31:14.255598 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251689 2576 flags.go:64] FLAG: --registry-burst="10" Apr 16 18:31:14.255598 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251692 2576 flags.go:64] FLAG: --registry-qps="5" Apr 16 18:31:14.256184 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251695 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 16 18:31:14.256184 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251698 2576 flags.go:64] FLAG: --reserved-memory="" Apr 16 18:31:14.256184 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251702 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 18:31:14.256184 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251705 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 18:31:14.256184 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251708 2576 flags.go:64] FLAG: 
--rotate-certificates="false" Apr 16 18:31:14.256184 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251711 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 18:31:14.256184 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251715 2576 flags.go:64] FLAG: --runonce="false" Apr 16 18:31:14.256184 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251718 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 18:31:14.256184 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251721 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 18:31:14.256184 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251724 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 16 18:31:14.256184 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251727 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 18:31:14.256184 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251730 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 18:31:14.256184 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251736 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 18:31:14.256184 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251753 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 18:31:14.256184 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251757 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 18:31:14.256184 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251760 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 18:31:14.256184 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251763 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 18:31:14.256184 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251766 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 18:31:14.256184 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251770 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 
18:31:14.256184 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251773 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 18:31:14.256184 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251776 2576 flags.go:64] FLAG: --system-cgroups="" Apr 16 18:31:14.256184 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251779 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 18:31:14.256184 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251785 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 18:31:14.256184 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251788 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 16 18:31:14.256184 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251792 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 18:31:14.256833 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251797 2576 flags.go:64] FLAG: --tls-min-version="" Apr 16 18:31:14.256833 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251800 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 18:31:14.256833 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251803 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 18:31:14.256833 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251807 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 18:31:14.256833 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251810 2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 18:31:14.256833 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251813 2576 flags.go:64] FLAG: --v="2" Apr 16 18:31:14.256833 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251817 2576 flags.go:64] FLAG: --version="false" Apr 16 18:31:14.256833 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251822 2576 flags.go:64] FLAG: --vmodule="" Apr 16 18:31:14.256833 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251826 2576 flags.go:64] FLAG: 
--volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 18:31:14.256833 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.251830 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 18:31:14.256833 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251934 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:31:14.256833 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251938 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:31:14.256833 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251941 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:31:14.256833 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251944 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:31:14.256833 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251948 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:31:14.256833 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251951 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:31:14.256833 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251954 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:31:14.256833 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251957 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:31:14.256833 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251960 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:31:14.256833 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251965 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:31:14.256833 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251968 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:31:14.256833 ip-10-0-140-154 kubenswrapper[2576]: W0416 
18:31:14.251970 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:31:14.257363 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251973 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:31:14.257363 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251976 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:31:14.257363 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251979 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:31:14.257363 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251982 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:31:14.257363 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251984 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:31:14.257363 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251987 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:31:14.257363 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251990 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 18:31:14.257363 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251992 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:31:14.257363 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251995 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:31:14.257363 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.251998 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:31:14.257363 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252001 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:31:14.257363 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252004 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 
18:31:14.257363 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252007 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:31:14.257363 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252009 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:31:14.257363 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252012 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:31:14.257363 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252015 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:31:14.257363 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252017 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:31:14.257363 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252020 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:31:14.257363 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252023 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:31:14.257363 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252025 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:31:14.257924 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252028 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 18:31:14.257924 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252030 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:31:14.257924 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252033 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:31:14.257924 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252035 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:31:14.257924 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252038 2576 feature_gate.go:328] 
unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:31:14.257924 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252041 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:31:14.257924 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252044 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:31:14.257924 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252046 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:31:14.257924 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252049 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:31:14.257924 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252053 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:31:14.257924 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252056 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:31:14.257924 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252059 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:31:14.257924 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252063 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:31:14.257924 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252066 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:31:14.257924 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252069 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:31:14.257924 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252071 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:31:14.257924 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252074 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:31:14.257924 ip-10-0-140-154 
kubenswrapper[2576]: W0416 18:31:14.252077 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:31:14.257924 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252079 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:31:14.257924 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252082 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:31:14.258421 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252085 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:31:14.258421 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252089 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 18:31:14.258421 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252093 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:31:14.258421 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252097 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 18:31:14.258421 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252101 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:31:14.258421 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252104 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:31:14.258421 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252106 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:31:14.258421 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252109 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:31:14.258421 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252112 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:31:14.258421 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252114 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:31:14.258421 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252117 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:31:14.258421 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252120 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:31:14.258421 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252123 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:31:14.258421 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252126 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:31:14.258421 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252128 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:31:14.258421 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252131 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:31:14.258421 ip-10-0-140-154 kubenswrapper[2576]: W0416 
18:31:14.252133 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:31:14.258421 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252136 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:31:14.258421 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252146 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:31:14.258909 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252149 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:31:14.258909 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252151 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:31:14.258909 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252155 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:31:14.258909 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252157 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:31:14.258909 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252160 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:31:14.258909 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252164 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:31:14.258909 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252167 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:31:14.258909 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252170 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:31:14.258909 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252172 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:31:14.258909 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252175 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:31:14.258909 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252177 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:31:14.258909 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252180 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:31:14.258909 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252183 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:31:14.258909 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252185 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:31:14.258909 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.252188 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:31:14.258909 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.252826 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:31:14.259915 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.259892 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 18:31:14.259957 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.259916 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 18:31:14.259988 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.259974 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:31:14.259988 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.259980 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:31:14.259988 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.259983 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:31:14.259988 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.259986 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:31:14.259988 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.259989 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:31:14.260119 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.259992 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:31:14.260119 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.259996 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:31:14.260119 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.259998 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:31:14.260119 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260001 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:31:14.260119 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260004 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:31:14.260119 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260007 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:31:14.260119 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260010 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:31:14.260119 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260013 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:31:14.260119 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260017 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:31:14.260119 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260019 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:31:14.260119 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260022 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:31:14.260119 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260025 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:31:14.260119 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260028 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:31:14.260119 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260030 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:31:14.260119 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260033 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:31:14.260119 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260036 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:31:14.260119 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260038 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:31:14.260119 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260041 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:31:14.260119 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260044 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:31:14.260119 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260046 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:31:14.260661 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260049 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:31:14.260661 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260051 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:31:14.260661 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260054 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:31:14.260661 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260057 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:31:14.260661 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260059 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:31:14.260661 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260063 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:31:14.260661 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260065 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:31:14.260661 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260068 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:31:14.260661 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260070 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:31:14.260661 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260074 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:31:14.260661 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260076 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:31:14.260661 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260079 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:31:14.260661 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260082 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:31:14.260661 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260086 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:31:14.260661 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260089 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:31:14.260661 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260091 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:31:14.260661 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260094 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:31:14.260661 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260097 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:31:14.260661 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260100 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:31:14.260661 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260102 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:31:14.261161 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260105 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:31:14.261161 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260109 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:31:14.261161 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260113 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:31:14.261161 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260117 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:31:14.261161 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260120 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:31:14.261161 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260125 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:31:14.261161 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260128 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:31:14.261161 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260132 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:31:14.261161 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260135 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:31:14.261161 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260138 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:31:14.261161 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260141 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:31:14.261161 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260144 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:31:14.261161 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260146 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:31:14.261161 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260149 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:31:14.261161 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260152 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:31:14.261161 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260155 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:31:14.261161 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260157 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:31:14.261161 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260160 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:31:14.261161 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260163 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:31:14.261161 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260166 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:31:14.261649 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260169 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:31:14.261649 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260171 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:31:14.261649 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260174 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:31:14.261649 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260177 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:31:14.261649 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260179 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:31:14.261649 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260182 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:31:14.261649 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260185 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:31:14.261649 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260188 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:31:14.261649 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260190 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:31:14.261649 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260193 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:31:14.261649 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260195 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:31:14.261649 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260198 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:31:14.261649 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260200 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:31:14.261649 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260203 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:31:14.261649 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260207 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:31:14.261649 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260209 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:31:14.261649 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260212 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:31:14.261649 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260215 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:31:14.261649 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260217 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:31:14.262126 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260220 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:31:14.262126 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260222 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:31:14.262126 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.260228 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:31:14.262126 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260353 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:31:14.262126 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260357 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:31:14.262126 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260361 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:31:14.262126 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260364 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:31:14.262126 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260367 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:31:14.262126 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260370 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:31:14.262126 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260373 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:31:14.262126 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260377 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:31:14.262126 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260381 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:31:14.262126 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260384 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:31:14.262126 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260387 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:31:14.262126 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260390 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:31:14.262496 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260392 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:31:14.262496 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260395 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:31:14.262496 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260398 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:31:14.262496 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260401 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:31:14.262496 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260404 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:31:14.262496 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260406 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:31:14.262496 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260410 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:31:14.262496 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260414 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:31:14.262496 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260417 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:31:14.262496 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260419 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:31:14.262496 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260422 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:31:14.262496 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260425 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:31:14.262496 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260427 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:31:14.262496 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260431 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:31:14.262496 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260433 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:31:14.262496 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260436 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:31:14.262496 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260439 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:31:14.262496 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260453 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:31:14.262496 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260456 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:31:14.262985 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260459 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:31:14.262985 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260461 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:31:14.262985 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260464 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:31:14.262985 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260467 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:31:14.262985 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260470 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:31:14.262985 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260472 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:31:14.262985 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260475 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:31:14.262985 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260477 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:31:14.262985 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260480 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:31:14.262985 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260483 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:31:14.262985 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260485 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:31:14.262985 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260488 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:31:14.262985 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260490 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:31:14.262985 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260493 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:31:14.262985 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260495 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:31:14.262985 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260499 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:31:14.262985 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260502 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:31:14.262985 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260504 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:31:14.262985 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260507 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:31:14.262985 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260510 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:31:14.263502 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260512 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:31:14.263502 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260515 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:31:14.263502 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260518 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:31:14.263502 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260520 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:31:14.263502 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260523 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:31:14.263502 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260526 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:31:14.263502 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260528 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:31:14.263502 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260532 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:31:14.263502 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260534 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:31:14.263502 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260537 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:31:14.263502 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260540 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:31:14.263502 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260542 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:31:14.263502 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260545 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:31:14.263502 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260547 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:31:14.263502 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260550 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:31:14.263502 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260552 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:31:14.263502 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260555 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:31:14.263502 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260558 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:31:14.263502 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260561 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:31:14.263502 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260563 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:31:14.263502 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260565 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:31:14.264039 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260568 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:31:14.264039 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260570 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:31:14.264039 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260573 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:31:14.264039 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260576 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:31:14.264039 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260578 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:31:14.264039 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260580 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:31:14.264039 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260584 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:31:14.264039 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260587 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:31:14.264039 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260590 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:31:14.264039 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260592 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:31:14.264039 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260595 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:31:14.264039 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260597 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:31:14.264039 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260600 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:31:14.264039 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:14.260603 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:31:14.264039 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.260607 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:31:14.264407 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.261498 2576 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 18:31:14.266835 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.266816 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 18:31:14.267888 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.267871 2576 server.go:1019] "Starting client certificate rotation"
Apr 16 18:31:14.268010 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.267993 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:31:14.268830 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.268816 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:31:14.296273 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.296247 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:31:14.300625 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.300598 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:31:14.315600 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.315574 2576 log.go:25] "Validated CRI v1 runtime API"
Apr 16 18:31:14.321749 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.321717 2576 log.go:25] "Validated CRI v1 image API"
Apr 16 18:31:14.323166 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.323146 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 18:31:14.327576 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.327514 2576 fs.go:135] Filesystem UUIDs: map[61300ce0-f3bf-4199-8099-c6bf32487820:/dev/nvme0n1p3 76b9886d-6b21-4d93-ba03-337d333491e1:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 16 18:31:14.327707 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.327580 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 18:31:14.327707 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.327682 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:31:14.334205 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.334035 2576 manager.go:217] Machine: {Timestamp:2026-04-16 18:31:14.332237184 +0000 UTC m=+0.456764837 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098189 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2fd33980971922e6ebe415bb31bb82 SystemUUID:ec2fd339-8097-1922-e6eb-e415bb31bb82 BootID:67ce56cd-1ea7-4d82-a151-6b6c6260a1b8 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:d3:02:ee:6e:e9 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:d3:02:ee:6e:e9 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:52:8c:a2:60:52:5e Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 18:31:14.334205 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.334184 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 18:31:14.334368 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.334284 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 18:31:14.336244 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.336214 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 18:31:14.336422 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.336244 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-140-154.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 18:31:14.336485 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.336434 2576 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 18:31:14.336485 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.336450 2576 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 18:31:14.336485 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.336465
2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 18:31:14.338725 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.338709 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 18:31:14.339593 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.339581 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 16 18:31:14.339731 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.339721 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 18:31:14.342663 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.342647 2576 kubelet.go:491] "Attempting to sync node with API server" Apr 16 18:31:14.342715 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.342666 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 18:31:14.342715 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.342683 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 18:31:14.342715 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.342695 2576 kubelet.go:397] "Adding apiserver pod source" Apr 16 18:31:14.342715 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.342704 2576 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 18:31:14.343810 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.343797 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 18:31:14.343863 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.343816 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 18:31:14.347216 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.347197 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 18:31:14.349090 ip-10-0-140-154 
kubenswrapper[2576]: I0416 18:31:14.349073 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 18:31:14.350724 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.350705 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 18:31:14.350724 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.350722 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 18:31:14.350724 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.350728 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 18:31:14.350884 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.350734 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 18:31:14.350884 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.350752 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 18:31:14.350884 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.350758 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 18:31:14.350884 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.350764 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 18:31:14.350884 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.350772 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 18:31:14.350884 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.350778 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 18:31:14.350884 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.350791 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 18:31:14.350884 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.350805 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 
18:31:14.350884 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.350814 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 18:31:14.351664 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.351645 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-mfx58" Apr 16 18:31:14.351726 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.351707 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 18:31:14.351726 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.351720 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 18:31:14.355345 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.355330 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 18:31:14.355436 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.355374 2576 server.go:1295] "Started kubelet" Apr 16 18:31:14.355507 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.355469 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 18:31:14.355579 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.355527 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 18:31:14.355645 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.355607 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 18:31:14.356344 ip-10-0-140-154 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 18:31:14.356963 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.356928 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 18:31:14.358185 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.358162 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-154.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 18:31:14.358185 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:14.358167 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-140-154.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 18:31:14.358301 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:14.358210 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 18:31:14.358350 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.358338 2576 server.go:317] "Adding debug handlers to kubelet server" Apr 16 18:31:14.362470 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:14.361542 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-154.ec2.internal.18a6e9e55ca51865 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-154.ec2.internal,UID:ip-10-0-140-154.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-140-154.ec2.internal,},FirstTimestamp:2026-04-16 18:31:14.355345509 +0000 UTC m=+0.479873168,LastTimestamp:2026-04-16 18:31:14.355345509 +0000 UTC m=+0.479873168,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-154.ec2.internal,}" Apr 16 18:31:14.362674 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.362660 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 18:31:14.363576 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.363553 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 18:31:14.364259 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.364237 2576 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 18:31:14.364259 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.364261 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 18:31:14.364422 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.364411 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 18:31:14.366081 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.364492 2576 reconstruct.go:97] "Volume reconstruction finished" Apr 16 18:31:14.366081 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.364504 2576 reconciler.go:26] "Reconciler: start to sync state" Apr 16 18:31:14.366081 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.364701 2576 factory.go:55] Registering systemd factory Apr 16 18:31:14.366081 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.364716 2576 factory.go:223] Registration of the systemd container factory successfully Apr 16 18:31:14.366313 ip-10-0-140-154 
kubenswrapper[2576]: E0416 18:31:14.366161 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:14.366468 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:14.366438 2576 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-140-154.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 18:31:14.367932 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.367912 2576 factory.go:153] Registering CRI-O factory Apr 16 18:31:14.367932 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.367932 2576 factory.go:223] Registration of the crio container factory successfully Apr 16 18:31:14.368083 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.368003 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 18:31:14.368083 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.368025 2576 factory.go:103] Registering Raw factory Apr 16 18:31:14.368083 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.368040 2576 manager.go:1196] Started watching for new ooms in manager Apr 16 18:31:14.368631 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.368618 2576 manager.go:319] Starting recovery of all containers Apr 16 18:31:14.370461 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:14.370273 2576 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 18:31:14.376268 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:14.376234 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 18:31:14.379937 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.379917 2576 manager.go:324] Recovery completed Apr 16 18:31:14.384671 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.384654 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:31:14.387348 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.387331 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:31:14.387421 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.387362 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:14.387421 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.387373 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:14.387958 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.387940 2576 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 18:31:14.387958 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.387956 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 18:31:14.388085 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.387974 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 16 18:31:14.390159 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.390145 
2576 policy_none.go:49] "None policy: Start" Apr 16 18:31:14.390203 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.390165 2576 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 18:31:14.390203 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.390176 2576 state_mem.go:35] "Initializing new in-memory state store" Apr 16 18:31:14.399092 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:14.399009 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-154.ec2.internal.18a6e9e55e8d6a70 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-154.ec2.internal,UID:ip-10-0-140-154.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-140-154.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-140-154.ec2.internal,},FirstTimestamp:2026-04-16 18:31:14.38734808 +0000 UTC m=+0.511875734,LastTimestamp:2026-04-16 18:31:14.38734808 +0000 UTC m=+0.511875734,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-154.ec2.internal,}" Apr 16 18:31:14.411417 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:14.411208 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-154.ec2.internal.18a6e9e55e8db35c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-154.ec2.internal,UID:ip-10-0-140-154.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-140-154.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-140-154.ec2.internal,},FirstTimestamp:2026-04-16 18:31:14.387366748 +0000 UTC m=+0.511894401,LastTimestamp:2026-04-16 18:31:14.387366748 +0000 UTC m=+0.511894401,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-154.ec2.internal,}" Apr 16 18:31:14.422994 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:14.422908 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-154.ec2.internal.18a6e9e55e8ddd34 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-154.ec2.internal,UID:ip-10-0-140-154.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-140-154.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-140-154.ec2.internal,},FirstTimestamp:2026-04-16 18:31:14.38737746 +0000 UTC m=+0.511905113,LastTimestamp:2026-04-16 18:31:14.38737746 +0000 UTC m=+0.511905113,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-154.ec2.internal,}" Apr 16 18:31:14.437206 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.430563 2576 manager.go:341] "Starting Device Plugin manager" Apr 16 18:31:14.437206 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:14.430595 2576 manager.go:517] "Failed to read data from checkpoint" 
err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 18:31:14.437206 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.430606 2576 server.go:85] "Starting device plugin registration server" Apr 16 18:31:14.437206 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.430904 2576 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 18:31:14.437206 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.430916 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 18:31:14.437206 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.430998 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 18:31:14.437206 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.431073 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 18:31:14.437206 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.431079 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 18:31:14.437206 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:14.431718 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 16 18:31:14.437206 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:14.431815 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:14.453733 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:14.453633 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-154.ec2.internal.18a6e9e561d36fba default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-154.ec2.internal,UID:ip-10-0-140-154.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:ip-10-0-140-154.ec2.internal,},FirstTimestamp:2026-04-16 18:31:14.442268602 +0000 UTC m=+0.566796242,LastTimestamp:2026-04-16 18:31:14.442268602 +0000 UTC m=+0.566796242,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-154.ec2.internal,}" Apr 16 18:31:14.499496 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.499402 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 18:31:14.500712 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.500692 2576 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 16 18:31:14.500820 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.500725 2576 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 18:31:14.500820 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.500769 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 18:31:14.500820 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.500792 2576 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 18:31:14.500974 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:14.500839 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 18:31:14.509620 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:14.509588 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Apr 16 18:31:14.531900 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.531855 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:31:14.532995 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.532974 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:31:14.533105 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.533011 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:14.533105 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.533024 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:14.533105 
ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.533056 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-140-154.ec2.internal" Apr 16 18:31:14.542470 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:14.542374 2576 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-140-154.ec2.internal.18a6e9e55e8d6a70\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-154.ec2.internal.18a6e9e55e8d6a70 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-154.ec2.internal,UID:ip-10-0-140-154.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-140-154.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-140-154.ec2.internal,},FirstTimestamp:2026-04-16 18:31:14.38734808 +0000 UTC m=+0.511875734,LastTimestamp:2026-04-16 18:31:14.532993606 +0000 UTC m=+0.657521260,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-154.ec2.internal,}" Apr 16 18:31:14.550575 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:14.550542 2576 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-140-154.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-140-154.ec2.internal" Apr 16 18:31:14.550711 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:14.550563 2576 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-140-154.ec2.internal.18a6e9e55e8db35c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{ip-10-0-140-154.ec2.internal.18a6e9e55e8db35c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-154.ec2.internal,UID:ip-10-0-140-154.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-140-154.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-140-154.ec2.internal,},FirstTimestamp:2026-04-16 18:31:14.387366748 +0000 UTC m=+0.511894401,LastTimestamp:2026-04-16 18:31:14.533018344 +0000 UTC m=+0.657545998,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-154.ec2.internal,}" Apr 16 18:31:14.552537 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:14.552465 2576 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-140-154.ec2.internal.18a6e9e55e8ddd34\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-154.ec2.internal.18a6e9e55e8ddd34 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-154.ec2.internal,UID:ip-10-0-140-154.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-140-154.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-140-154.ec2.internal,},FirstTimestamp:2026-04-16 18:31:14.38737746 +0000 UTC m=+0.511905113,LastTimestamp:2026-04-16 18:31:14.53302775 +0000 UTC m=+0.657555404,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-154.ec2.internal,}" Apr 16 18:31:14.568799 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:14.568766 2576 controller.go:145] 
"Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-140-154.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="400ms" Apr 16 18:31:14.601924 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.601871 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-140-154.ec2.internal"] Apr 16 18:31:14.601993 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.601981 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:31:14.603082 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.603068 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:31:14.603155 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.603095 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:14.603155 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.603106 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:14.604552 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.604539 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:31:14.604677 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.604660 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal" Apr 16 18:31:14.604715 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.604693 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:31:14.605292 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.605278 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:31:14.605367 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.605293 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:31:14.605367 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.605302 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:14.605367 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.605311 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:14.605367 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.605316 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:14.605367 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.605328 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:14.607021 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.607006 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-154.ec2.internal" Apr 16 18:31:14.607087 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.607032 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:31:14.607768 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.607733 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:31:14.607846 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.607779 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:14.607846 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.607790 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:14.612039 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:14.611963 2576 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-140-154.ec2.internal.18a6e9e55e8d6a70\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-154.ec2.internal.18a6e9e55e8d6a70 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-154.ec2.internal,UID:ip-10-0-140-154.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-140-154.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-140-154.ec2.internal,},FirstTimestamp:2026-04-16 18:31:14.38734808 +0000 UTC m=+0.511875734,LastTimestamp:2026-04-16 18:31:14.603081738 +0000 UTC m=+0.727609392,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-154.ec2.internal,}" Apr 16 18:31:14.619379 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:14.619290 2576 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-140-154.ec2.internal.18a6e9e55e8db35c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-154.ec2.internal.18a6e9e55e8db35c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-154.ec2.internal,UID:ip-10-0-140-154.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-140-154.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-140-154.ec2.internal,},FirstTimestamp:2026-04-16 18:31:14.387366748 +0000 UTC m=+0.511894401,LastTimestamp:2026-04-16 18:31:14.603099534 +0000 UTC m=+0.727627187,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-154.ec2.internal,}" Apr 16 18:31:14.624433 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:14.624410 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-154.ec2.internal\" not found" node="ip-10-0-140-154.ec2.internal" Apr 16 18:31:14.627225 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:14.627153 2576 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-140-154.ec2.internal.18a6e9e55e8ddd34\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-154.ec2.internal.18a6e9e55e8ddd34 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-154.ec2.internal,UID:ip-10-0-140-154.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-140-154.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-140-154.ec2.internal,},FirstTimestamp:2026-04-16 18:31:14.38737746 +0000 UTC m=+0.511905113,LastTimestamp:2026-04-16 18:31:14.603109741 +0000 UTC m=+0.727637395,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-154.ec2.internal,}" Apr 16 18:31:14.629489 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:14.629469 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-154.ec2.internal\" not found" node="ip-10-0-140-154.ec2.internal" Apr 16 18:31:14.636295 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:14.636213 2576 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-140-154.ec2.internal.18a6e9e55e8d6a70\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-154.ec2.internal.18a6e9e55e8d6a70 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-154.ec2.internal,UID:ip-10-0-140-154.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-140-154.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-140-154.ec2.internal,},FirstTimestamp:2026-04-16 18:31:14.38734808 +0000 UTC m=+0.511875734,LastTimestamp:2026-04-16 18:31:14.605289947 +0000 UTC m=+0.729817601,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-154.ec2.internal,}" Apr 16 18:31:14.648297 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:14.648207 2576 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-140-154.ec2.internal.18a6e9e55e8db35c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-154.ec2.internal.18a6e9e55e8db35c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-154.ec2.internal,UID:ip-10-0-140-154.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-140-154.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-140-154.ec2.internal,},FirstTimestamp:2026-04-16 18:31:14.387366748 +0000 UTC m=+0.511894401,LastTimestamp:2026-04-16 18:31:14.60530692 +0000 UTC m=+0.729834575,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-154.ec2.internal,}" Apr 16 18:31:14.657367 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:14.657282 2576 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-140-154.ec2.internal.18a6e9e55e8d6a70\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-154.ec2.internal.18a6e9e55e8d6a70 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-154.ec2.internal,UID:ip-10-0-140-154.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-140-154.ec2.internal status is now: 
NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-140-154.ec2.internal,},FirstTimestamp:2026-04-16 18:31:14.38734808 +0000 UTC m=+0.511875734,LastTimestamp:2026-04-16 18:31:14.605305296 +0000 UTC m=+0.729832954,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-154.ec2.internal,}" Apr 16 18:31:14.665900 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.665874 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/28c07a342a30c0e354482d7284dcbb2c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal\" (UID: \"28c07a342a30c0e354482d7284dcbb2c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal" Apr 16 18:31:14.666022 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.665908 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/28c07a342a30c0e354482d7284dcbb2c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal\" (UID: \"28c07a342a30c0e354482d7284dcbb2c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal" Apr 16 18:31:14.666022 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.665923 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c2e9dc4277c032ca5bf1e53fbceaa447-config\") pod \"kube-apiserver-proxy-ip-10-0-140-154.ec2.internal\" (UID: \"c2e9dc4277c032ca5bf1e53fbceaa447\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-154.ec2.internal" Apr 16 18:31:14.667088 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:14.667004 2576 event.go:359] "Server rejected event (will not retry!)" err="events 
\"ip-10-0-140-154.ec2.internal.18a6e9e55e8ddd34\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-154.ec2.internal.18a6e9e55e8ddd34 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-154.ec2.internal,UID:ip-10-0-140-154.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-140-154.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-140-154.ec2.internal,},FirstTimestamp:2026-04-16 18:31:14.38737746 +0000 UTC m=+0.511905113,LastTimestamp:2026-04-16 18:31:14.605314968 +0000 UTC m=+0.729842623,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-154.ec2.internal,}" Apr 16 18:31:14.674945 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:14.674857 2576 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-140-154.ec2.internal.18a6e9e55e8db35c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-154.ec2.internal.18a6e9e55e8db35c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-154.ec2.internal,UID:ip-10-0-140-154.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-140-154.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-140-154.ec2.internal,},FirstTimestamp:2026-04-16 18:31:14.387366748 +0000 UTC m=+0.511894401,LastTimestamp:2026-04-16 18:31:14.605321448 +0000 UTC m=+0.729849105,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-154.ec2.internal,}" Apr 16 18:31:14.686927 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:14.686837 2576 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-140-154.ec2.internal.18a6e9e55e8ddd34\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-154.ec2.internal.18a6e9e55e8ddd34 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-154.ec2.internal,UID:ip-10-0-140-154.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-140-154.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-140-154.ec2.internal,},FirstTimestamp:2026-04-16 18:31:14.38737746 +0000 UTC m=+0.511905113,LastTimestamp:2026-04-16 18:31:14.605331793 +0000 UTC m=+0.729859448,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-154.ec2.internal,}" Apr 16 18:31:14.696404 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:14.696322 2576 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-140-154.ec2.internal.18a6e9e55e8d6a70\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-154.ec2.internal.18a6e9e55e8d6a70 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-154.ec2.internal,UID:ip-10-0-140-154.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-140-154.ec2.internal status is now: 
NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-140-154.ec2.internal,},FirstTimestamp:2026-04-16 18:31:14.38734808 +0000 UTC m=+0.511875734,LastTimestamp:2026-04-16 18:31:14.60776691 +0000 UTC m=+0.732294565,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-154.ec2.internal,}" Apr 16 18:31:14.705859 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:14.705766 2576 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-140-154.ec2.internal.18a6e9e55e8db35c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-154.ec2.internal.18a6e9e55e8db35c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-154.ec2.internal,UID:ip-10-0-140-154.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-140-154.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-140-154.ec2.internal,},FirstTimestamp:2026-04-16 18:31:14.387366748 +0000 UTC m=+0.511894401,LastTimestamp:2026-04-16 18:31:14.607785106 +0000 UTC m=+0.732312759,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-154.ec2.internal,}" Apr 16 18:31:14.714372 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:14.714289 2576 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-140-154.ec2.internal.18a6e9e55e8ddd34\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-154.ec2.internal.18a6e9e55e8ddd34 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-154.ec2.internal,UID:ip-10-0-140-154.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-140-154.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-140-154.ec2.internal,},FirstTimestamp:2026-04-16 18:31:14.38737746 +0000 UTC m=+0.511905113,LastTimestamp:2026-04-16 18:31:14.607793676 +0000 UTC m=+0.732321329,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-154.ec2.internal,}" Apr 16 18:31:14.751429 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.751361 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:31:14.752372 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.752357 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:31:14.752455 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.752388 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:14.752455 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.752404 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:14.752455 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.752434 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-140-154.ec2.internal" Apr 16 18:31:14.761043 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:14.760965 2576 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-140-154.ec2.internal.18a6e9e55e8d6a70\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{ip-10-0-140-154.ec2.internal.18a6e9e55e8d6a70 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-154.ec2.internal,UID:ip-10-0-140-154.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-140-154.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-140-154.ec2.internal,},FirstTimestamp:2026-04-16 18:31:14.38734808 +0000 UTC m=+0.511875734,LastTimestamp:2026-04-16 18:31:14.752372745 +0000 UTC m=+0.876900398,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-154.ec2.internal,}" Apr 16 18:31:14.766203 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.766181 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/28c07a342a30c0e354482d7284dcbb2c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal\" (UID: \"28c07a342a30c0e354482d7284dcbb2c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal" Apr 16 18:31:14.766266 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.766226 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/28c07a342a30c0e354482d7284dcbb2c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal\" (UID: \"28c07a342a30c0e354482d7284dcbb2c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal" Apr 16 18:31:14.766266 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.766252 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c2e9dc4277c032ca5bf1e53fbceaa447-config\") pod 
\"kube-apiserver-proxy-ip-10-0-140-154.ec2.internal\" (UID: \"c2e9dc4277c032ca5bf1e53fbceaa447\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-154.ec2.internal" Apr 16 18:31:14.766339 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.766284 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/28c07a342a30c0e354482d7284dcbb2c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal\" (UID: \"28c07a342a30c0e354482d7284dcbb2c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal" Apr 16 18:31:14.766339 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.766289 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c2e9dc4277c032ca5bf1e53fbceaa447-config\") pod \"kube-apiserver-proxy-ip-10-0-140-154.ec2.internal\" (UID: \"c2e9dc4277c032ca5bf1e53fbceaa447\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-154.ec2.internal" Apr 16 18:31:14.766404 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.766348 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/28c07a342a30c0e354482d7284dcbb2c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal\" (UID: \"28c07a342a30c0e354482d7284dcbb2c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal" Apr 16 18:31:14.768518 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:14.768500 2576 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-140-154.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-140-154.ec2.internal" Apr 16 18:31:14.768518 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:14.768455 2576 event.go:359] "Server rejected event (will 
not retry!)" err="events \"ip-10-0-140-154.ec2.internal.18a6e9e55e8db35c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-154.ec2.internal.18a6e9e55e8db35c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-154.ec2.internal,UID:ip-10-0-140-154.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-140-154.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-140-154.ec2.internal,},FirstTimestamp:2026-04-16 18:31:14.387366748 +0000 UTC m=+0.511894401,LastTimestamp:2026-04-16 18:31:14.752395651 +0000 UTC m=+0.876923310,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-154.ec2.internal,}" Apr 16 18:31:14.778555 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:14.778478 2576 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-140-154.ec2.internal.18a6e9e55e8ddd34\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-154.ec2.internal.18a6e9e55e8ddd34 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-154.ec2.internal,UID:ip-10-0-140-154.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-140-154.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-140-154.ec2.internal,},FirstTimestamp:2026-04-16 18:31:14.38737746 +0000 UTC m=+0.511905113,LastTimestamp:2026-04-16 18:31:14.752408991 +0000 UTC m=+0.876936645,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-154.ec2.internal,}" Apr 16 18:31:14.928364 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.928320 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal" Apr 16 18:31:14.932338 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:14.932319 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-154.ec2.internal" Apr 16 18:31:14.978367 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:14.978337 2576 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-140-154.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="800ms" Apr 16 18:31:15.169133 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:15.169100 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:31:15.185616 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:15.185594 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:31:15.185785 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:15.185628 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:15.185785 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:15.185640 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:15.185785 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:15.185676 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-140-154.ec2.internal" Apr 16 18:31:15.196596 
ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:15.196504 2576 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-140-154.ec2.internal.18a6e9e55e8d6a70\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-154.ec2.internal.18a6e9e55e8d6a70 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-154.ec2.internal,UID:ip-10-0-140-154.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-140-154.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-140-154.ec2.internal,},FirstTimestamp:2026-04-16 18:31:14.38734808 +0000 UTC m=+0.511875734,LastTimestamp:2026-04-16 18:31:15.185614017 +0000 UTC m=+1.310141671,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-154.ec2.internal,}" Apr 16 18:31:15.205787 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:15.205734 2576 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-140-154.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-140-154.ec2.internal" Apr 16 18:31:15.205940 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:15.205828 2576 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-140-154.ec2.internal.18a6e9e55e8db35c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-154.ec2.internal.18a6e9e55e8db35c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-154.ec2.internal,UID:ip-10-0-140-154.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-140-154.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-140-154.ec2.internal,},FirstTimestamp:2026-04-16 18:31:14.387366748 +0000 UTC m=+0.511894401,LastTimestamp:2026-04-16 18:31:15.185632849 +0000 UTC m=+1.310160502,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-154.ec2.internal,}" Apr 16 18:31:15.234351 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:15.234309 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 18:31:15.260860 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:15.260825 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 18:31:15.369241 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:15.369211 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-154.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 18:31:15.453665 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:15.453609 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28c07a342a30c0e354482d7284dcbb2c.slice/crio-855e91e762ebd8d32cb8668f0e6fcb7f7a8280c3a3fe0a5a276cfbbb6861b876 WatchSource:0}: Error finding container 855e91e762ebd8d32cb8668f0e6fcb7f7a8280c3a3fe0a5a276cfbbb6861b876: Status 404 returned error can't find the container with id 855e91e762ebd8d32cb8668f0e6fcb7f7a8280c3a3fe0a5a276cfbbb6861b876 Apr 16 18:31:15.454261 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:15.454234 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2e9dc4277c032ca5bf1e53fbceaa447.slice/crio-3137911c06a5df288feab64f564df064624f43b5f730a4e4e702ac1d6ef078b6 WatchSource:0}: Error finding container 3137911c06a5df288feab64f564df064624f43b5f730a4e4e702ac1d6ef078b6: Status 404 returned error can't find the container with id 3137911c06a5df288feab64f564df064624f43b5f730a4e4e702ac1d6ef078b6 Apr 16 18:31:15.459289 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:15.459269 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:31:15.470225 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:15.470131 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{kube-apiserver-proxy-ip-10-0-140-154.ec2.internal.18a6e9e59e756909 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-proxy-ip-10-0-140-154.ec2.internal,UID:c2e9dc4277c032ca5bf1e53fbceaa447,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{haproxy},},Reason:Pulling,Message:Pulling image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f2e4763905898d3870f64ebc9721d8d43ae2973f4ba295d48f84e36e6f72d013\",Source:EventSource{Component:kubelet,Host:ip-10-0-140-154.ec2.internal,},FirstTimestamp:2026-04-16 18:31:15.459516681 +0000 UTC m=+1.584044322,LastTimestamp:2026-04-16 18:31:15.459516681 +0000 UTC m=+1.584044322,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-154.ec2.internal,}" Apr 16 18:31:15.479424 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:15.479334 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal.18a6e9e59e766da8 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal,UID:28c07a342a30c0e354482d7284dcbb2c,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3ea0ee7dfd5669815239bc61b06d7cc81d6950b90db35679d18fb4b8034564f6\",Source:EventSource{Component:kubelet,Host:ip-10-0-140-154.ec2.internal,},FirstTimestamp:2026-04-16 18:31:15.4595834 +0000 UTC m=+1.584111059,LastTimestamp:2026-04-16 18:31:15.4595834 +0000 UTC m=+1.584111059,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-154.ec2.internal,}" Apr 16 18:31:15.504519 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:15.504469 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal" 
event={"ID":"28c07a342a30c0e354482d7284dcbb2c","Type":"ContainerStarted","Data":"855e91e762ebd8d32cb8668f0e6fcb7f7a8280c3a3fe0a5a276cfbbb6861b876"} Apr 16 18:31:15.505481 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:15.505457 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-154.ec2.internal" event={"ID":"c2e9dc4277c032ca5bf1e53fbceaa447","Type":"ContainerStarted","Data":"3137911c06a5df288feab64f564df064624f43b5f730a4e4e702ac1d6ef078b6"} Apr 16 18:31:15.570847 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:15.570813 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Apr 16 18:31:15.787929 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:15.787895 2576 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-140-154.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="1.6s" Apr 16 18:31:15.803843 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:15.803810 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-140-154.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 18:31:16.006323 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:16.006239 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:31:16.013825 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:16.013786 2576 kubelet_node_status.go:736] 
"Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:31:16.013825 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:16.013824 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:16.014032 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:16.013838 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:16.014032 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:16.013873 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-140-154.ec2.internal" Apr 16 18:31:16.036894 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:16.036856 2576 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-140-154.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-140-154.ec2.internal" Apr 16 18:31:16.367725 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:16.367647 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-154.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 18:31:17.057617 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:17.057533 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{kube-apiserver-proxy-ip-10-0-140-154.ec2.internal.18a6e9e5fd2050ae kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-proxy-ip-10-0-140-154.ec2.internal,UID:c2e9dc4277c032ca5bf1e53fbceaa447,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{haproxy},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f2e4763905898d3870f64ebc9721d8d43ae2973f4ba295d48f84e36e6f72d013\" in 1.588s (1.588s including waiting). Image size: 488358964 bytes.,Source:EventSource{Component:kubelet,Host:ip-10-0-140-154.ec2.internal,},FirstTimestamp:2026-04-16 18:31:17.047775406 +0000 UTC m=+3.172303062,LastTimestamp:2026-04-16 18:31:17.047775406 +0000 UTC m=+3.172303062,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-154.ec2.internal,}" Apr 16 18:31:17.067931 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:17.067841 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal.18a6e9e5fd296961 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal,UID:28c07a342a30c0e354482d7284dcbb2c,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3ea0ee7dfd5669815239bc61b06d7cc81d6950b90db35679d18fb4b8034564f6\" in 1.588s (1.588s including waiting). 
Image size: 468452150 bytes.,Source:EventSource{Component:kubelet,Host:ip-10-0-140-154.ec2.internal,},FirstTimestamp:2026-04-16 18:31:17.048371553 +0000 UTC m=+3.172899208,LastTimestamp:2026-04-16 18:31:17.048371553 +0000 UTC m=+3.172899208,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-154.ec2.internal,}" Apr 16 18:31:17.096240 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:17.096213 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 18:31:17.133607 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:17.133504 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{kube-apiserver-proxy-ip-10-0-140-154.ec2.internal.18a6e9e6018c6a2d kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-proxy-ip-10-0-140-154.ec2.internal,UID:c2e9dc4277c032ca5bf1e53fbceaa447,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{haproxy},},Reason:Created,Message:Created container: haproxy,Source:EventSource{Component:kubelet,Host:ip-10-0-140-154.ec2.internal,},FirstTimestamp:2026-04-16 18:31:17.121968685 +0000 UTC m=+3.246496340,LastTimestamp:2026-04-16 18:31:17.121968685 +0000 UTC m=+3.246496340,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-154.ec2.internal,}" Apr 16 18:31:17.145636 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:17.145551 
2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{kube-apiserver-proxy-ip-10-0-140-154.ec2.internal.18a6e9e601eb2889 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-proxy-ip-10-0-140-154.ec2.internal,UID:c2e9dc4277c032ca5bf1e53fbceaa447,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{haproxy},},Reason:Started,Message:Started container haproxy,Source:EventSource{Component:kubelet,Host:ip-10-0-140-154.ec2.internal,},FirstTimestamp:2026-04-16 18:31:17.128177801 +0000 UTC m=+3.252705458,LastTimestamp:2026-04-16 18:31:17.128177801 +0000 UTC m=+3.252705458,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-154.ec2.internal,}" Apr 16 18:31:17.367633 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:17.367549 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-154.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 18:31:17.397059 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:17.397023 2576 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-140-154.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="3.2s" Apr 16 18:31:17.514096 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:17.514054 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-154.ec2.internal" 
event={"ID":"c2e9dc4277c032ca5bf1e53fbceaa447","Type":"ContainerStarted","Data":"14086560a4df9efac5f4804f5550acf25a603dde45115536ff78d6a4586bae2c"} Apr 16 18:31:17.514252 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:17.514125 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:31:17.515363 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:17.515343 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:31:17.515467 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:17.515371 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:17.515467 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:17.515381 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:17.515539 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:17.515525 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-154.ec2.internal\" not found" node="ip-10-0-140-154.ec2.internal" Apr 16 18:31:17.620021 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:17.619904 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal.18a6e9e61e937918 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal,UID:28c07a342a30c0e354482d7284dcbb2c,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:ip-10-0-140-154.ec2.internal,},FirstTimestamp:2026-04-16 18:31:17.60897052 +0000 UTC m=+3.733498183,LastTimestamp:2026-04-16 18:31:17.60897052 +0000 UTC m=+3.733498183,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-154.ec2.internal,}" Apr 16 18:31:17.630385 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:17.630311 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal.18a6e9e61f1c53eb openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal,UID:28c07a342a30c0e354482d7284dcbb2c,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:ip-10-0-140-154.ec2.internal,},FirstTimestamp:2026-04-16 18:31:17.617939435 +0000 UTC m=+3.742467099,LastTimestamp:2026-04-16 18:31:17.617939435 +0000 UTC m=+3.742467099,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-154.ec2.internal,}" Apr 16 18:31:17.637408 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:17.637389 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume 
controller attach/detach" Apr 16 18:31:17.638313 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:17.638296 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:31:17.638408 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:17.638325 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:17.638408 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:17.638336 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:17.638408 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:17.638364 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-140-154.ec2.internal" Apr 16 18:31:17.660313 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:17.660282 2576 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-140-154.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-140-154.ec2.internal" Apr 16 18:31:18.030815 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:18.030698 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-140-154.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 18:31:18.102701 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:18.102663 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Apr 16 18:31:18.171025 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:18.170992 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 18:31:18.367642 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:18.367565 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-154.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 18:31:18.516695 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:18.516646 2576 generic.go:358] "Generic (PLEG): container finished" podID="28c07a342a30c0e354482d7284dcbb2c" containerID="6e1bb215c73c07c2a82ab21905e69cfa20f0021cbae8bd2bd3fcc1710727a9b4" exitCode=0 Apr 16 18:31:18.517155 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:18.516753 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal" event={"ID":"28c07a342a30c0e354482d7284dcbb2c","Type":"ContainerDied","Data":"6e1bb215c73c07c2a82ab21905e69cfa20f0021cbae8bd2bd3fcc1710727a9b4"} Apr 16 18:31:18.517155 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:18.516772 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:31:18.517155 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:18.516761 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:31:18.517605 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:18.517586 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" 
event="NodeHasSufficientMemory" Apr 16 18:31:18.517696 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:18.517622 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:18.517696 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:18.517632 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:18.517696 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:18.517587 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:31:18.517696 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:18.517694 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:18.517860 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:18.517706 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:18.517860 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:18.517791 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-154.ec2.internal\" not found" node="ip-10-0-140-154.ec2.internal" Apr 16 18:31:18.517935 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:18.517912 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-154.ec2.internal\" not found" node="ip-10-0-140-154.ec2.internal" Apr 16 18:31:18.529887 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:18.529807 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" 
event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal.18a6e9e654e01ee8 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal,UID:28c07a342a30c0e354482d7284dcbb2c,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3ea0ee7dfd5669815239bc61b06d7cc81d6950b90db35679d18fb4b8034564f6\" already present on machine,Source:EventSource{Component:kubelet,Host:ip-10-0-140-154.ec2.internal,},FirstTimestamp:2026-04-16 18:31:18.519963368 +0000 UTC m=+4.644491028,LastTimestamp:2026-04-16 18:31:18.519963368 +0000 UTC m=+4.644491028,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-154.ec2.internal,}" Apr 16 18:31:18.626923 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:18.626822 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal.18a6e9e65abf6414 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal,UID:28c07a342a30c0e354482d7284dcbb2c,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:ip-10-0-140-154.ec2.internal,},FirstTimestamp:2026-04-16 18:31:18.618481684 +0000 UTC m=+4.743009343,LastTimestamp:2026-04-16 18:31:18.618481684 
+0000 UTC m=+4.743009343,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-154.ec2.internal,}" Apr 16 18:31:18.635071 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:18.634958 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal.18a6e9e65b309129 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal,UID:28c07a342a30c0e354482d7284dcbb2c,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:ip-10-0-140-154.ec2.internal,},FirstTimestamp:2026-04-16 18:31:18.625898793 +0000 UTC m=+4.750426459,LastTimestamp:2026-04-16 18:31:18.625898793 +0000 UTC m=+4.750426459,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-154.ec2.internal,}" Apr 16 18:31:19.368450 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:19.368274 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-154.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 18:31:19.519994 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:19.519967 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal_28c07a342a30c0e354482d7284dcbb2c/kube-rbac-proxy-crio/0.log" Apr 16 18:31:19.520400 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:19.520277 2576 generic.go:358] "Generic (PLEG): container finished" podID="28c07a342a30c0e354482d7284dcbb2c" containerID="ec704e35cc76e46059c2e7119be339f0ab3efecbc710c66df57e41711d8756dc" exitCode=1 Apr 16 18:31:19.520400 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:19.520309 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal" event={"ID":"28c07a342a30c0e354482d7284dcbb2c","Type":"ContainerDied","Data":"ec704e35cc76e46059c2e7119be339f0ab3efecbc710c66df57e41711d8756dc"} Apr 16 18:31:19.520400 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:19.520363 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:31:19.523336 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:19.523316 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:31:19.523464 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:19.523346 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:19.523464 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:19.523356 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:19.523546 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:19.523516 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-154.ec2.internal\" not found" node="ip-10-0-140-154.ec2.internal" Apr 16 18:31:19.523576 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:19.523562 
2576 scope.go:117] "RemoveContainer" containerID="ec704e35cc76e46059c2e7119be339f0ab3efecbc710c66df57e41711d8756dc" Apr 16 18:31:19.535085 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:19.534999 2576 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal.18a6e9e654e01ee8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal.18a6e9e654e01ee8 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal,UID:28c07a342a30c0e354482d7284dcbb2c,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3ea0ee7dfd5669815239bc61b06d7cc81d6950b90db35679d18fb4b8034564f6\" already present on machine,Source:EventSource{Component:kubelet,Host:ip-10-0-140-154.ec2.internal,},FirstTimestamp:2026-04-16 18:31:18.519963368 +0000 UTC m=+4.644491028,LastTimestamp:2026-04-16 18:31:19.525421356 +0000 UTC m=+5.649949017,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-154.ec2.internal,}" Apr 16 18:31:19.630462 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:19.630371 2576 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal.18a6e9e65abf6414\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal.18a6e9e65abf6414 openshift-machine-config-operator 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal,UID:28c07a342a30c0e354482d7284dcbb2c,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:ip-10-0-140-154.ec2.internal,},FirstTimestamp:2026-04-16 18:31:18.618481684 +0000 UTC m=+4.743009343,LastTimestamp:2026-04-16 18:31:19.621171015 +0000 UTC m=+5.745698669,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-154.ec2.internal,}" Apr 16 18:31:19.640268 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:19.640180 2576 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal.18a6e9e65b309129\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal.18a6e9e65b309129 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal,UID:28c07a342a30c0e354482d7284dcbb2c,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:ip-10-0-140-154.ec2.internal,},FirstTimestamp:2026-04-16 18:31:18.625898793 +0000 UTC m=+4.750426459,LastTimestamp:2026-04-16 18:31:19.628701534 +0000 UTC m=+5.753229189,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-154.ec2.internal,}"
Apr 16 18:31:20.367649 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:20.367609 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-154.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 18:31:20.524087 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:20.524049 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal_28c07a342a30c0e354482d7284dcbb2c/kube-rbac-proxy-crio/1.log"
Apr 16 18:31:20.525309 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:20.525276 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal_28c07a342a30c0e354482d7284dcbb2c/kube-rbac-proxy-crio/0.log"
Apr 16 18:31:20.525656 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:20.525637 2576 generic.go:358] "Generic (PLEG): container finished" podID="28c07a342a30c0e354482d7284dcbb2c" containerID="2d42a36a17593f58ffa6056be36111e52abd8d73aa7f527ca5a91e79b9b1338a" exitCode=1
Apr 16 18:31:20.525729 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:20.525672 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal" event={"ID":"28c07a342a30c0e354482d7284dcbb2c","Type":"ContainerDied","Data":"2d42a36a17593f58ffa6056be36111e52abd8d73aa7f527ca5a91e79b9b1338a"}
Apr 16 18:31:20.525729 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:20.525702 2576 scope.go:117] "RemoveContainer" containerID="ec704e35cc76e46059c2e7119be339f0ab3efecbc710c66df57e41711d8756dc"
Apr 16 18:31:20.525831 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:20.525776 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:31:20.526147 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:20.526124 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 16 18:31:20.526571 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:20.526554 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:31:20.526637 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:20.526585 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:31:20.526637 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:20.526596 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:31:20.526998 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:20.526829 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-154.ec2.internal\" not found" node="ip-10-0-140-154.ec2.internal"
Apr 16 18:31:20.526998 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:20.526874 2576 scope.go:117] "RemoveContainer" containerID="2d42a36a17593f58ffa6056be36111e52abd8d73aa7f527ca5a91e79b9b1338a"
Apr 16 18:31:20.526998 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:20.526995 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal_openshift-machine-config-operator(28c07a342a30c0e354482d7284dcbb2c)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal" podUID="28c07a342a30c0e354482d7284dcbb2c"
Apr 16 18:31:20.537266 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:20.537176 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal.18a6e9e6cc808ab7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal,UID:28c07a342a30c0e354482d7284dcbb2c,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal_openshift-machine-config-operator(28c07a342a30c0e354482d7284dcbb2c),Source:EventSource{Component:kubelet,Host:ip-10-0-140-154.ec2.internal,},FirstTimestamp:2026-04-16 18:31:20.526965431 +0000 UTC m=+6.651493087,LastTimestamp:2026-04-16 18:31:20.526965431 +0000 UTC m=+6.651493087,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-154.ec2.internal,}"
Apr 16 18:31:20.611088 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:20.611055 2576 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-140-154.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="6.4s"
Apr 16 18:31:20.861434 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:20.861347 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:31:20.862468 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:20.862451 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:31:20.862547 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:20.862484 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:31:20.862547 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:20.862494 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:31:20.862547 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:20.862522 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-140-154.ec2.internal"
Apr 16 18:31:20.882485 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:20.882449 2576 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-140-154.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-140-154.ec2.internal"
Apr 16 18:31:21.366854 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:21.366829 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-154.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 18:31:21.528876 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:21.528852 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal_28c07a342a30c0e354482d7284dcbb2c/kube-rbac-proxy-crio/1.log"
Apr 16 18:31:21.529254 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:21.529240 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:31:21.530125 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:21.530108 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:31:21.530181 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:21.530139 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:31:21.530181 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:21.530148 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:31:21.530374 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:21.530357 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-154.ec2.internal\" not found" node="ip-10-0-140-154.ec2.internal"
Apr 16 18:31:21.530433 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:21.530404 2576 scope.go:117] "RemoveContainer" containerID="2d42a36a17593f58ffa6056be36111e52abd8d73aa7f527ca5a91e79b9b1338a"
Apr 16 18:31:21.530537 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:21.530521 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal_openshift-machine-config-operator(28c07a342a30c0e354482d7284dcbb2c)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal" podUID="28c07a342a30c0e354482d7284dcbb2c"
Apr 16 18:31:21.539202 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:21.539118 2576 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal.18a6e9e6cc808ab7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal.18a6e9e6cc808ab7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal,UID:28c07a342a30c0e354482d7284dcbb2c,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal_openshift-machine-config-operator(28c07a342a30c0e354482d7284dcbb2c),Source:EventSource{Component:kubelet,Host:ip-10-0-140-154.ec2.internal,},FirstTimestamp:2026-04-16 18:31:20.526965431 +0000 UTC m=+6.651493087,LastTimestamp:2026-04-16 18:31:21.53049363 +0000 UTC m=+7.655021283,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-154.ec2.internal,}"
Apr 16 18:31:22.368410 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:22.368373 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-154.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 18:31:22.808162 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:22.808076 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-140-154.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 18:31:23.367164 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:23.367137 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-154.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 18:31:23.788459 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:23.788370 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Apr 16 18:31:23.963982 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:23.963956 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 18:31:24.368088 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:24.368060 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-154.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 18:31:24.432474 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:24.432425 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:25.369130 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:25.369096 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-154.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 18:31:26.368850 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:26.368818 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-154.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 18:31:27.021451 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:27.021414 2576 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-140-154.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Apr 16 18:31:27.283420 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:27.283335 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:31:27.285141 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:27.285123 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:31:27.285249 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:27.285158 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:31:27.285249 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:27.285172 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:31:27.285249 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:27.285209 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-140-154.ec2.internal"
Apr 16 18:31:27.306235 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:27.306198 2576 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-140-154.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-140-154.ec2.internal"
Apr 16 18:31:27.367957 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:27.367924 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-154.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 18:31:28.369800 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:28.369764 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-154.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 18:31:29.368578 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:29.368548 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-154.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 18:31:30.367127 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:30.367093 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-154.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 18:31:31.368069 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:31.368034 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-154.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 18:31:32.371840 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:32.371808 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-154.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 18:31:32.381579 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:32.381554 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-mfx58"
Apr 16 18:31:32.930158 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:32.930130 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:31:32.987460 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:32.987431 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:31:33.193167 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:33.193088 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:31:33.267208 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:33.267178 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 18:31:33.267345 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:33.267327 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:31:33.267391 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:33.267349 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:31:33.267391 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:33.267350 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:31:33.353707 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:33.353663 2576 apiserver.go:52] "Watching apiserver"
Apr 16 18:31:33.365804 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:33.365776 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 18:31:33.365952 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:33.365851 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=[]
Apr 16 18:31:33.370180 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:33.370162 2576 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-140-154.ec2.internal" not found
Apr 16 18:31:33.382545 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:33.382509 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 18:26:32 +0000 UTC" deadline="2027-11-27 09:24:50.68218983 +0000 UTC"
Apr 16 18:31:33.382545 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:33.382543 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14150h53m17.29965115s"
Apr 16 18:31:33.387335 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:33.387315 2576 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-140-154.ec2.internal" not found
Apr 16 18:31:33.447107 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:33.447035 2576 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-140-154.ec2.internal" not found
Apr 16 18:31:33.465428 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:33.465401 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 18:31:33.501708 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:33.501688 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:31:33.502708 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:33.502686 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:31:33.502825 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:33.502727 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:31:33.502825 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:33.502757 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:31:33.503044 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:33.503028 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-154.ec2.internal\" not found" node="ip-10-0-140-154.ec2.internal"
Apr 16 18:31:33.503107 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:33.503091 2576 scope.go:117] "RemoveContainer" containerID="2d42a36a17593f58ffa6056be36111e52abd8d73aa7f527ca5a91e79b9b1338a"
Apr 16 18:31:33.641956 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:33.641924 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:31:33.724550 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:33.724467 2576 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-140-154.ec2.internal" not found
Apr 16 18:31:33.724550 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:33.724495 2576 csi_plugin.go:397] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "ip-10-0-140-154.ec2.internal" not found
Apr 16 18:31:33.761164 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:33.761143 2576 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-140-154.ec2.internal" not found
Apr 16 18:31:33.780690 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:33.780665 2576 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-140-154.ec2.internal" not found
Apr 16 18:31:33.837261 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:33.837240 2576 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-140-154.ec2.internal" not found
Apr 16 18:31:34.028277 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:34.028226 2576 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-140-154.ec2.internal\" not found" node="ip-10-0-140-154.ec2.internal"
Apr 16 18:31:34.105693 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.105668 2576 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-140-154.ec2.internal" not found
Apr 16 18:31:34.105693 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:34.105693 2576 csi_plugin.go:397] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "ip-10-0-140-154.ec2.internal" not found
Apr 16 18:31:34.307183 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.307100 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:31:34.308140 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.308117 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:31:34.308267 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.308150 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:31:34.308267 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.308160 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:31:34.308267 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.308190 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-140-154.ec2.internal"
Apr 16 18:31:34.317695 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.317673 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-140-154.ec2.internal"
Apr 16 18:31:34.317695 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:34.317698 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-140-154.ec2.internal\": node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:34.348299 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:34.348269 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:34.375056 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.375033 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 18:31:34.394512 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.394483 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-q67qs"]
Apr 16 18:31:34.395191 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.395177 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:31:34.396118 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.396100 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:31:34.396196 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.396130 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:31:34.396196 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.396145 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:31:34.397323 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.397308 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-8pbpt"]
Apr 16 18:31:34.397392 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.397382 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:31:34.397461 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.397446 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q67qs"
Apr 16 18:31:34.397496 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.397482 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:31:34.398189 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.398174 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:31:34.398245 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.398206 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:31:34.398245 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.398218 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:31:34.398298 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.398174 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:31:34.398298 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.398286 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:31:34.398362 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.398298 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:31:34.398362 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.398308 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:31:34.398425 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:34.398373 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q67qs" podUID="5f3c3f50-8ee2-4775-a3de-64e723a55361"
Apr 16 18:31:34.399354 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.399336 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8pbpt"
Apr 16 18:31:34.399446 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.399367 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:31:34.400098 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.400082 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:31:34.400160 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.400108 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:31:34.400160 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.400119 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:31:34.402902 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.402885 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 18:31:34.403100 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.403086 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 18:31:34.406998 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.406980 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-dxrxn\""
Apr 16 18:31:34.432670 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:34.432635 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:34.444629 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.444593 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-cjvhl"
Apr 16 18:31:34.448957 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:34.448933 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:34.449938 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.449923 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-cl74p"]
Apr 16 18:31:34.450029 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.450020 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:31:34.450961 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.450944 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:31:34.451061 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.450973 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:31:34.451061 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.450989 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:31:34.452337 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.452323 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cl74p"
Apr 16 18:31:34.452413 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.452352 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:31:34.453195 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.453170 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:31:34.453195 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.453199 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:31:34.453312 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.453213 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:31:34.453312 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:34.453277 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cl74p" podUID="697ddfb3-adc9-4a63-b5ca-b4b871946a33"
Apr 16 18:31:34.457308 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.457290 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-cjvhl"
Apr 16 18:31:34.468964 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.468934 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-hpgn4"]
Apr 16 18:31:34.469090 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.469044 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:31:34.469922 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.469893 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:31:34.470004 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.469930 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:31:34.470004 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.469943 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:31:34.471335 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.471320 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-hpgn4"
Apr 16 18:31:34.471383 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.471349 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:31:34.472056 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.472041 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:31:34.472155 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.472066 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:31:34.472155 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.472076 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:31:34.475053 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.475032 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 18:31:34.475124 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.475032 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-bp4wk\""
Apr 16 18:31:34.475410 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.475396 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:31:34.475571 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.475556 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 18:31:34.484648 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.484629 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-bnn4v"]
Apr 16 18:31:34.484760 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.484726 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:31:34.485565 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.485552 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:31:34.485609 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.485579 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:31:34.485609 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.485591 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:31:34.487026 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.487013 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-bnn4v"
Apr 16 18:31:34.487064 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.487039 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:31:34.487800 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.487784 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:31:34.487888 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.487808 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:31:34.487888 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.487819 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:31:34.490370 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.490357 2576 reflector.go:430] "Caches populated"
type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-9rx88\"" Apr 16 18:31:34.490868 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.490841 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 18:31:34.490954 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.490874 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 18:31:34.490954 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.490875 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 18:31:34.498724 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.497309 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-wn2l4"] Apr 16 18:31:34.498724 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.497868 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:31:34.499166 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.499150 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:31:34.499212 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.499178 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:34.499212 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.499193 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:34.500526 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.500512 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wn2l4" Apr 16 18:31:34.500591 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.500542 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:31:34.501231 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.501217 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:31:34.501281 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.501247 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:34.501335 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.501295 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:34.504399 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.504380 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 18:31:34.505107 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.505094 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 18:31:34.505310 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.505296 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 18:31:34.505371 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.505314 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 18:31:34.505652 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.505636 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-mzmbh\"" 
Apr 16 18:31:34.505842 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.505827 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 18:31:34.549370 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:34.549339 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:34.550255 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.550236 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal_28c07a342a30c0e354482d7284dcbb2c/kube-rbac-proxy-crio/2.log" Apr 16 18:31:34.550644 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.550627 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal_28c07a342a30c0e354482d7284dcbb2c/kube-rbac-proxy-crio/1.log" Apr 16 18:31:34.550966 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.550946 2576 generic.go:358] "Generic (PLEG): container finished" podID="28c07a342a30c0e354482d7284dcbb2c" containerID="284fd67504784abbd9ab23c1bd3a8ecf717c8e781eb54c699fc73d346f64a03d" exitCode=1 Apr 16 18:31:34.551025 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.550981 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal" event={"ID":"28c07a342a30c0e354482d7284dcbb2c","Type":"ContainerDied","Data":"284fd67504784abbd9ab23c1bd3a8ecf717c8e781eb54c699fc73d346f64a03d"} Apr 16 18:31:34.551025 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.551009 2576 scope.go:117] "RemoveContainer" containerID="2d42a36a17593f58ffa6056be36111e52abd8d73aa7f527ca5a91e79b9b1338a" Apr 16 18:31:34.551144 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.551119 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume 
controller attach/detach" Apr 16 18:31:34.552158 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.552056 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:31:34.552158 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.552095 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:34.552158 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.552106 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:34.552338 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:34.552324 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-154.ec2.internal\" not found" node="ip-10-0-140-154.ec2.internal" Apr 16 18:31:34.552393 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.552373 2576 scope.go:117] "RemoveContainer" containerID="284fd67504784abbd9ab23c1bd3a8ecf717c8e781eb54c699fc73d346f64a03d" Apr 16 18:31:34.552531 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:34.552515 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal_openshift-machine-config-operator(28c07a342a30c0e354482d7284dcbb2c)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal" podUID="28c07a342a30c0e354482d7284dcbb2c" Apr 16 18:31:34.554614 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.554591 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-78v92"] Apr 16 18:31:34.554979 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.554966 2576 kubelet_node_status.go:413] 
"Setting node annotation to enable volume controller attach/detach" Apr 16 18:31:34.555768 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.555733 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:31:34.555839 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.555784 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:34.555839 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.555799 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:34.557169 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.557154 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.557264 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.557190 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:31:34.557534 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.557515 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-s46p6"] Apr 16 18:31:34.557699 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.557688 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:31:34.557991 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.557974 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:31:34.558059 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.558007 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:34.558059 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.558023 
2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:34.558505 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.558483 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:31:34.558666 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.558516 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:34.558666 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.558530 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:34.560005 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.559989 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-s46p6" Apr 16 18:31:34.560060 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.560024 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:31:34.560779 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.560762 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:31:34.560857 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.560792 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:34.560857 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.560808 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:34.562450 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.562429 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 18:31:34.563125 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.563087 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 18:31:34.563125 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.563102 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-v56sb\"" Apr 16 18:31:34.563271 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.563164 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 18:31:34.563271 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.563103 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 18:31:34.563271 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.563104 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 18:31:34.563271 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.563087 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 18:31:34.565667 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.565648 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-s267k\"" Apr 16 18:31:34.565833 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.565687 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 18:31:34.565833 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.565649 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:31:34.567057 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.567037 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlkrx\" (UniqueName: \"kubernetes.io/projected/5f3c3f50-8ee2-4775-a3de-64e723a55361-kube-api-access-wlkrx\") pod \"network-check-target-q67qs\" (UID: \"5f3c3f50-8ee2-4775-a3de-64e723a55361\") " pod="openshift-network-diagnostics/network-check-target-q67qs" Apr 16 18:31:34.567104 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.567073 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/412ee020-08a8-48fa-a1be-e9b5ca0d1cb5-tmp-dir\") pod \"node-resolver-8pbpt\" (UID: \"412ee020-08a8-48fa-a1be-e9b5ca0d1cb5\") " pod="openshift-dns/node-resolver-8pbpt" Apr 16 18:31:34.567139 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.567101 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwdpr\" (UniqueName: \"kubernetes.io/projected/412ee020-08a8-48fa-a1be-e9b5ca0d1cb5-kube-api-access-dwdpr\") pod \"node-resolver-8pbpt\" (UID: \"412ee020-08a8-48fa-a1be-e9b5ca0d1cb5\") " pod="openshift-dns/node-resolver-8pbpt" Apr 16 18:31:34.567139 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.567130 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/605560b8-ea93-4ad3-9510-324082bdc13f-iptables-alerter-script\") pod \"iptables-alerter-hpgn4\" (UID: \"605560b8-ea93-4ad3-9510-324082bdc13f\") " pod="openshift-network-operator/iptables-alerter-hpgn4" Apr 16 18:31:34.567228 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.567152 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-g2q9p\" (UniqueName: \"kubernetes.io/projected/605560b8-ea93-4ad3-9510-324082bdc13f-kube-api-access-g2q9p\") pod \"iptables-alerter-hpgn4\" (UID: \"605560b8-ea93-4ad3-9510-324082bdc13f\") " pod="openshift-network-operator/iptables-alerter-hpgn4" Apr 16 18:31:34.567228 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.567169 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/412ee020-08a8-48fa-a1be-e9b5ca0d1cb5-hosts-file\") pod \"node-resolver-8pbpt\" (UID: \"412ee020-08a8-48fa-a1be-e9b5ca0d1cb5\") " pod="openshift-dns/node-resolver-8pbpt" Apr 16 18:31:34.567293 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.567224 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/697ddfb3-adc9-4a63-b5ca-b4b871946a33-metrics-certs\") pod \"network-metrics-daemon-cl74p\" (UID: \"697ddfb3-adc9-4a63-b5ca-b4b871946a33\") " pod="openshift-multus/network-metrics-daemon-cl74p" Apr 16 18:31:34.567293 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.567256 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xbzw\" (UniqueName: \"kubernetes.io/projected/697ddfb3-adc9-4a63-b5ca-b4b871946a33-kube-api-access-8xbzw\") pod \"network-metrics-daemon-cl74p\" (UID: \"697ddfb3-adc9-4a63-b5ca-b4b871946a33\") " pod="openshift-multus/network-metrics-daemon-cl74p" Apr 16 18:31:34.567293 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.567275 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/605560b8-ea93-4ad3-9510-324082bdc13f-host-slash\") pod \"iptables-alerter-hpgn4\" (UID: \"605560b8-ea93-4ad3-9510-324082bdc13f\") " pod="openshift-network-operator/iptables-alerter-hpgn4" Apr 16 18:31:34.605300 
ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.605264 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-jktqd"] Apr 16 18:31:34.605458 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.605447 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:31:34.606458 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.606440 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:31:34.606545 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.606479 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:34.606545 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.606493 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:34.608351 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.608331 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-jktqd" Apr 16 18:31:34.608461 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.608368 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:31:34.609159 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.609137 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:31:34.609244 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.609167 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:34.609244 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.609180 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:34.612614 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.612593 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-sqt88\"" Apr 16 18:31:34.612708 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.612630 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 18:31:34.648944 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.648913 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-p4gk6"] Apr 16 18:31:34.649162 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.649147 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:31:34.649564 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:34.649545 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:34.650216 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.650199 2576 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:31:34.650303 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.650230 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:34.650303 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.650240 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:34.651694 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.651679 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-p4gk6" Apr 16 18:31:34.651735 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.651708 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:31:34.654066 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.652759 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:31:34.654066 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.652803 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:34.654066 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.652819 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:34.654066 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.653774 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn5nh"] Apr 16 18:31:34.654315 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.654305 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 
18:31:34.655644 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.655599 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:31:34.655644 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.655632 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:34.655644 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.655646 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:34.655925 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.655708 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 18:31:34.656551 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.656536 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 18:31:34.656675 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.656664 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-xs87t\"" Apr 16 18:31:34.657318 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.657302 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn5nh" Apr 16 18:31:34.657406 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.657334 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:31:34.658131 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.658115 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:31:34.658195 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.658149 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:34.658195 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.658165 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:34.660819 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.660802 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 18:31:34.660932 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.660916 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 18:31:34.660994 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.660985 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-jmdtp\"" Apr 16 18:31:34.661244 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.661230 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 18:31:34.667504 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.667483 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/10b515ec-8868-47c8-b336-6809d8756b3b-etc-sysconfig\") pod \"tuned-s46p6\" (UID: \"10b515ec-8868-47c8-b336-6809d8756b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s46p6" Apr 16 18:31:34.667608 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.667511 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/10b515ec-8868-47c8-b336-6809d8756b3b-etc-modprobe-d\") pod \"tuned-s46p6\" (UID: \"10b515ec-8868-47c8-b336-6809d8756b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s46p6" Apr 16 18:31:34.667608 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.667536 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/10b515ec-8868-47c8-b336-6809d8756b3b-etc-systemd\") pod \"tuned-s46p6\" (UID: \"10b515ec-8868-47c8-b336-6809d8756b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s46p6" Apr 16 18:31:34.667608 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.667558 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-run-ovn\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.667608 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.667591 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/48a38ebc-2033-4b86-99f7-c22d4b6e6ccc-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wn2l4\" (UID: \"48a38ebc-2033-4b86-99f7-c22d4b6e6ccc\") " 
pod="openshift-multus/multus-additional-cni-plugins-wn2l4" Apr 16 18:31:34.667785 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.667633 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/10b515ec-8868-47c8-b336-6809d8756b3b-lib-modules\") pod \"tuned-s46p6\" (UID: \"10b515ec-8868-47c8-b336-6809d8756b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s46p6" Apr 16 18:31:34.667785 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.667661 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8xbzw\" (UniqueName: \"kubernetes.io/projected/697ddfb3-adc9-4a63-b5ca-b4b871946a33-kube-api-access-8xbzw\") pod \"network-metrics-daemon-cl74p\" (UID: \"697ddfb3-adc9-4a63-b5ca-b4b871946a33\") " pod="openshift-multus/network-metrics-daemon-cl74p" Apr 16 18:31:34.667785 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.667703 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/10b515ec-8868-47c8-b336-6809d8756b3b-var-lib-kubelet\") pod \"tuned-s46p6\" (UID: \"10b515ec-8868-47c8-b336-6809d8756b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s46p6" Apr 16 18:31:34.667785 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.667754 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d9e01863-f0f0-4a5e-935f-50699365f569-ovnkube-config\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.667785 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.667762 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:31:34.667972 ip-10-0-140-154 kubenswrapper[2576]: I0416 
18:31:34.667785 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d9e01863-f0f0-4a5e-935f-50699365f569-env-overrides\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.667972 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.667809 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d9e01863-f0f0-4a5e-935f-50699365f569-ovn-node-metrics-cert\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.667972 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.667859 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/10b515ec-8868-47c8-b336-6809d8756b3b-run\") pod \"tuned-s46p6\" (UID: \"10b515ec-8868-47c8-b336-6809d8756b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s46p6" Apr 16 18:31:34.667972 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.667882 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-systemd-units\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.667972 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.667908 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-run-openvswitch\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.667972 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.667933 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/605560b8-ea93-4ad3-9510-324082bdc13f-iptables-alerter-script\") pod \"iptables-alerter-hpgn4\" (UID: \"605560b8-ea93-4ad3-9510-324082bdc13f\") " pod="openshift-network-operator/iptables-alerter-hpgn4" Apr 16 18:31:34.667972 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.667951 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/10b515ec-8868-47c8-b336-6809d8756b3b-etc-sysctl-conf\") pod \"tuned-s46p6\" (UID: \"10b515ec-8868-47c8-b336-6809d8756b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s46p6" Apr 16 18:31:34.667972 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.667964 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/10b515ec-8868-47c8-b336-6809d8756b3b-sys\") pod \"tuned-s46p6\" (UID: \"10b515ec-8868-47c8-b336-6809d8756b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s46p6" Apr 16 18:31:34.668272 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.667985 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-host-run-netns\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.668272 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.668008 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/5b4ead4a-0df1-4977-9c23-9828175001c0-serviceca\") pod \"node-ca-bnn4v\" (UID: \"5b4ead4a-0df1-4977-9c23-9828175001c0\") " pod="openshift-image-registry/node-ca-bnn4v" Apr 16 18:31:34.668272 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.668025 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/10b515ec-8868-47c8-b336-6809d8756b3b-host\") pod \"tuned-s46p6\" (UID: \"10b515ec-8868-47c8-b336-6809d8756b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s46p6" Apr 16 18:31:34.668272 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.668043 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/48a38ebc-2033-4b86-99f7-c22d4b6e6ccc-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wn2l4\" (UID: \"48a38ebc-2033-4b86-99f7-c22d4b6e6ccc\") " pod="openshift-multus/multus-additional-cni-plugins-wn2l4" Apr 16 18:31:34.668272 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.668061 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-etc-openvswitch\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.668272 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.668133 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vftlg\" (UniqueName: \"kubernetes.io/projected/5b4ead4a-0df1-4977-9c23-9828175001c0-kube-api-access-vftlg\") pod \"node-ca-bnn4v\" (UID: \"5b4ead4a-0df1-4977-9c23-9828175001c0\") " pod="openshift-image-registry/node-ca-bnn4v" Apr 16 18:31:34.668272 ip-10-0-140-154 kubenswrapper[2576]: I0416 
18:31:34.668168 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/48a38ebc-2033-4b86-99f7-c22d4b6e6ccc-system-cni-dir\") pod \"multus-additional-cni-plugins-wn2l4\" (UID: \"48a38ebc-2033-4b86-99f7-c22d4b6e6ccc\") " pod="openshift-multus/multus-additional-cni-plugins-wn2l4" Apr 16 18:31:34.668272 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.668208 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/48a38ebc-2033-4b86-99f7-c22d4b6e6ccc-os-release\") pod \"multus-additional-cni-plugins-wn2l4\" (UID: \"48a38ebc-2033-4b86-99f7-c22d4b6e6ccc\") " pod="openshift-multus/multus-additional-cni-plugins-wn2l4" Apr 16 18:31:34.668272 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.668237 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/605560b8-ea93-4ad3-9510-324082bdc13f-host-slash\") pod \"iptables-alerter-hpgn4\" (UID: \"605560b8-ea93-4ad3-9510-324082bdc13f\") " pod="openshift-network-operator/iptables-alerter-hpgn4" Apr 16 18:31:34.668272 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.668260 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-host-kubelet\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.668697 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.668284 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-host-slash\") pod \"ovnkube-node-78v92\" (UID: 
\"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.668697 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.668303 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/605560b8-ea93-4ad3-9510-324082bdc13f-host-slash\") pod \"iptables-alerter-hpgn4\" (UID: \"605560b8-ea93-4ad3-9510-324082bdc13f\") " pod="openshift-network-operator/iptables-alerter-hpgn4" Apr 16 18:31:34.668697 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.668331 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-run-systemd\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.668697 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.668356 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-log-socket\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.668697 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.668372 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-host-cni-netd\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.668697 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.668399 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8v6k\" (UniqueName: 
\"kubernetes.io/projected/48a38ebc-2033-4b86-99f7-c22d4b6e6ccc-kube-api-access-l8v6k\") pod \"multus-additional-cni-plugins-wn2l4\" (UID: \"48a38ebc-2033-4b86-99f7-c22d4b6e6ccc\") " pod="openshift-multus/multus-additional-cni-plugins-wn2l4" Apr 16 18:31:34.668697 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.668408 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/605560b8-ea93-4ad3-9510-324082bdc13f-iptables-alerter-script\") pod \"iptables-alerter-hpgn4\" (UID: \"605560b8-ea93-4ad3-9510-324082bdc13f\") " pod="openshift-network-operator/iptables-alerter-hpgn4" Apr 16 18:31:34.668697 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.668428 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/412ee020-08a8-48fa-a1be-e9b5ca0d1cb5-tmp-dir\") pod \"node-resolver-8pbpt\" (UID: \"412ee020-08a8-48fa-a1be-e9b5ca0d1cb5\") " pod="openshift-dns/node-resolver-8pbpt" Apr 16 18:31:34.668697 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.668464 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/10b515ec-8868-47c8-b336-6809d8756b3b-etc-tuned\") pod \"tuned-s46p6\" (UID: \"10b515ec-8868-47c8-b336-6809d8756b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s46p6" Apr 16 18:31:34.668697 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.668485 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/10b515ec-8868-47c8-b336-6809d8756b3b-tmp\") pod \"tuned-s46p6\" (UID: \"10b515ec-8868-47c8-b336-6809d8756b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s46p6" Apr 16 18:31:34.668697 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.668506 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-var-lib-openvswitch\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.668697 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.668529 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-host-run-ovn-kubernetes\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.668697 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.668571 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9rv9\" (UniqueName: \"kubernetes.io/projected/d9e01863-f0f0-4a5e-935f-50699365f569-kube-api-access-b9rv9\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.668697 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.668595 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/48a38ebc-2033-4b86-99f7-c22d4b6e6ccc-cni-binary-copy\") pod \"multus-additional-cni-plugins-wn2l4\" (UID: \"48a38ebc-2033-4b86-99f7-c22d4b6e6ccc\") " pod="openshift-multus/multus-additional-cni-plugins-wn2l4" Apr 16 18:31:34.668697 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.668624 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g2q9p\" (UniqueName: \"kubernetes.io/projected/605560b8-ea93-4ad3-9510-324082bdc13f-kube-api-access-g2q9p\") pod \"iptables-alerter-hpgn4\" (UID: 
\"605560b8-ea93-4ad3-9510-324082bdc13f\") " pod="openshift-network-operator/iptables-alerter-hpgn4" Apr 16 18:31:34.668697 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.668650 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/412ee020-08a8-48fa-a1be-e9b5ca0d1cb5-hosts-file\") pod \"node-resolver-8pbpt\" (UID: \"412ee020-08a8-48fa-a1be-e9b5ca0d1cb5\") " pod="openshift-dns/node-resolver-8pbpt" Apr 16 18:31:34.668697 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.668676 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/10b515ec-8868-47c8-b336-6809d8756b3b-etc-kubernetes\") pod \"tuned-s46p6\" (UID: \"10b515ec-8868-47c8-b336-6809d8756b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s46p6" Apr 16 18:31:34.669453 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.668692 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:31:34.669453 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.668697 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-node-log\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.669453 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.668720 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:31:34.669453 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.668722 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/412ee020-08a8-48fa-a1be-e9b5ca0d1cb5-hosts-file\") pod 
\"node-resolver-8pbpt\" (UID: \"412ee020-08a8-48fa-a1be-e9b5ca0d1cb5\") " pod="openshift-dns/node-resolver-8pbpt" Apr 16 18:31:34.669453 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.668729 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/412ee020-08a8-48fa-a1be-e9b5ca0d1cb5-tmp-dir\") pod \"node-resolver-8pbpt\" (UID: \"412ee020-08a8-48fa-a1be-e9b5ca0d1cb5\") " pod="openshift-dns/node-resolver-8pbpt" Apr 16 18:31:34.669453 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.668761 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:34.669453 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.668775 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:34.669453 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.668819 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:31:34.669453 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.668758 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wlkrx\" (UniqueName: \"kubernetes.io/projected/5f3c3f50-8ee2-4775-a3de-64e723a55361-kube-api-access-wlkrx\") pod \"network-check-target-q67qs\" (UID: \"5f3c3f50-8ee2-4775-a3de-64e723a55361\") " pod="openshift-network-diagnostics/network-check-target-q67qs" Apr 16 18:31:34.669453 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.668907 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/10b515ec-8868-47c8-b336-6809d8756b3b-etc-sysctl-d\") pod \"tuned-s46p6\" (UID: \"10b515ec-8868-47c8-b336-6809d8756b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s46p6" Apr 16 18:31:34.669453 ip-10-0-140-154 
kubenswrapper[2576]: I0416 18:31:34.668929 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-host-cni-bin\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.669453 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.668945 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.669453 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.668966 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5b4ead4a-0df1-4977-9c23-9828175001c0-host\") pod \"node-ca-bnn4v\" (UID: \"5b4ead4a-0df1-4977-9c23-9828175001c0\") " pod="openshift-image-registry/node-ca-bnn4v" Apr 16 18:31:34.669453 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.668982 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/48a38ebc-2033-4b86-99f7-c22d4b6e6ccc-cnibin\") pod \"multus-additional-cni-plugins-wn2l4\" (UID: \"48a38ebc-2033-4b86-99f7-c22d4b6e6ccc\") " pod="openshift-multus/multus-additional-cni-plugins-wn2l4" Apr 16 18:31:34.669453 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.668998 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/48a38ebc-2033-4b86-99f7-c22d4b6e6ccc-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-wn2l4\" (UID: \"48a38ebc-2033-4b86-99f7-c22d4b6e6ccc\") " pod="openshift-multus/multus-additional-cni-plugins-wn2l4" Apr 16 18:31:34.669453 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.669026 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dwdpr\" (UniqueName: \"kubernetes.io/projected/412ee020-08a8-48fa-a1be-e9b5ca0d1cb5-kube-api-access-dwdpr\") pod \"node-resolver-8pbpt\" (UID: \"412ee020-08a8-48fa-a1be-e9b5ca0d1cb5\") " pod="openshift-dns/node-resolver-8pbpt" Apr 16 18:31:34.669453 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.669057 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/697ddfb3-adc9-4a63-b5ca-b4b871946a33-metrics-certs\") pod \"network-metrics-daemon-cl74p\" (UID: \"697ddfb3-adc9-4a63-b5ca-b4b871946a33\") " pod="openshift-multus/network-metrics-daemon-cl74p" Apr 16 18:31:34.669453 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.669068 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:31:34.669453 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.669077 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d9e01863-f0f0-4a5e-935f-50699365f569-ovnkube-script-lib\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.669453 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.669092 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffjl2\" (UniqueName: \"kubernetes.io/projected/10b515ec-8868-47c8-b336-6809d8756b3b-kube-api-access-ffjl2\") pod \"tuned-s46p6\" (UID: \"10b515ec-8868-47c8-b336-6809d8756b3b\") " 
pod="openshift-cluster-node-tuning-operator/tuned-s46p6" Apr 16 18:31:34.669453 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.669135 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:31:34.670191 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.669581 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:31:34.670191 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.669605 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:34.670191 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.669618 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:34.670191 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.669587 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:31:34.670191 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.669651 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:34.670191 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.669585 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:31:34.670191 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.669693 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:34.670191 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.669708 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:34.670191 
ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.669669 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:34.670191 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.669897 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:31:34.670191 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.669912 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:34.670191 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.669920 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:34.670191 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:34.670008 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:31:34.670191 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:34.670068 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/697ddfb3-adc9-4a63-b5ca-b4b871946a33-metrics-certs podName:697ddfb3-adc9-4a63-b5ca-b4b871946a33 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:35.170045314 +0000 UTC m=+21.294573175 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/697ddfb3-adc9-4a63-b5ca-b4b871946a33-metrics-certs") pod "network-metrics-daemon-cl74p" (UID: "697ddfb3-adc9-4a63-b5ca-b4b871946a33") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:31:34.682092 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:34.682068 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:31:34.682092 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:34.682090 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:31:34.682216 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:34.682099 2576 projected.go:194] Error preparing data for projected volume kube-api-access-wlkrx for pod openshift-network-diagnostics/network-check-target-q67qs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:31:34.682216 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:34.682156 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5f3c3f50-8ee2-4775-a3de-64e723a55361-kube-api-access-wlkrx podName:5f3c3f50-8ee2-4775-a3de-64e723a55361 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:35.182137423 +0000 UTC m=+21.306665066 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-wlkrx" (UniqueName: "kubernetes.io/projected/5f3c3f50-8ee2-4775-a3de-64e723a55361-kube-api-access-wlkrx") pod "network-check-target-q67qs" (UID: "5f3c3f50-8ee2-4775-a3de-64e723a55361") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:31:34.682342 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.682298 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 18:31:34.685717 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.685689 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwdpr\" (UniqueName: \"kubernetes.io/projected/412ee020-08a8-48fa-a1be-e9b5ca0d1cb5-kube-api-access-dwdpr\") pod \"node-resolver-8pbpt\" (UID: \"412ee020-08a8-48fa-a1be-e9b5ca0d1cb5\") " pod="openshift-dns/node-resolver-8pbpt" Apr 16 18:31:34.685831 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.685687 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2q9p\" (UniqueName: \"kubernetes.io/projected/605560b8-ea93-4ad3-9510-324082bdc13f-kube-api-access-g2q9p\") pod \"iptables-alerter-hpgn4\" (UID: \"605560b8-ea93-4ad3-9510-324082bdc13f\") " pod="openshift-network-operator/iptables-alerter-hpgn4" Apr 16 18:31:34.685831 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.685732 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xbzw\" (UniqueName: \"kubernetes.io/projected/697ddfb3-adc9-4a63-b5ca-b4b871946a33-kube-api-access-8xbzw\") pod \"network-metrics-daemon-cl74p\" (UID: \"697ddfb3-adc9-4a63-b5ca-b4b871946a33\") " pod="openshift-multus/network-metrics-daemon-cl74p" Apr 16 18:31:34.708037 ip-10-0-140-154 kubenswrapper[2576]: I0416 
18:31:34.708006 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8pbpt" Apr 16 18:31:34.714111 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:34.714085 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod412ee020_08a8_48fa_a1be_e9b5ca0d1cb5.slice/crio-b8a12eab108ab6b96ea80d2f6b5b84b5babd83c8ef7df4b2d1da487e83dded20 WatchSource:0}: Error finding container b8a12eab108ab6b96ea80d2f6b5b84b5babd83c8ef7df4b2d1da487e83dded20: Status 404 returned error can't find the container with id b8a12eab108ab6b96ea80d2f6b5b84b5babd83c8ef7df4b2d1da487e83dded20 Apr 16 18:31:34.749673 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:34.749621 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:34.770043 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770023 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a2547dac-9955-4de0-ba29-8ca57b537b69-hostroot\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd" Apr 16 18:31:34.770150 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770052 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5b4ead4a-0df1-4977-9c23-9828175001c0-host\") pod \"node-ca-bnn4v\" (UID: \"5b4ead4a-0df1-4977-9c23-9828175001c0\") " pod="openshift-image-registry/node-ca-bnn4v" Apr 16 18:31:34.770150 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770076 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/48a38ebc-2033-4b86-99f7-c22d4b6e6ccc-cnibin\") pod \"multus-additional-cni-plugins-wn2l4\" (UID: 
\"48a38ebc-2033-4b86-99f7-c22d4b6e6ccc\") " pod="openshift-multus/multus-additional-cni-plugins-wn2l4" Apr 16 18:31:34.770150 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770094 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/48a38ebc-2033-4b86-99f7-c22d4b6e6ccc-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wn2l4\" (UID: \"48a38ebc-2033-4b86-99f7-c22d4b6e6ccc\") " pod="openshift-multus/multus-additional-cni-plugins-wn2l4" Apr 16 18:31:34.770150 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770112 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a2547dac-9955-4de0-ba29-8ca57b537b69-host-run-k8s-cni-cncf-io\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd" Apr 16 18:31:34.770150 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770131 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbffm\" (UniqueName: \"kubernetes.io/projected/89c089f0-109d-41b7-8411-4debfff14a91-kube-api-access-dbffm\") pod \"aws-ebs-csi-driver-node-mn5nh\" (UID: \"89c089f0-109d-41b7-8411-4debfff14a91\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn5nh" Apr 16 18:31:34.770301 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770160 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5b4ead4a-0df1-4977-9c23-9828175001c0-host\") pod \"node-ca-bnn4v\" (UID: \"5b4ead4a-0df1-4977-9c23-9828175001c0\") " pod="openshift-image-registry/node-ca-bnn4v" Apr 16 18:31:34.770301 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770182 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/48a38ebc-2033-4b86-99f7-c22d4b6e6ccc-cnibin\") pod \"multus-additional-cni-plugins-wn2l4\" (UID: \"48a38ebc-2033-4b86-99f7-c22d4b6e6ccc\") " pod="openshift-multus/multus-additional-cni-plugins-wn2l4" Apr 16 18:31:34.770301 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770182 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d9e01863-f0f0-4a5e-935f-50699365f569-ovnkube-script-lib\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.770301 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770223 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a2547dac-9955-4de0-ba29-8ca57b537b69-os-release\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd" Apr 16 18:31:34.770301 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770241 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a2547dac-9955-4de0-ba29-8ca57b537b69-multus-socket-dir-parent\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd" Apr 16 18:31:34.770301 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770258 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/89c089f0-109d-41b7-8411-4debfff14a91-sys-fs\") pod \"aws-ebs-csi-driver-node-mn5nh\" (UID: \"89c089f0-109d-41b7-8411-4debfff14a91\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn5nh" Apr 16 18:31:34.770301 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770284 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ffjl2\" (UniqueName: \"kubernetes.io/projected/10b515ec-8868-47c8-b336-6809d8756b3b-kube-api-access-ffjl2\") pod \"tuned-s46p6\" (UID: \"10b515ec-8868-47c8-b336-6809d8756b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s46p6" Apr 16 18:31:34.770301 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770300 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a2547dac-9955-4de0-ba29-8ca57b537b69-etc-kubernetes\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd" Apr 16 18:31:34.770621 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770317 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/10b515ec-8868-47c8-b336-6809d8756b3b-etc-sysconfig\") pod \"tuned-s46p6\" (UID: \"10b515ec-8868-47c8-b336-6809d8756b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s46p6" Apr 16 18:31:34.770621 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770344 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/10b515ec-8868-47c8-b336-6809d8756b3b-etc-modprobe-d\") pod \"tuned-s46p6\" (UID: \"10b515ec-8868-47c8-b336-6809d8756b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s46p6" Apr 16 18:31:34.770621 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770368 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/10b515ec-8868-47c8-b336-6809d8756b3b-etc-systemd\") pod \"tuned-s46p6\" (UID: \"10b515ec-8868-47c8-b336-6809d8756b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s46p6" Apr 16 18:31:34.770621 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770385 2576 
kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:31:34.770621 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770407 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/10b515ec-8868-47c8-b336-6809d8756b3b-etc-systemd\") pod \"tuned-s46p6\" (UID: \"10b515ec-8868-47c8-b336-6809d8756b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s46p6" Apr 16 18:31:34.770621 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770443 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/10b515ec-8868-47c8-b336-6809d8756b3b-etc-sysconfig\") pod \"tuned-s46p6\" (UID: \"10b515ec-8868-47c8-b336-6809d8756b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s46p6" Apr 16 18:31:34.770621 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770468 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-run-ovn\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.770621 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770507 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/48a38ebc-2033-4b86-99f7-c22d4b6e6ccc-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wn2l4\" (UID: \"48a38ebc-2033-4b86-99f7-c22d4b6e6ccc\") " pod="openshift-multus/multus-additional-cni-plugins-wn2l4" Apr 16 18:31:34.770621 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770508 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-run-ovn\") pod \"ovnkube-node-78v92\" (UID: 
\"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.770621 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770537 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a2547dac-9955-4de0-ba29-8ca57b537b69-host-var-lib-cni-multus\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd" Apr 16 18:31:34.770621 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770565 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/10b515ec-8868-47c8-b336-6809d8756b3b-etc-modprobe-d\") pod \"tuned-s46p6\" (UID: \"10b515ec-8868-47c8-b336-6809d8756b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s46p6" Apr 16 18:31:34.770621 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770599 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/89c089f0-109d-41b7-8411-4debfff14a91-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mn5nh\" (UID: \"89c089f0-109d-41b7-8411-4debfff14a91\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn5nh" Apr 16 18:31:34.771070 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770636 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/10b515ec-8868-47c8-b336-6809d8756b3b-lib-modules\") pod \"tuned-s46p6\" (UID: \"10b515ec-8868-47c8-b336-6809d8756b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s46p6" Apr 16 18:31:34.771070 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770665 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/10b515ec-8868-47c8-b336-6809d8756b3b-var-lib-kubelet\") pod \"tuned-s46p6\" (UID: \"10b515ec-8868-47c8-b336-6809d8756b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s46p6" Apr 16 18:31:34.771070 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770688 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d9e01863-f0f0-4a5e-935f-50699365f569-ovnkube-config\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.771070 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770713 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d9e01863-f0f0-4a5e-935f-50699365f569-env-overrides\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.771070 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770727 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/10b515ec-8868-47c8-b336-6809d8756b3b-lib-modules\") pod \"tuned-s46p6\" (UID: \"10b515ec-8868-47c8-b336-6809d8756b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s46p6" Apr 16 18:31:34.771070 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770731 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/10b515ec-8868-47c8-b336-6809d8756b3b-var-lib-kubelet\") pod \"tuned-s46p6\" (UID: \"10b515ec-8868-47c8-b336-6809d8756b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s46p6" Apr 16 18:31:34.771070 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770737 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/d9e01863-f0f0-4a5e-935f-50699365f569-ovn-node-metrics-cert\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.771070 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770783 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/89c089f0-109d-41b7-8411-4debfff14a91-etc-selinux\") pod \"aws-ebs-csi-driver-node-mn5nh\" (UID: \"89c089f0-109d-41b7-8411-4debfff14a91\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn5nh" Apr 16 18:31:34.771070 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770811 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/10b515ec-8868-47c8-b336-6809d8756b3b-run\") pod \"tuned-s46p6\" (UID: \"10b515ec-8868-47c8-b336-6809d8756b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s46p6" Apr 16 18:31:34.771070 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770835 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-systemd-units\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.771070 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770844 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d9e01863-f0f0-4a5e-935f-50699365f569-ovnkube-script-lib\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.771070 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770890 2576 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-run-openvswitch\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.771070 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770896 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/48a38ebc-2033-4b86-99f7-c22d4b6e6ccc-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wn2l4\" (UID: \"48a38ebc-2033-4b86-99f7-c22d4b6e6ccc\") " pod="openshift-multus/multus-additional-cni-plugins-wn2l4" Apr 16 18:31:34.771070 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770916 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/10b515ec-8868-47c8-b336-6809d8756b3b-etc-sysctl-conf\") pod \"tuned-s46p6\" (UID: \"10b515ec-8868-47c8-b336-6809d8756b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s46p6" Apr 16 18:31:34.771070 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770929 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:31:34.771070 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770939 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/10b515ec-8868-47c8-b336-6809d8756b3b-sys\") pod \"tuned-s46p6\" (UID: \"10b515ec-8868-47c8-b336-6809d8756b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s46p6" Apr 16 18:31:34.771070 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770948 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-systemd-units\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.771070 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770962 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-host-run-netns\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.771897 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770972 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-run-openvswitch\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.771897 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770999 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5b4ead4a-0df1-4977-9c23-9828175001c0-serviceca\") pod \"node-ca-bnn4v\" (UID: \"5b4ead4a-0df1-4977-9c23-9828175001c0\") " pod="openshift-image-registry/node-ca-bnn4v" Apr 16 18:31:34.771897 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771014 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-host-run-netns\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.771897 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771026 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a2547dac-9955-4de0-ba29-8ca57b537b69-cni-binary-copy\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " 
pod="openshift-multus/multus-jktqd" Apr 16 18:31:34.771897 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771051 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/389b3d12-4b63-4bc3-9047-fb35ce314e95-konnectivity-ca\") pod \"konnectivity-agent-p4gk6\" (UID: \"389b3d12-4b63-4bc3-9047-fb35ce314e95\") " pod="kube-system/konnectivity-agent-p4gk6" Apr 16 18:31:34.771897 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771067 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/10b515ec-8868-47c8-b336-6809d8756b3b-sys\") pod \"tuned-s46p6\" (UID: \"10b515ec-8868-47c8-b336-6809d8756b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s46p6" Apr 16 18:31:34.771897 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771073 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/10b515ec-8868-47c8-b336-6809d8756b3b-etc-sysctl-conf\") pod \"tuned-s46p6\" (UID: \"10b515ec-8868-47c8-b336-6809d8756b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s46p6" Apr 16 18:31:34.771897 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771077 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/48a38ebc-2033-4b86-99f7-c22d4b6e6ccc-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wn2l4\" (UID: \"48a38ebc-2033-4b86-99f7-c22d4b6e6ccc\") " pod="openshift-multus/multus-additional-cni-plugins-wn2l4" Apr 16 18:31:34.771897 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771079 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/10b515ec-8868-47c8-b336-6809d8756b3b-host\") pod \"tuned-s46p6\" (UID: \"10b515ec-8868-47c8-b336-6809d8756b3b\") " 
pod="openshift-cluster-node-tuning-operator/tuned-s46p6" Apr 16 18:31:34.771897 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771112 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/10b515ec-8868-47c8-b336-6809d8756b3b-host\") pod \"tuned-s46p6\" (UID: \"10b515ec-8868-47c8-b336-6809d8756b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s46p6" Apr 16 18:31:34.771897 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771135 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a2547dac-9955-4de0-ba29-8ca57b537b69-host-run-multus-certs\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd" Apr 16 18:31:34.771897 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.770999 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/10b515ec-8868-47c8-b336-6809d8756b3b-run\") pod \"tuned-s46p6\" (UID: \"10b515ec-8868-47c8-b336-6809d8756b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s46p6" Apr 16 18:31:34.771897 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771167 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9bpz\" (UniqueName: \"kubernetes.io/projected/a2547dac-9955-4de0-ba29-8ca57b537b69-kube-api-access-p9bpz\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd" Apr 16 18:31:34.771897 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771195 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/89c089f0-109d-41b7-8411-4debfff14a91-device-dir\") pod \"aws-ebs-csi-driver-node-mn5nh\" (UID: 
\"89c089f0-109d-41b7-8411-4debfff14a91\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn5nh" Apr 16 18:31:34.771897 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771221 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/48a38ebc-2033-4b86-99f7-c22d4b6e6ccc-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wn2l4\" (UID: \"48a38ebc-2033-4b86-99f7-c22d4b6e6ccc\") " pod="openshift-multus/multus-additional-cni-plugins-wn2l4" Apr 16 18:31:34.771897 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771247 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-etc-openvswitch\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.771897 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771274 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vftlg\" (UniqueName: \"kubernetes.io/projected/5b4ead4a-0df1-4977-9c23-9828175001c0-kube-api-access-vftlg\") pod \"node-ca-bnn4v\" (UID: \"5b4ead4a-0df1-4977-9c23-9828175001c0\") " pod="openshift-image-registry/node-ca-bnn4v" Apr 16 18:31:34.772592 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771340 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-etc-openvswitch\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.772592 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771348 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:31:34.772592 
ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771379 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/48a38ebc-2033-4b86-99f7-c22d4b6e6ccc-system-cni-dir\") pod \"multus-additional-cni-plugins-wn2l4\" (UID: \"48a38ebc-2033-4b86-99f7-c22d4b6e6ccc\") " pod="openshift-multus/multus-additional-cni-plugins-wn2l4" Apr 16 18:31:34.772592 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771348 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/48a38ebc-2033-4b86-99f7-c22d4b6e6ccc-system-cni-dir\") pod \"multus-additional-cni-plugins-wn2l4\" (UID: \"48a38ebc-2033-4b86-99f7-c22d4b6e6ccc\") " pod="openshift-multus/multus-additional-cni-plugins-wn2l4" Apr 16 18:31:34.772592 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771411 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/48a38ebc-2033-4b86-99f7-c22d4b6e6ccc-os-release\") pod \"multus-additional-cni-plugins-wn2l4\" (UID: \"48a38ebc-2033-4b86-99f7-c22d4b6e6ccc\") " pod="openshift-multus/multus-additional-cni-plugins-wn2l4" Apr 16 18:31:34.772592 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771436 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-host-kubelet\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.772592 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771460 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-host-slash\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.772592 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771484 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-run-systemd\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.772592 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771508 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-log-socket\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.772592 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771535 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a2547dac-9955-4de0-ba29-8ca57b537b69-multus-cni-dir\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd" Apr 16 18:31:34.772592 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771550 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-host-kubelet\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.772592 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771557 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a2547dac-9955-4de0-ba29-8ca57b537b69-cnibin\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " 
pod="openshift-multus/multus-jktqd" Apr 16 18:31:34.772592 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771584 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-host-cni-netd\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.772592 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771589 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-host-slash\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.772592 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771603 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-log-socket\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:34.772592 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771615 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/48a38ebc-2033-4b86-99f7-c22d4b6e6ccc-os-release\") pod \"multus-additional-cni-plugins-wn2l4\" (UID: \"48a38ebc-2033-4b86-99f7-c22d4b6e6ccc\") " pod="openshift-multus/multus-additional-cni-plugins-wn2l4" Apr 16 18:31:34.772592 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771618 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5b4ead4a-0df1-4977-9c23-9828175001c0-serviceca\") pod \"node-ca-bnn4v\" (UID: \"5b4ead4a-0df1-4977-9c23-9828175001c0\") " pod="openshift-image-registry/node-ca-bnn4v" 
Apr 16 18:31:34.772592 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771609 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8v6k\" (UniqueName: \"kubernetes.io/projected/48a38ebc-2033-4b86-99f7-c22d4b6e6ccc-kube-api-access-l8v6k\") pod \"multus-additional-cni-plugins-wn2l4\" (UID: \"48a38ebc-2033-4b86-99f7-c22d4b6e6ccc\") " pod="openshift-multus/multus-additional-cni-plugins-wn2l4"
Apr 16 18:31:34.773329 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771642 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-run-systemd\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92"
Apr 16 18:31:34.773329 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771657 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/48a38ebc-2033-4b86-99f7-c22d4b6e6ccc-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wn2l4\" (UID: \"48a38ebc-2033-4b86-99f7-c22d4b6e6ccc\") " pod="openshift-multus/multus-additional-cni-plugins-wn2l4"
Apr 16 18:31:34.773329 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771675 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-host-cni-netd\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92"
Apr 16 18:31:34.773329 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771666 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a2547dac-9955-4de0-ba29-8ca57b537b69-host-run-netns\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd"
Apr 16 18:31:34.773329 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771697 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:31:34.773329 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771705 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:31:34.773329 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771712 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a2547dac-9955-4de0-ba29-8ca57b537b69-multus-daemon-config\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd"
Apr 16 18:31:34.773329 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771723 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:31:34.773329 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771731 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/389b3d12-4b63-4bc3-9047-fb35ce314e95-agent-certs\") pod \"konnectivity-agent-p4gk6\" (UID: \"389b3d12-4b63-4bc3-9047-fb35ce314e95\") " pod="kube-system/konnectivity-agent-p4gk6"
Apr 16 18:31:34.773329 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771697 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:31:34.773329 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771772 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/89c089f0-109d-41b7-8411-4debfff14a91-registration-dir\") pod \"aws-ebs-csi-driver-node-mn5nh\" (UID: \"89c089f0-109d-41b7-8411-4debfff14a91\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn5nh"
Apr 16 18:31:34.773329 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771736 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:31:34.773329 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771793 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:31:34.773329 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771795 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/10b515ec-8868-47c8-b336-6809d8756b3b-etc-tuned\") pod \"tuned-s46p6\" (UID: \"10b515ec-8868-47c8-b336-6809d8756b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s46p6"
Apr 16 18:31:34.773329 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771818 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:31:34.773329 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771808 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:31:34.773329 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771848 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/10b515ec-8868-47c8-b336-6809d8756b3b-tmp\") pod \"tuned-s46p6\" (UID: \"10b515ec-8868-47c8-b336-6809d8756b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s46p6"
Apr 16 18:31:34.773329 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771877 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-var-lib-openvswitch\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92"
Apr 16 18:31:34.773329 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771899 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:31:34.773329 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771905 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-host-run-ovn-kubernetes\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92"
Apr 16 18:31:34.773329 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771928 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b9rv9\" (UniqueName: \"kubernetes.io/projected/d9e01863-f0f0-4a5e-935f-50699365f569-kube-api-access-b9rv9\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92"
Apr 16 18:31:34.773329 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771940 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-host-run-ovn-kubernetes\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92"
Apr 16 18:31:34.774095 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771947 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-var-lib-openvswitch\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92"
Apr 16 18:31:34.774095 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.771955 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/48a38ebc-2033-4b86-99f7-c22d4b6e6ccc-cni-binary-copy\") pod \"multus-additional-cni-plugins-wn2l4\" (UID: \"48a38ebc-2033-4b86-99f7-c22d4b6e6ccc\") " pod="openshift-multus/multus-additional-cni-plugins-wn2l4"
Apr 16 18:31:34.774095 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.772007 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:31:34.774095 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.772342 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a2547dac-9955-4de0-ba29-8ca57b537b69-host-var-lib-kubelet\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd"
Apr 16 18:31:34.774095 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.772377 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/10b515ec-8868-47c8-b336-6809d8756b3b-etc-kubernetes\") pod \"tuned-s46p6\" (UID: \"10b515ec-8868-47c8-b336-6809d8756b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s46p6"
Apr 16 18:31:34.774095 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.772344 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/48a38ebc-2033-4b86-99f7-c22d4b6e6ccc-cni-binary-copy\") pod \"multus-additional-cni-plugins-wn2l4\" (UID: \"48a38ebc-2033-4b86-99f7-c22d4b6e6ccc\") " pod="openshift-multus/multus-additional-cni-plugins-wn2l4"
Apr 16 18:31:34.774095 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.772405 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-node-log\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92"
Apr 16 18:31:34.774095 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.772432 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a2547dac-9955-4de0-ba29-8ca57b537b69-system-cni-dir\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd"
Apr 16 18:31:34.774095 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.772465 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a2547dac-9955-4de0-ba29-8ca57b537b69-host-var-lib-cni-bin\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd"
Apr 16 18:31:34.774095 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.772483 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/10b515ec-8868-47c8-b336-6809d8756b3b-etc-kubernetes\") pod \"tuned-s46p6\" (UID: \"10b515ec-8868-47c8-b336-6809d8756b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s46p6"
Apr 16 18:31:34.774095 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.772492 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a2547dac-9955-4de0-ba29-8ca57b537b69-multus-conf-dir\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd"
Apr 16 18:31:34.774095 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.772530 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/89c089f0-109d-41b7-8411-4debfff14a91-socket-dir\") pod \"aws-ebs-csi-driver-node-mn5nh\" (UID: \"89c089f0-109d-41b7-8411-4debfff14a91\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn5nh"
Apr 16 18:31:34.774095 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.772533 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-node-log\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92"
Apr 16 18:31:34.774095 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.772569 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/10b515ec-8868-47c8-b336-6809d8756b3b-etc-sysctl-d\") pod \"tuned-s46p6\" (UID: \"10b515ec-8868-47c8-b336-6809d8756b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s46p6"
Apr 16 18:31:34.774095 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.772618 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-host-cni-bin\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92"
Apr 16 18:31:34.774095 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.772642 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92"
Apr 16 18:31:34.774095 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.772663 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-host-cni-bin\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92"
Apr 16 18:31:34.774095 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.772689 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9e01863-f0f0-4a5e-935f-50699365f569-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92"
Apr 16 18:31:34.774888 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.772704 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/10b515ec-8868-47c8-b336-6809d8756b3b-etc-sysctl-d\") pod \"tuned-s46p6\" (UID: \"10b515ec-8868-47c8-b336-6809d8756b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s46p6"
Apr 16 18:31:34.774888 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.772824 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d9e01863-f0f0-4a5e-935f-50699365f569-ovnkube-config\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92"
Apr 16 18:31:34.774888 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.772868 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d9e01863-f0f0-4a5e-935f-50699365f569-env-overrides\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92"
Apr 16 18:31:34.774888 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.773136 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:31:34.774888 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.773167 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:31:34.774888 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.773180 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:31:34.774888 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.773280 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:31:34.774888 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.773298 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:31:34.774888 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.773312 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:31:34.774888 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.773348 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:31:34.774888 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.773375 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:31:34.774888 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.773355 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:31:34.774888 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.773391 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:31:34.774888 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.773407 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:31:34.774888 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.773420 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:31:34.774888 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.773310 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:31:34.774888 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.773473 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:31:34.774888 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.773489 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:31:34.774888 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.774593 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d9e01863-f0f0-4a5e-935f-50699365f569-ovn-node-metrics-cert\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92"
Apr 16 18:31:34.775845 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.775827 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/10b515ec-8868-47c8-b336-6809d8756b3b-tmp\") pod \"tuned-s46p6\" (UID: \"10b515ec-8868-47c8-b336-6809d8756b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s46p6"
Apr 16 18:31:34.775907 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.775866 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/10b515ec-8868-47c8-b336-6809d8756b3b-etc-tuned\") pod \"tuned-s46p6\" (UID: \"10b515ec-8868-47c8-b336-6809d8756b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s46p6"
Apr 16 18:31:34.780033 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.780014 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-hpgn4"
Apr 16 18:31:34.780652 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.780572 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffjl2\" (UniqueName: \"kubernetes.io/projected/10b515ec-8868-47c8-b336-6809d8756b3b-kube-api-access-ffjl2\") pod \"tuned-s46p6\" (UID: \"10b515ec-8868-47c8-b336-6809d8756b3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s46p6"
Apr 16 18:31:34.783575 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.783549 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vftlg\" (UniqueName: \"kubernetes.io/projected/5b4ead4a-0df1-4977-9c23-9828175001c0-kube-api-access-vftlg\") pod \"node-ca-bnn4v\" (UID: \"5b4ead4a-0df1-4977-9c23-9828175001c0\") " pod="openshift-image-registry/node-ca-bnn4v"
Apr 16 18:31:34.783801 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.783615 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8v6k\" (UniqueName: \"kubernetes.io/projected/48a38ebc-2033-4b86-99f7-c22d4b6e6ccc-kube-api-access-l8v6k\") pod \"multus-additional-cni-plugins-wn2l4\" (UID: \"48a38ebc-2033-4b86-99f7-c22d4b6e6ccc\") " pod="openshift-multus/multus-additional-cni-plugins-wn2l4"
Apr 16 18:31:34.785079 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.785047 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9rv9\" (UniqueName: \"kubernetes.io/projected/d9e01863-f0f0-4a5e-935f-50699365f569-kube-api-access-b9rv9\") pod \"ovnkube-node-78v92\" (UID: \"d9e01863-f0f0-4a5e-935f-50699365f569\") " pod="openshift-ovn-kubernetes/ovnkube-node-78v92"
Apr 16 18:31:34.786174 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:34.786152 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod605560b8_ea93_4ad3_9510_324082bdc13f.slice/crio-9becad72b8ed0d87a375ae1313228d2d53e022f803344d3eb679f1900d4c4cfb WatchSource:0}: Error finding container 9becad72b8ed0d87a375ae1313228d2d53e022f803344d3eb679f1900d4c4cfb: Status 404 returned error can't find the container with id 9becad72b8ed0d87a375ae1313228d2d53e022f803344d3eb679f1900d4c4cfb
Apr 16 18:31:34.796542 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.796522 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-bnn4v"
Apr 16 18:31:34.802153 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:34.802125 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b4ead4a_0df1_4977_9c23_9828175001c0.slice/crio-8fe4ee10d4998a7a04002803523f84dcbd12a7618776b06bbd8d23a85065a800 WatchSource:0}: Error finding container 8fe4ee10d4998a7a04002803523f84dcbd12a7618776b06bbd8d23a85065a800: Status 404 returned error can't find the container with id 8fe4ee10d4998a7a04002803523f84dcbd12a7618776b06bbd8d23a85065a800
Apr 16 18:31:34.809004 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.808957 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wn2l4"
Apr 16 18:31:34.814967 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:34.814940 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48a38ebc_2033_4b86_99f7_c22d4b6e6ccc.slice/crio-17e5da28a602f4cc6a06d15a2cd275b2d26430a8e69d26c2192b5a0cc55db6eb WatchSource:0}: Error finding container 17e5da28a602f4cc6a06d15a2cd275b2d26430a8e69d26c2192b5a0cc55db6eb: Status 404 returned error can't find the container with id 17e5da28a602f4cc6a06d15a2cd275b2d26430a8e69d26c2192b5a0cc55db6eb
Apr 16 18:31:34.850348 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:34.850312 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:34.867735 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.867712 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-78v92"
Apr 16 18:31:34.872481 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.872454 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-s46p6"
Apr 16 18:31:34.872937 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.872917 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a2547dac-9955-4de0-ba29-8ca57b537b69-host-var-lib-cni-multus\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd"
Apr 16 18:31:34.873003 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.872947 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/89c089f0-109d-41b7-8411-4debfff14a91-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mn5nh\" (UID: \"89c089f0-109d-41b7-8411-4debfff14a91\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn5nh"
Apr 16 18:31:34.873003 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.872982 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/89c089f0-109d-41b7-8411-4debfff14a91-etc-selinux\") pod \"aws-ebs-csi-driver-node-mn5nh\" (UID: \"89c089f0-109d-41b7-8411-4debfff14a91\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn5nh"
Apr 16 18:31:34.873188 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873005 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a2547dac-9955-4de0-ba29-8ca57b537b69-cni-binary-copy\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd"
Apr 16 18:31:34.873188 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873013 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a2547dac-9955-4de0-ba29-8ca57b537b69-host-var-lib-cni-multus\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd"
Apr 16 18:31:34.873188 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873020 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/389b3d12-4b63-4bc3-9047-fb35ce314e95-konnectivity-ca\") pod \"konnectivity-agent-p4gk6\" (UID: \"389b3d12-4b63-4bc3-9047-fb35ce314e95\") " pod="kube-system/konnectivity-agent-p4gk6"
Apr 16 18:31:34.873188 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873078 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a2547dac-9955-4de0-ba29-8ca57b537b69-host-run-multus-certs\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd"
Apr 16 18:31:34.873188 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873105 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p9bpz\" (UniqueName: \"kubernetes.io/projected/a2547dac-9955-4de0-ba29-8ca57b537b69-kube-api-access-p9bpz\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd"
Apr 16 18:31:34.873188 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873117 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/89c089f0-109d-41b7-8411-4debfff14a91-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mn5nh\" (UID: \"89c089f0-109d-41b7-8411-4debfff14a91\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn5nh"
Apr 16 18:31:34.873188 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873133 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/89c089f0-109d-41b7-8411-4debfff14a91-device-dir\") pod \"aws-ebs-csi-driver-node-mn5nh\" (UID: \"89c089f0-109d-41b7-8411-4debfff14a91\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn5nh"
Apr 16 18:31:34.873188 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873136 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/89c089f0-109d-41b7-8411-4debfff14a91-etc-selinux\") pod \"aws-ebs-csi-driver-node-mn5nh\" (UID: \"89c089f0-109d-41b7-8411-4debfff14a91\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn5nh"
Apr 16 18:31:34.873188 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873166 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a2547dac-9955-4de0-ba29-8ca57b537b69-host-run-multus-certs\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd"
Apr 16 18:31:34.873188 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873170 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a2547dac-9955-4de0-ba29-8ca57b537b69-multus-cni-dir\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd"
Apr 16 18:31:34.873666 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873199 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a2547dac-9955-4de0-ba29-8ca57b537b69-cnibin\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd"
Apr 16 18:31:34.873666 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873223 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:31:34.873666 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873226 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a2547dac-9955-4de0-ba29-8ca57b537b69-host-run-netns\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd"
Apr 16 18:31:34.873666 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873254 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a2547dac-9955-4de0-ba29-8ca57b537b69-multus-daemon-config\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd"
Apr 16 18:31:34.873666 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873276 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/89c089f0-109d-41b7-8411-4debfff14a91-device-dir\") pod \"aws-ebs-csi-driver-node-mn5nh\" (UID: \"89c089f0-109d-41b7-8411-4debfff14a91\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn5nh"
Apr 16 18:31:34.873666 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873281 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/389b3d12-4b63-4bc3-9047-fb35ce314e95-agent-certs\") pod \"konnectivity-agent-p4gk6\" (UID: \"389b3d12-4b63-4bc3-9047-fb35ce314e95\") " pod="kube-system/konnectivity-agent-p4gk6"
Apr 16 18:31:34.873666 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873311 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/89c089f0-109d-41b7-8411-4debfff14a91-registration-dir\") pod \"aws-ebs-csi-driver-node-mn5nh\" (UID: \"89c089f0-109d-41b7-8411-4debfff14a91\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn5nh"
Apr 16 18:31:34.873666 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873323 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a2547dac-9955-4de0-ba29-8ca57b537b69-host-run-netns\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd"
Apr 16 18:31:34.873666 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873343 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a2547dac-9955-4de0-ba29-8ca57b537b69-host-var-lib-kubelet\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd"
Apr 16 18:31:34.873666 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873373 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a2547dac-9955-4de0-ba29-8ca57b537b69-system-cni-dir\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd"
Apr 16 18:31:34.873666 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873397 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a2547dac-9955-4de0-ba29-8ca57b537b69-host-var-lib-cni-bin\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd"
Apr 16 18:31:34.873666 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873426 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a2547dac-9955-4de0-ba29-8ca57b537b69-multus-conf-dir\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd"
Apr 16 18:31:34.873666 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873449 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/89c089f0-109d-41b7-8411-4debfff14a91-socket-dir\") pod \"aws-ebs-csi-driver-node-mn5nh\" (UID: \"89c089f0-109d-41b7-8411-4debfff14a91\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn5nh"
Apr 16 18:31:34.873666 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873512 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a2547dac-9955-4de0-ba29-8ca57b537b69-hostroot\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd"
Apr 16 18:31:34.873666 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873518 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:31:34.873666 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873370 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a2547dac-9955-4de0-ba29-8ca57b537b69-multus-cni-dir\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd"
Apr 16 18:31:34.873666 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873557 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a2547dac-9955-4de0-ba29-8ca57b537b69-host-run-k8s-cni-cncf-io\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd"
Apr 16 18:31:34.873666 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873578 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/389b3d12-4b63-4bc3-9047-fb35ce314e95-konnectivity-ca\") pod \"konnectivity-agent-p4gk6\" (UID: \"389b3d12-4b63-4bc3-9047-fb35ce314e95\") " pod="kube-system/konnectivity-agent-p4gk6"
Apr 16 18:31:34.873666 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873585 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dbffm\" (UniqueName: \"kubernetes.io/projected/89c089f0-109d-41b7-8411-4debfff14a91-kube-api-access-dbffm\") pod \"aws-ebs-csi-driver-node-mn5nh\" (UID: \"89c089f0-109d-41b7-8411-4debfff14a91\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn5nh"
Apr 16 18:31:34.874492 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873622 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a2547dac-9955-4de0-ba29-8ca57b537b69-system-cni-dir\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd"
Apr 16 18:31:34.874492 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873627 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a2547dac-9955-4de0-ba29-8ca57b537b69-os-release\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd"
Apr 16 18:31:34.874492 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873630 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a2547dac-9955-4de0-ba29-8ca57b537b69-cni-binary-copy\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd"
Apr 16 18:31:34.874492 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873656 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a2547dac-9955-4de0-ba29-8ca57b537b69-multus-socket-dir-parent\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd"
Apr 16 18:31:34.874492 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873664 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/89c089f0-109d-41b7-8411-4debfff14a91-registration-dir\") pod \"aws-ebs-csi-driver-node-mn5nh\" (UID: \"89c089f0-109d-41b7-8411-4debfff14a91\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn5nh"
Apr 16 18:31:34.874492 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873675 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a2547dac-9955-4de0-ba29-8ca57b537b69-host-var-lib-cni-bin\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd"
Apr 16 18:31:34.874492 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873681 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/89c089f0-109d-41b7-8411-4debfff14a91-sys-fs\") pod \"aws-ebs-csi-driver-node-mn5nh\" (UID: \"89c089f0-109d-41b7-8411-4debfff14a91\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn5nh"
Apr 16 18:31:34.874492 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873704 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a2547dac-9955-4de0-ba29-8ca57b537b69-multus-conf-dir\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd"
Apr 16 18:31:34.874492 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873713 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/a2547dac-9955-4de0-ba29-8ca57b537b69-host-run-k8s-cni-cncf-io\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd" Apr 16 18:31:34.874492 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873581 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a2547dac-9955-4de0-ba29-8ca57b537b69-host-var-lib-kubelet\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd" Apr 16 18:31:34.874492 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873766 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/89c089f0-109d-41b7-8411-4debfff14a91-sys-fs\") pod \"aws-ebs-csi-driver-node-mn5nh\" (UID: \"89c089f0-109d-41b7-8411-4debfff14a91\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn5nh" Apr 16 18:31:34.874492 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873772 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a2547dac-9955-4de0-ba29-8ca57b537b69-os-release\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd" Apr 16 18:31:34.874492 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873450 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a2547dac-9955-4de0-ba29-8ca57b537b69-cnibin\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd" Apr 16 18:31:34.874492 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873824 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:31:34.874492 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873835 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a2547dac-9955-4de0-ba29-8ca57b537b69-etc-kubernetes\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd" Apr 16 18:31:34.874492 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873877 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a2547dac-9955-4de0-ba29-8ca57b537b69-hostroot\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd" Apr 16 18:31:34.874492 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873910 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a2547dac-9955-4de0-ba29-8ca57b537b69-multus-socket-dir-parent\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd" Apr 16 18:31:34.874492 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873775 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a2547dac-9955-4de0-ba29-8ca57b537b69-etc-kubernetes\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd" Apr 16 18:31:34.874492 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.873995 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/89c089f0-109d-41b7-8411-4debfff14a91-socket-dir\") pod \"aws-ebs-csi-driver-node-mn5nh\" (UID: \"89c089f0-109d-41b7-8411-4debfff14a91\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn5nh" Apr 16 18:31:34.875278 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.874301 2576 kubelet_node_status.go:736] "Recording event message for node" 
node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:31:34.875278 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.874334 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:34.875278 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.874347 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:34.875278 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.874301 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:31:34.875278 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.874393 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:34.875278 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.874409 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:34.875278 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.874487 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a2547dac-9955-4de0-ba29-8ca57b537b69-multus-daemon-config\") pod \"multus-jktqd\" (UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd" Apr 16 18:31:34.875278 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.874588 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:31:34.875278 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.874611 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:34.875278 
ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.874626 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:34.875723 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:34.875679 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9e01863_f0f0_4a5e_935f_50699365f569.slice/crio-5c22e39bdcd2d6a553e667da7f9fea33f1d0a60521ef8df0baea9ae86531a888 WatchSource:0}: Error finding container 5c22e39bdcd2d6a553e667da7f9fea33f1d0a60521ef8df0baea9ae86531a888: Status 404 returned error can't find the container with id 5c22e39bdcd2d6a553e667da7f9fea33f1d0a60521ef8df0baea9ae86531a888 Apr 16 18:31:34.877050 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.876964 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/389b3d12-4b63-4bc3-9047-fb35ce314e95-agent-certs\") pod \"konnectivity-agent-p4gk6\" (UID: \"389b3d12-4b63-4bc3-9047-fb35ce314e95\") " pod="kube-system/konnectivity-agent-p4gk6" Apr 16 18:31:34.881062 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:34.881043 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10b515ec_8868_47c8_b336_6809d8756b3b.slice/crio-3915dc4944facbfd65f0430fb0c09f17d90616f2ee4160d88dd5b8aceaa7e9f1 WatchSource:0}: Error finding container 3915dc4944facbfd65f0430fb0c09f17d90616f2ee4160d88dd5b8aceaa7e9f1: Status 404 returned error can't find the container with id 3915dc4944facbfd65f0430fb0c09f17d90616f2ee4160d88dd5b8aceaa7e9f1 Apr 16 18:31:34.884543 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.884524 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9bpz\" (UniqueName: \"kubernetes.io/projected/a2547dac-9955-4de0-ba29-8ca57b537b69-kube-api-access-p9bpz\") pod \"multus-jktqd\" 
(UID: \"a2547dac-9955-4de0-ba29-8ca57b537b69\") " pod="openshift-multus/multus-jktqd" Apr 16 18:31:34.884713 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.884696 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbffm\" (UniqueName: \"kubernetes.io/projected/89c089f0-109d-41b7-8411-4debfff14a91-kube-api-access-dbffm\") pod \"aws-ebs-csi-driver-node-mn5nh\" (UID: \"89c089f0-109d-41b7-8411-4debfff14a91\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn5nh" Apr 16 18:31:34.916872 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.916839 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jktqd" Apr 16 18:31:34.923725 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:34.923696 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2547dac_9955_4de0_ba29_8ca57b537b69.slice/crio-a63c37119c4cf072c71cf9d07633f6c70470c8b2e388516381fbcb991ade66b5 WatchSource:0}: Error finding container a63c37119c4cf072c71cf9d07633f6c70470c8b2e388516381fbcb991ade66b5: Status 404 returned error can't find the container with id a63c37119c4cf072c71cf9d07633f6c70470c8b2e388516381fbcb991ade66b5 Apr 16 18:31:34.950879 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:34.950842 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:34.965107 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.965078 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-p4gk6" Apr 16 18:31:34.967896 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:34.967878 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn5nh" Apr 16 18:31:34.973164 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:34.973137 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod389b3d12_4b63_4bc3_9047_fb35ce314e95.slice/crio-d428b255e26cc9964233a2c51d02c7c6879b01ecb176c89270a020e91de5b262 WatchSource:0}: Error finding container d428b255e26cc9964233a2c51d02c7c6879b01ecb176c89270a020e91de5b262: Status 404 returned error can't find the container with id d428b255e26cc9964233a2c51d02c7c6879b01ecb176c89270a020e91de5b262 Apr 16 18:31:34.975813 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:34.975793 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89c089f0_109d_41b7_8411_4debfff14a91.slice/crio-de8685428ec0474bd6dafd8bf74183c6df7fc389914e6b4d7b0839405365b734 WatchSource:0}: Error finding container de8685428ec0474bd6dafd8bf74183c6df7fc389914e6b4d7b0839405365b734: Status 404 returned error can't find the container with id de8685428ec0474bd6dafd8bf74183c6df7fc389914e6b4d7b0839405365b734 Apr 16 18:31:35.050946 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:35.050908 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:35.151635 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:35.151523 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:35.176151 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:35.176103 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/697ddfb3-adc9-4a63-b5ca-b4b871946a33-metrics-certs\") pod \"network-metrics-daemon-cl74p\" (UID: \"697ddfb3-adc9-4a63-b5ca-b4b871946a33\") " 
pod="openshift-multus/network-metrics-daemon-cl74p" Apr 16 18:31:35.176314 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:35.176242 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:31:35.177849 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:35.177557 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:31:35.177849 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:35.177596 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:35.177849 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:35.177610 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:35.177849 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:35.177755 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:31:35.177849 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:35.177809 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/697ddfb3-adc9-4a63-b5ca-b4b871946a33-metrics-certs podName:697ddfb3-adc9-4a63-b5ca-b4b871946a33 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:36.177790713 +0000 UTC m=+22.302318353 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/697ddfb3-adc9-4a63-b5ca-b4b871946a33-metrics-certs") pod "network-metrics-daemon-cl74p" (UID: "697ddfb3-adc9-4a63-b5ca-b4b871946a33") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:31:35.252109 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:35.252066 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:35.277003 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:35.276965 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wlkrx\" (UniqueName: \"kubernetes.io/projected/5f3c3f50-8ee2-4775-a3de-64e723a55361-kube-api-access-wlkrx\") pod \"network-check-target-q67qs\" (UID: \"5f3c3f50-8ee2-4775-a3de-64e723a55361\") " pod="openshift-network-diagnostics/network-check-target-q67qs" Apr 16 18:31:35.277195 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:35.277118 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:31:35.278185 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:35.278162 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:31:35.278310 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:35.278199 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:35.278310 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:35.278213 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:35.278408 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:35.278345 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:31:35.278408 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:35.278362 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:31:35.278408 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:35.278373 2576 projected.go:194] Error preparing data for projected volume kube-api-access-wlkrx for pod openshift-network-diagnostics/network-check-target-q67qs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:31:35.278539 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:35.278430 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5f3c3f50-8ee2-4775-a3de-64e723a55361-kube-api-access-wlkrx podName:5f3c3f50-8ee2-4775-a3de-64e723a55361 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:36.278409315 +0000 UTC m=+22.402936974 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-wlkrx" (UniqueName: "kubernetes.io/projected/5f3c3f50-8ee2-4775-a3de-64e723a55361-kube-api-access-wlkrx") pod "network-check-target-q67qs" (UID: "5f3c3f50-8ee2-4775-a3de-64e723a55361") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:31:35.353029 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:35.352994 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:35.454199 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:35.454111 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:35.458421 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:35.458379 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:26:34 +0000 UTC" deadline="2027-11-25 01:32:49.366188568 +0000 UTC" Apr 16 18:31:35.458421 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:35.458419 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14095h1m13.907773524s" Apr 16 18:31:35.554794 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:35.554766 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:35.568338 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:35.568302 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn5nh" event={"ID":"89c089f0-109d-41b7-8411-4debfff14a91","Type":"ContainerStarted","Data":"de8685428ec0474bd6dafd8bf74183c6df7fc389914e6b4d7b0839405365b734"} Apr 16 18:31:35.571631 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:35.571596 
2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jktqd" event={"ID":"a2547dac-9955-4de0-ba29-8ca57b537b69","Type":"ContainerStarted","Data":"a63c37119c4cf072c71cf9d07633f6c70470c8b2e388516381fbcb991ade66b5"} Apr 16 18:31:35.573624 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:35.573593 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wn2l4" event={"ID":"48a38ebc-2033-4b86-99f7-c22d4b6e6ccc","Type":"ContainerStarted","Data":"17e5da28a602f4cc6a06d15a2cd275b2d26430a8e69d26c2192b5a0cc55db6eb"} Apr 16 18:31:35.580491 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:35.580435 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-hpgn4" event={"ID":"605560b8-ea93-4ad3-9510-324082bdc13f","Type":"ContainerStarted","Data":"9becad72b8ed0d87a375ae1313228d2d53e022f803344d3eb679f1900d4c4cfb"} Apr 16 18:31:35.588800 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:35.588683 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-p4gk6" event={"ID":"389b3d12-4b63-4bc3-9047-fb35ce314e95","Type":"ContainerStarted","Data":"d428b255e26cc9964233a2c51d02c7c6879b01ecb176c89270a020e91de5b262"} Apr 16 18:31:35.597269 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:35.596995 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-s46p6" event={"ID":"10b515ec-8868-47c8-b336-6809d8756b3b","Type":"ContainerStarted","Data":"3915dc4944facbfd65f0430fb0c09f17d90616f2ee4160d88dd5b8aceaa7e9f1"} Apr 16 18:31:35.608835 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:35.608774 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78v92" event={"ID":"d9e01863-f0f0-4a5e-935f-50699365f569","Type":"ContainerStarted","Data":"5c22e39bdcd2d6a553e667da7f9fea33f1d0a60521ef8df0baea9ae86531a888"} Apr 16 18:31:35.612645 ip-10-0-140-154 
kubenswrapper[2576]: I0416 18:31:35.612608 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bnn4v" event={"ID":"5b4ead4a-0df1-4977-9c23-9828175001c0","Type":"ContainerStarted","Data":"8fe4ee10d4998a7a04002803523f84dcbd12a7618776b06bbd8d23a85065a800"} Apr 16 18:31:35.616561 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:35.616527 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8pbpt" event={"ID":"412ee020-08a8-48fa-a1be-e9b5ca0d1cb5","Type":"ContainerStarted","Data":"b8a12eab108ab6b96ea80d2f6b5b84b5babd83c8ef7df4b2d1da487e83dded20"} Apr 16 18:31:35.623977 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:35.623949 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal_28c07a342a30c0e354482d7284dcbb2c/kube-rbac-proxy-crio/2.log" Apr 16 18:31:35.654919 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:35.654883 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:35.755898 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:35.755803 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:35.856834 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:35.856800 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:35.957689 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:35.957648 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:36.058651 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:36.058565 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 
18:31:36.159467 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:36.159434 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:36.184021 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:36.183980 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/697ddfb3-adc9-4a63-b5ca-b4b871946a33-metrics-certs\") pod \"network-metrics-daemon-cl74p\" (UID: \"697ddfb3-adc9-4a63-b5ca-b4b871946a33\") " pod="openshift-multus/network-metrics-daemon-cl74p" Apr 16 18:31:36.184208 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:36.184145 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:31:36.186790 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:36.186764 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:31:36.186919 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:36.186803 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:36.186919 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:36.186817 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:36.187039 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:36.186951 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:31:36.187039 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:36.187007 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/697ddfb3-adc9-4a63-b5ca-b4b871946a33-metrics-certs podName:697ddfb3-adc9-4a63-b5ca-b4b871946a33 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:31:38.186985214 +0000 UTC m=+24.311512871 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/697ddfb3-adc9-4a63-b5ca-b4b871946a33-metrics-certs") pod "network-metrics-daemon-cl74p" (UID: "697ddfb3-adc9-4a63-b5ca-b4b871946a33") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:31:36.260536 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:36.260495 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:36.285527 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:36.285144 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wlkrx\" (UniqueName: \"kubernetes.io/projected/5f3c3f50-8ee2-4775-a3de-64e723a55361-kube-api-access-wlkrx\") pod \"network-check-target-q67qs\" (UID: \"5f3c3f50-8ee2-4775-a3de-64e723a55361\") " pod="openshift-network-diagnostics/network-check-target-q67qs" Apr 16 18:31:36.285527 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:36.285294 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:31:36.286492 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:36.286469 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:31:36.286622 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:36.286505 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:36.286622 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:36.286521 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:36.286715 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:36.286660 2576 projected.go:289] 
Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:31:36.286715 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:36.286677 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:31:36.286715 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:36.286689 2576 projected.go:194] Error preparing data for projected volume kube-api-access-wlkrx for pod openshift-network-diagnostics/network-check-target-q67qs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:31:36.286863 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:36.286754 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5f3c3f50-8ee2-4775-a3de-64e723a55361-kube-api-access-wlkrx podName:5f3c3f50-8ee2-4775-a3de-64e723a55361 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:38.286720749 +0000 UTC m=+24.411248392 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-wlkrx" (UniqueName: "kubernetes.io/projected/5f3c3f50-8ee2-4775-a3de-64e723a55361-kube-api-access-wlkrx") pod "network-check-target-q67qs" (UID: "5f3c3f50-8ee2-4775-a3de-64e723a55361") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:31:36.361159 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:36.361075 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:36.459277 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:36.459222 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:26:34 +0000 UTC" deadline="2027-10-28 01:47:37.640726596 +0000 UTC"
Apr 16 18:31:36.459277 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:36.459260 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13423h16m1.181471517s"
Apr 16 18:31:36.461550 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:36.461464 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:36.525770 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:36.525335 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cl74p"
Apr 16 18:31:36.525770 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:36.525381 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:31:36.525770 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:36.525620 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q67qs"
Apr 16 18:31:36.525770 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:36.525647 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:31:36.531057 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:36.531030 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:31:36.531266 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:36.531251 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:31:36.531437 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:36.531424 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:31:36.531570 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:36.531279 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:31:36.531642 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:36.531588 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:31:36.531642 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:36.531603 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:31:36.531883 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:36.531853 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cl74p" podUID="697ddfb3-adc9-4a63-b5ca-b4b871946a33"
Apr 16 18:31:36.532071 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:36.532048 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q67qs" podUID="5f3c3f50-8ee2-4775-a3de-64e723a55361"
Apr 16 18:31:36.562208 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:36.562156 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:36.662388 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:36.662285 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:36.763237 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:36.763182 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:36.863699 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:36.863647 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:36.964656 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:36.964547 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:37.065593 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:37.065522 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:37.165718 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:37.165677 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:37.266877 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:37.266793 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:37.366959 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:37.366905 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:37.467333 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:37.467293 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:37.568163 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:37.568085 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:37.668417 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:37.668363 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:37.769333 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:37.769301 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:37.869637 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:37.869514 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:37.969777 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:37.969728 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:38.070537 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:38.070504 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:38.171663 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:38.171563 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:38.201384 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:38.201314 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/697ddfb3-adc9-4a63-b5ca-b4b871946a33-metrics-certs\") pod \"network-metrics-daemon-cl74p\" (UID: \"697ddfb3-adc9-4a63-b5ca-b4b871946a33\") " pod="openshift-multus/network-metrics-daemon-cl74p"
Apr 16 18:31:38.201651 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:38.201464 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:31:38.202798 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:38.202660 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:31:38.202798 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:38.202694 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:31:38.202798 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:38.202709 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:31:38.203012 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:38.202857 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:31:38.203012 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:38.202909 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/697ddfb3-adc9-4a63-b5ca-b4b871946a33-metrics-certs podName:697ddfb3-adc9-4a63-b5ca-b4b871946a33 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:42.202890555 +0000 UTC m=+28.327418203 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/697ddfb3-adc9-4a63-b5ca-b4b871946a33-metrics-certs") pod "network-metrics-daemon-cl74p" (UID: "697ddfb3-adc9-4a63-b5ca-b4b871946a33") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:31:38.272678 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:38.272643 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:38.302391 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:38.302353 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wlkrx\" (UniqueName: \"kubernetes.io/projected/5f3c3f50-8ee2-4775-a3de-64e723a55361-kube-api-access-wlkrx\") pod \"network-check-target-q67qs\" (UID: \"5f3c3f50-8ee2-4775-a3de-64e723a55361\") " pod="openshift-network-diagnostics/network-check-target-q67qs"
Apr 16 18:31:38.302578 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:38.302493 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:31:38.304722 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:38.303869 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:31:38.304722 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:38.303905 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:31:38.304722 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:38.303921 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:31:38.304722 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:38.304059 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:31:38.304722 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:38.304074 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:31:38.304722 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:38.304086 2576 projected.go:194] Error preparing data for projected volume kube-api-access-wlkrx for pod openshift-network-diagnostics/network-check-target-q67qs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:31:38.304722 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:38.304137 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5f3c3f50-8ee2-4775-a3de-64e723a55361-kube-api-access-wlkrx podName:5f3c3f50-8ee2-4775-a3de-64e723a55361 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:42.304117856 +0000 UTC m=+28.428645502 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-wlkrx" (UniqueName: "kubernetes.io/projected/5f3c3f50-8ee2-4775-a3de-64e723a55361-kube-api-access-wlkrx") pod "network-check-target-q67qs" (UID: "5f3c3f50-8ee2-4775-a3de-64e723a55361") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:31:38.373822 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:38.373769 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:38.474295 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:38.474192 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:38.505372 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:38.505046 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cl74p"
Apr 16 18:31:38.505372 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:38.505084 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:31:38.505372 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:38.505273 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q67qs"
Apr 16 18:31:38.505372 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:38.505295 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:31:38.506768 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:38.506726 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:31:38.508041 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:38.506901 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:31:38.508041 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:38.506919 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:31:38.508041 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:38.507099 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q67qs" podUID="5f3c3f50-8ee2-4775-a3de-64e723a55361"
Apr 16 18:31:38.508041 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:38.507781 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:31:38.508041 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:38.507805 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:31:38.508041 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:38.507819 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:31:38.508041 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:38.507989 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cl74p" podUID="697ddfb3-adc9-4a63-b5ca-b4b871946a33"
Apr 16 18:31:38.574777 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:38.574731 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:38.675341 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:38.675309 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:38.776082 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:38.775975 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:38.876918 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:38.876877 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:38.977735 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:38.977666 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:39.078335 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:39.078247 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:39.178912 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:39.178875 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:39.279598 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:39.279555 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:39.380503 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:39.380412 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:39.481546 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:39.481508 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:39.582373 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:39.582309 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:39.682873 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:39.682789 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:39.783694 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:39.783656 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:39.883901 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:39.883861 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:39.984898 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:39.984805 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:40.085645 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:40.085596 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:40.186301 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:40.186267 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:40.286988 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:40.286901 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:40.387193 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:40.387150 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:40.488261 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:40.488207 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:40.505792 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:40.505459 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cl74p"
Apr 16 18:31:40.505792 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:40.505495 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:31:40.505792 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:40.505669 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q67qs"
Apr 16 18:31:40.505792 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:40.505691 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:31:40.506789 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:40.506540 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:31:40.506789 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:40.506564 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:31:40.506789 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:40.506572 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:31:40.506789 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:40.506582 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:31:40.506789 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:40.506596 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:31:40.506789 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:40.506621 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:31:40.507157 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:40.506797 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cl74p" podUID="697ddfb3-adc9-4a63-b5ca-b4b871946a33"
Apr 16 18:31:40.507157 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:40.506873 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q67qs" podUID="5f3c3f50-8ee2-4775-a3de-64e723a55361"
Apr 16 18:31:40.588775 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:40.588666 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:40.689726 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:40.689678 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:40.790310 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:40.790257 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:40.890664 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:40.890573 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:40.991359 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:40.991325 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:41.091994 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:41.091951 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:41.192853 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:41.192768 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:41.293922 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:41.293886 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:41.394526 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:41.394490 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:41.495702 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:41.495613 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:41.596532 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:41.596496 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:41.697078 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:41.697031 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:41.798118 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:41.798028 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:41.898643 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:41.898607 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:41.999463 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:41.999426 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:42.100155 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:42.100041 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:42.200215 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:42.200169 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:42.233783 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:42.233655 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/697ddfb3-adc9-4a63-b5ca-b4b871946a33-metrics-certs\") pod \"network-metrics-daemon-cl74p\" (UID: \"697ddfb3-adc9-4a63-b5ca-b4b871946a33\") " pod="openshift-multus/network-metrics-daemon-cl74p"
Apr 16 18:31:42.233971 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:42.233811 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:31:42.234881 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:42.234857 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:31:42.235010 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:42.234897 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:31:42.235010 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:42.234911 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:31:42.235109 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:42.235037 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:31:42.235109 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:42.235092 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/697ddfb3-adc9-4a63-b5ca-b4b871946a33-metrics-certs podName:697ddfb3-adc9-4a63-b5ca-b4b871946a33 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:50.235072436 +0000 UTC m=+36.359600093 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/697ddfb3-adc9-4a63-b5ca-b4b871946a33-metrics-certs") pod "network-metrics-daemon-cl74p" (UID: "697ddfb3-adc9-4a63-b5ca-b4b871946a33") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:31:42.300998 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:42.300953 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:42.334941 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:42.334558 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wlkrx\" (UniqueName: \"kubernetes.io/projected/5f3c3f50-8ee2-4775-a3de-64e723a55361-kube-api-access-wlkrx\") pod \"network-check-target-q67qs\" (UID: \"5f3c3f50-8ee2-4775-a3de-64e723a55361\") " pod="openshift-network-diagnostics/network-check-target-q67qs"
Apr 16 18:31:42.334941 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:42.334736 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:31:42.336967 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:42.336939 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:31:42.337084 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:42.336977 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:31:42.337084 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:42.336993 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:31:42.337232 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:42.337166 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:31:42.337232 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:42.337182 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:31:42.337232 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:42.337194 2576 projected.go:194] Error preparing data for projected volume kube-api-access-wlkrx for pod openshift-network-diagnostics/network-check-target-q67qs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:31:42.337369 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:42.337245 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5f3c3f50-8ee2-4775-a3de-64e723a55361-kube-api-access-wlkrx podName:5f3c3f50-8ee2-4775-a3de-64e723a55361 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:50.337224219 +0000 UTC m=+36.461751868 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-wlkrx" (UniqueName: "kubernetes.io/projected/5f3c3f50-8ee2-4775-a3de-64e723a55361-kube-api-access-wlkrx") pod "network-check-target-q67qs" (UID: "5f3c3f50-8ee2-4775-a3de-64e723a55361") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:31:42.401708 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:42.401625 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:42.502167 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:42.501878 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q67qs"
Apr 16 18:31:42.502167 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:42.501930 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:31:42.502167 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:42.501978 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cl74p"
Apr 16 18:31:42.502167 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:42.502021 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:31:42.502167 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:42.502118 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:42.504238 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:42.503340 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:31:42.504238 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:42.503372 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:31:42.504238 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:42.503388 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:31:42.504238 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:42.503580 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q67qs" podUID="5f3c3f50-8ee2-4775-a3de-64e723a55361"
Apr 16 18:31:42.504238 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:42.503340 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:31:42.504238 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:42.504169 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:31:42.504238 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:42.504186 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:31:42.504620 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:42.504361 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cl74p" podUID="697ddfb3-adc9-4a63-b5ca-b4b871946a33"
Apr 16 18:31:42.602683 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:42.602648 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:42.703794 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:42.703671 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:42.804597 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:42.804560 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:42.905126 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:42.905089 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:43.005773 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:43.005672 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:43.106301 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:43.106267 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:43.207004 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:43.206967 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:43.307851 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:43.307774 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found"
Apr 16 18:31:43.408281 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:43.408241 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node
\"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:43.508530 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:43.508492 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:43.608772 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:43.608730 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:43.709468 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:43.709434 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:43.809951 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:43.809913 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:43.910889 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:43.910797 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:44.011470 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:44.011434 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:44.111684 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:44.111651 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:44.212872 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:44.212797 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:44.313216 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:44.313176 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:44.414279 
ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:44.414246 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:44.433777 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:44.433729 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:44.501676 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:44.501585 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q67qs" Apr 16 18:31:44.501676 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:44.501640 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:31:44.501904 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:44.501673 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cl74p" Apr 16 18:31:44.501904 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:44.501723 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:31:44.502946 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:44.502920 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:31:44.503073 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:44.502961 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:44.503073 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:44.502975 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:44.503073 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:44.502922 2576 kubelet_node_status.go:736] 
"Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:31:44.503073 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:44.503028 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:44.503073 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:44.503044 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:44.503319 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:44.503146 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cl74p" podUID="697ddfb3-adc9-4a63-b5ca-b4b871946a33" Apr 16 18:31:44.503319 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:44.503195 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-q67qs" podUID="5f3c3f50-8ee2-4775-a3de-64e723a55361" Apr 16 18:31:44.514826 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:44.514789 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:44.615403 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:44.615368 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:44.664720 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:44.664684 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-140-154.ec2.internal\": node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:44.715850 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:44.715815 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:44.816014 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:44.815935 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:44.916797 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:44.916763 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:45.017803 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:45.017769 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:45.118525 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:45.118441 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:45.219466 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:45.219424 2576 kubelet_node_status.go:515] "Error getting 
the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:45.320181 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:45.320142 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:45.420541 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:45.420443 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:45.521085 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:45.521052 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:45.621391 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:45.621357 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:45.722356 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:45.722275 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:45.823380 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:45.823348 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:45.924141 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:45.924103 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:46.024732 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:46.024650 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:46.125447 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:46.125402 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" 
not found" Apr 16 18:31:46.225925 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:46.225893 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:46.326791 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:46.326704 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:46.427630 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:46.427595 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:46.501553 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:46.501496 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cl74p" Apr 16 18:31:46.501553 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:46.501547 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:31:46.501814 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:46.501554 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q67qs" Apr 16 18:31:46.501814 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:46.501596 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:31:46.502787 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:46.502757 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:31:46.502909 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:46.502783 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:31:46.502909 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:46.502818 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:46.502909 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:46.502834 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:46.502909 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:46.502793 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:31:46.503115 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:46.502918 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:31:46.503115 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:46.503041 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cl74p" podUID="697ddfb3-adc9-4a63-b5ca-b4b871946a33" Apr 16 18:31:46.503213 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:46.503126 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q67qs" podUID="5f3c3f50-8ee2-4775-a3de-64e723a55361" Apr 16 18:31:46.528457 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:46.528416 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:46.629119 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:46.629035 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:46.729932 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:46.729892 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:46.830604 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:46.830568 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:46.930998 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:46.930900 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:47.031757 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:47.031713 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:47.132852 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:47.132817 2576 kubelet_node_status.go:515] "Error 
getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:47.233805 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:47.233719 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:47.334574 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:47.334532 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:47.435691 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:47.435642 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:47.536729 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:47.536650 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:47.637626 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:47.637582 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:47.738457 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:47.738424 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:47.838828 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:47.838735 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:47.939622 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:47.939583 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:48.040244 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:48.040207 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node 
\"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:48.140946 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:48.140868 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:48.241615 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:48.241578 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:48.342369 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:48.342336 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:48.443240 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:48.443156 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-154.ec2.internal\" not found" Apr 16 18:31:48.489795 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:48.489762 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:31:48.501226 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:48.501191 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q67qs" Apr 16 18:31:48.501391 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:48.501319 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q67qs" podUID="5f3c3f50-8ee2-4775-a3de-64e723a55361" Apr 16 18:31:48.501391 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:48.501372 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cl74p" Apr 16 18:31:48.501506 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:48.501471 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cl74p" podUID="697ddfb3-adc9-4a63-b5ca-b4b871946a33" Apr 16 18:31:48.564167 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:48.564129 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal" Apr 16 18:31:48.578432 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:48.578400 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal"] Apr 16 18:31:48.578611 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:48.578518 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 18:31:48.578657 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:48.578618 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-154.ec2.internal" Apr 16 18:31:48.578930 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:48.578895 2576 scope.go:117] "RemoveContainer" containerID="284fd67504784abbd9ab23c1bd3a8ecf717c8e781eb54c699fc73d346f64a03d" Apr 16 18:31:48.579135 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:48.579113 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio 
pod=kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal_openshift-machine-config-operator(28c07a342a30c0e354482d7284dcbb2c)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal" podUID="28c07a342a30c0e354482d7284dcbb2c" Apr 16 18:31:48.587769 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:48.587729 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 18:31:48.588547 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:48.588528 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-140-154.ec2.internal"] Apr 16 18:31:49.135536 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:49.135501 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:31:50.297594 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:50.297561 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/697ddfb3-adc9-4a63-b5ca-b4b871946a33-metrics-certs\") pod \"network-metrics-daemon-cl74p\" (UID: \"697ddfb3-adc9-4a63-b5ca-b4b871946a33\") " pod="openshift-multus/network-metrics-daemon-cl74p" Apr 16 18:31:50.298031 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:50.297699 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:31:50.298031 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:50.297779 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/697ddfb3-adc9-4a63-b5ca-b4b871946a33-metrics-certs podName:697ddfb3-adc9-4a63-b5ca-b4b871946a33 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:06.297757419 +0000 UTC m=+52.422285082 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/697ddfb3-adc9-4a63-b5ca-b4b871946a33-metrics-certs") pod "network-metrics-daemon-cl74p" (UID: "697ddfb3-adc9-4a63-b5ca-b4b871946a33") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:31:50.398239 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:50.398201 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wlkrx\" (UniqueName: \"kubernetes.io/projected/5f3c3f50-8ee2-4775-a3de-64e723a55361-kube-api-access-wlkrx\") pod \"network-check-target-q67qs\" (UID: \"5f3c3f50-8ee2-4775-a3de-64e723a55361\") " pod="openshift-network-diagnostics/network-check-target-q67qs" Apr 16 18:31:50.398415 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:50.398372 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:31:50.398415 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:50.398397 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:31:50.398415 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:50.398410 2576 projected.go:194] Error preparing data for projected volume kube-api-access-wlkrx for pod openshift-network-diagnostics/network-check-target-q67qs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:31:50.398520 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:50.398469 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5f3c3f50-8ee2-4775-a3de-64e723a55361-kube-api-access-wlkrx podName:5f3c3f50-8ee2-4775-a3de-64e723a55361 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:32:06.398454501 +0000 UTC m=+52.522982142 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-wlkrx" (UniqueName: "kubernetes.io/projected/5f3c3f50-8ee2-4775-a3de-64e723a55361-kube-api-access-wlkrx") pod "network-check-target-q67qs" (UID: "5f3c3f50-8ee2-4775-a3de-64e723a55361") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:31:50.501407 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:50.501373 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q67qs" Apr 16 18:31:50.501580 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:50.501508 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q67qs" podUID="5f3c3f50-8ee2-4775-a3de-64e723a55361" Apr 16 18:31:50.501580 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:50.501563 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cl74p" Apr 16 18:31:50.501719 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:50.501691 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cl74p" podUID="697ddfb3-adc9-4a63-b5ca-b4b871946a33" Apr 16 18:31:51.660599 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:51.660407 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-p4gk6" event={"ID":"389b3d12-4b63-4bc3-9047-fb35ce314e95","Type":"ContainerStarted","Data":"0ac7a8623ab481e5ec51c28928942faad030a961d4c11886625839f06e6fbc67"} Apr 16 18:31:52.253289 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.253046 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-nsjd9"] Apr 16 18:31:52.256187 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.256160 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-nsjd9" Apr 16 18:31:52.259119 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.259073 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-fwjvq\"" Apr 16 18:31:52.259119 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.259104 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 18:31:52.262444 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.260225 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 18:31:52.262444 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.260250 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 18:31:52.262444 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.260631 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 18:31:52.262444 ip-10-0-140-154 kubenswrapper[2576]: I0416 
18:31:52.261353 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 18:31:52.262713 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.262553 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 18:31:52.310679 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.310647 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/07ae8592-c64d-4565-9efc-bc0241db5258-metrics-client-ca\") pod \"node-exporter-nsjd9\" (UID: \"07ae8592-c64d-4565-9efc-bc0241db5258\") " pod="openshift-monitoring/node-exporter-nsjd9" Apr 16 18:31:52.310863 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.310692 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/07ae8592-c64d-4565-9efc-bc0241db5258-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nsjd9\" (UID: \"07ae8592-c64d-4565-9efc-bc0241db5258\") " pod="openshift-monitoring/node-exporter-nsjd9" Apr 16 18:31:52.310863 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.310730 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/07ae8592-c64d-4565-9efc-bc0241db5258-node-exporter-tls\") pod \"node-exporter-nsjd9\" (UID: \"07ae8592-c64d-4565-9efc-bc0241db5258\") " pod="openshift-monitoring/node-exporter-nsjd9" Apr 16 18:31:52.310863 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.310766 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/07ae8592-c64d-4565-9efc-bc0241db5258-node-exporter-wtmp\") pod 
\"node-exporter-nsjd9\" (UID: \"07ae8592-c64d-4565-9efc-bc0241db5258\") " pod="openshift-monitoring/node-exporter-nsjd9" Apr 16 18:31:52.310863 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.310834 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/07ae8592-c64d-4565-9efc-bc0241db5258-root\") pod \"node-exporter-nsjd9\" (UID: \"07ae8592-c64d-4565-9efc-bc0241db5258\") " pod="openshift-monitoring/node-exporter-nsjd9" Apr 16 18:31:52.311080 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.310873 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/07ae8592-c64d-4565-9efc-bc0241db5258-node-exporter-accelerators-collector-config\") pod \"node-exporter-nsjd9\" (UID: \"07ae8592-c64d-4565-9efc-bc0241db5258\") " pod="openshift-monitoring/node-exporter-nsjd9" Apr 16 18:31:52.311080 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.310905 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s5fh\" (UniqueName: \"kubernetes.io/projected/07ae8592-c64d-4565-9efc-bc0241db5258-kube-api-access-8s5fh\") pod \"node-exporter-nsjd9\" (UID: \"07ae8592-c64d-4565-9efc-bc0241db5258\") " pod="openshift-monitoring/node-exporter-nsjd9" Apr 16 18:31:52.311080 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.310962 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/07ae8592-c64d-4565-9efc-bc0241db5258-sys\") pod \"node-exporter-nsjd9\" (UID: \"07ae8592-c64d-4565-9efc-bc0241db5258\") " pod="openshift-monitoring/node-exporter-nsjd9" Apr 16 18:31:52.311080 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.310985 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/07ae8592-c64d-4565-9efc-bc0241db5258-node-exporter-textfile\") pod \"node-exporter-nsjd9\" (UID: \"07ae8592-c64d-4565-9efc-bc0241db5258\") " pod="openshift-monitoring/node-exporter-nsjd9" Apr 16 18:31:52.411653 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.411618 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/07ae8592-c64d-4565-9efc-bc0241db5258-node-exporter-tls\") pod \"node-exporter-nsjd9\" (UID: \"07ae8592-c64d-4565-9efc-bc0241db5258\") " pod="openshift-monitoring/node-exporter-nsjd9" Apr 16 18:31:52.411864 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.411665 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/07ae8592-c64d-4565-9efc-bc0241db5258-node-exporter-wtmp\") pod \"node-exporter-nsjd9\" (UID: \"07ae8592-c64d-4565-9efc-bc0241db5258\") " pod="openshift-monitoring/node-exporter-nsjd9" Apr 16 18:31:52.411864 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.411698 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/07ae8592-c64d-4565-9efc-bc0241db5258-root\") pod \"node-exporter-nsjd9\" (UID: \"07ae8592-c64d-4565-9efc-bc0241db5258\") " pod="openshift-monitoring/node-exporter-nsjd9" Apr 16 18:31:52.411864 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.411723 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/07ae8592-c64d-4565-9efc-bc0241db5258-node-exporter-accelerators-collector-config\") pod \"node-exporter-nsjd9\" (UID: \"07ae8592-c64d-4565-9efc-bc0241db5258\") " pod="openshift-monitoring/node-exporter-nsjd9" Apr 16 18:31:52.411864 ip-10-0-140-154 kubenswrapper[2576]: I0416 
18:31:52.411762 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8s5fh\" (UniqueName: \"kubernetes.io/projected/07ae8592-c64d-4565-9efc-bc0241db5258-kube-api-access-8s5fh\") pod \"node-exporter-nsjd9\" (UID: \"07ae8592-c64d-4565-9efc-bc0241db5258\") " pod="openshift-monitoring/node-exporter-nsjd9" Apr 16 18:31:52.411864 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.411809 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/07ae8592-c64d-4565-9efc-bc0241db5258-root\") pod \"node-exporter-nsjd9\" (UID: \"07ae8592-c64d-4565-9efc-bc0241db5258\") " pod="openshift-monitoring/node-exporter-nsjd9" Apr 16 18:31:52.411864 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.411817 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/07ae8592-c64d-4565-9efc-bc0241db5258-sys\") pod \"node-exporter-nsjd9\" (UID: \"07ae8592-c64d-4565-9efc-bc0241db5258\") " pod="openshift-monitoring/node-exporter-nsjd9" Apr 16 18:31:52.411864 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:52.411833 2576 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 18:31:52.412180 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.411866 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/07ae8592-c64d-4565-9efc-bc0241db5258-node-exporter-wtmp\") pod \"node-exporter-nsjd9\" (UID: \"07ae8592-c64d-4565-9efc-bc0241db5258\") " pod="openshift-monitoring/node-exporter-nsjd9" Apr 16 18:31:52.412180 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.411882 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/07ae8592-c64d-4565-9efc-bc0241db5258-sys\") pod \"node-exporter-nsjd9\" (UID: 
\"07ae8592-c64d-4565-9efc-bc0241db5258\") " pod="openshift-monitoring/node-exporter-nsjd9" Apr 16 18:31:52.412180 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.411841 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/07ae8592-c64d-4565-9efc-bc0241db5258-node-exporter-textfile\") pod \"node-exporter-nsjd9\" (UID: \"07ae8592-c64d-4565-9efc-bc0241db5258\") " pod="openshift-monitoring/node-exporter-nsjd9" Apr 16 18:31:52.412180 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:52.411909 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07ae8592-c64d-4565-9efc-bc0241db5258-node-exporter-tls podName:07ae8592-c64d-4565-9efc-bc0241db5258 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:52.911885615 +0000 UTC m=+39.036413270 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/07ae8592-c64d-4565-9efc-bc0241db5258-node-exporter-tls") pod "node-exporter-nsjd9" (UID: "07ae8592-c64d-4565-9efc-bc0241db5258") : secret "node-exporter-tls" not found Apr 16 18:31:52.412180 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.412022 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/07ae8592-c64d-4565-9efc-bc0241db5258-metrics-client-ca\") pod \"node-exporter-nsjd9\" (UID: \"07ae8592-c64d-4565-9efc-bc0241db5258\") " pod="openshift-monitoring/node-exporter-nsjd9" Apr 16 18:31:52.412180 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.412050 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/07ae8592-c64d-4565-9efc-bc0241db5258-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nsjd9\" (UID: \"07ae8592-c64d-4565-9efc-bc0241db5258\") " 
pod="openshift-monitoring/node-exporter-nsjd9" Apr 16 18:31:52.412490 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.412465 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/07ae8592-c64d-4565-9efc-bc0241db5258-node-exporter-accelerators-collector-config\") pod \"node-exporter-nsjd9\" (UID: \"07ae8592-c64d-4565-9efc-bc0241db5258\") " pod="openshift-monitoring/node-exporter-nsjd9" Apr 16 18:31:52.412657 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.412632 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/07ae8592-c64d-4565-9efc-bc0241db5258-metrics-client-ca\") pod \"node-exporter-nsjd9\" (UID: \"07ae8592-c64d-4565-9efc-bc0241db5258\") " pod="openshift-monitoring/node-exporter-nsjd9" Apr 16 18:31:52.412784 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.412711 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/07ae8592-c64d-4565-9efc-bc0241db5258-node-exporter-textfile\") pod \"node-exporter-nsjd9\" (UID: \"07ae8592-c64d-4565-9efc-bc0241db5258\") " pod="openshift-monitoring/node-exporter-nsjd9" Apr 16 18:31:52.416334 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.416312 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/07ae8592-c64d-4565-9efc-bc0241db5258-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nsjd9\" (UID: \"07ae8592-c64d-4565-9efc-bc0241db5258\") " pod="openshift-monitoring/node-exporter-nsjd9" Apr 16 18:31:52.422180 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.422157 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s5fh\" (UniqueName: 
\"kubernetes.io/projected/07ae8592-c64d-4565-9efc-bc0241db5258-kube-api-access-8s5fh\") pod \"node-exporter-nsjd9\" (UID: \"07ae8592-c64d-4565-9efc-bc0241db5258\") " pod="openshift-monitoring/node-exporter-nsjd9" Apr 16 18:31:52.501797 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.501732 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q67qs" Apr 16 18:31:52.501965 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:52.501891 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q67qs" podUID="5f3c3f50-8ee2-4775-a3de-64e723a55361" Apr 16 18:31:52.501965 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.501902 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cl74p" Apr 16 18:31:52.502071 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:52.502001 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cl74p" podUID="697ddfb3-adc9-4a63-b5ca-b4b871946a33" Apr 16 18:31:52.665418 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.665372 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78v92" event={"ID":"d9e01863-f0f0-4a5e-935f-50699365f569","Type":"ContainerStarted","Data":"1272e737dd5130e6ad48ba51d937f86e3e14fed598c35fb73bb48ac6ec254fcd"} Apr 16 18:31:52.665418 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.665425 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78v92" event={"ID":"d9e01863-f0f0-4a5e-935f-50699365f569","Type":"ContainerStarted","Data":"ac0c9c5c01ded6d237d58db16d9ff45e26e009739a431bd14b50e8dbba15385a"} Apr 16 18:31:52.666258 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.665441 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78v92" event={"ID":"d9e01863-f0f0-4a5e-935f-50699365f569","Type":"ContainerStarted","Data":"9341eb66aa0c5bf8316a5a94c19e60f27a5695241e3a080f1948464c7a360d59"} Apr 16 18:31:52.666258 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.665456 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78v92" event={"ID":"d9e01863-f0f0-4a5e-935f-50699365f569","Type":"ContainerStarted","Data":"2bc9bbeee4a38849432fe06a22180348553922f895e177dbb4c09b097b4a04bc"} Apr 16 18:31:52.666258 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.665471 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78v92" event={"ID":"d9e01863-f0f0-4a5e-935f-50699365f569","Type":"ContainerStarted","Data":"0c0ff319329de71cc969f457e7dcbc5161c9d597660a48aceadff026a6f2d389"} Apr 16 18:31:52.666258 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.665486 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78v92" 
event={"ID":"d9e01863-f0f0-4a5e-935f-50699365f569","Type":"ContainerStarted","Data":"c51325cfa49169ce395a8c7d895b90cbb5790e9bda1d92e060ab31b5a2ff75ec"} Apr 16 18:31:52.666752 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.666710 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bnn4v" event={"ID":"5b4ead4a-0df1-4977-9c23-9828175001c0","Type":"ContainerStarted","Data":"c572580be8c18d866e85c2726a935ecb09028d61ca6664e42f72a032358af6aa"} Apr 16 18:31:52.668034 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.668008 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8pbpt" event={"ID":"412ee020-08a8-48fa-a1be-e9b5ca0d1cb5","Type":"ContainerStarted","Data":"531e7fc55ac4c684b5aa719e4cdfbcfa26673872b630f3ba9612280257802812"} Apr 16 18:31:52.669325 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.669301 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn5nh" event={"ID":"89c089f0-109d-41b7-8411-4debfff14a91","Type":"ContainerStarted","Data":"2db455a76aaa854b3a0bab76e49640473f6d863d1cb254f7116854e9750ec417"} Apr 16 18:31:52.670662 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.670630 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jktqd" event={"ID":"a2547dac-9955-4de0-ba29-8ca57b537b69","Type":"ContainerStarted","Data":"5cf4c672742ec1582fb1d0cc20032d98ae3cd76733c98adfe6da5375bbe229ff"} Apr 16 18:31:52.672071 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.672049 2576 generic.go:358] "Generic (PLEG): container finished" podID="48a38ebc-2033-4b86-99f7-c22d4b6e6ccc" containerID="526ea3268e2ccb3c5cf048bf11c9f2522dac08457acfa935895faaaf2d33ff86" exitCode=0 Apr 16 18:31:52.672159 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.672115 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wn2l4" 
event={"ID":"48a38ebc-2033-4b86-99f7-c22d4b6e6ccc","Type":"ContainerDied","Data":"526ea3268e2ccb3c5cf048bf11c9f2522dac08457acfa935895faaaf2d33ff86"} Apr 16 18:31:52.673382 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.673358 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-s46p6" event={"ID":"10b515ec-8868-47c8-b336-6809d8756b3b","Type":"ContainerStarted","Data":"f4f4bd0a63ab19cfd59e2f1bf3e96f49e4f5f81f5822441d3023a37c61526d53"} Apr 16 18:31:52.684786 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.684721 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-bnn4v" podStartSLOduration=2.00400756 podStartE2EDuration="18.684706203s" podCreationTimestamp="2026-04-16 18:31:34 +0000 UTC" firstStartedPulling="2026-04-16 18:31:34.803603206 +0000 UTC m=+20.928130847" lastFinishedPulling="2026-04-16 18:31:51.484301848 +0000 UTC m=+37.608829490" observedRunningTime="2026-04-16 18:31:52.6842611 +0000 UTC m=+38.808788774" watchObservedRunningTime="2026-04-16 18:31:52.684706203 +0000 UTC m=+38.809233865" Apr 16 18:31:52.685129 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.685099 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-154.ec2.internal" podStartSLOduration=4.685092976 podStartE2EDuration="4.685092976s" podCreationTimestamp="2026-04-16 18:31:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:31:52.360428122 +0000 UTC m=+38.484955784" watchObservedRunningTime="2026-04-16 18:31:52.685092976 +0000 UTC m=+38.809620691" Apr 16 18:31:52.699894 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.699853 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-8pbpt" podStartSLOduration=1.9130310179999999 
podStartE2EDuration="18.699838109s" podCreationTimestamp="2026-04-16 18:31:34 +0000 UTC" firstStartedPulling="2026-04-16 18:31:34.715759585 +0000 UTC m=+20.840287241" lastFinishedPulling="2026-04-16 18:31:51.502566686 +0000 UTC m=+37.627094332" observedRunningTime="2026-04-16 18:31:52.699736447 +0000 UTC m=+38.824264111" watchObservedRunningTime="2026-04-16 18:31:52.699838109 +0000 UTC m=+38.824365773" Apr 16 18:31:52.720145 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.719976 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 18:31:52.770257 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.770198 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-s46p6" podStartSLOduration=2.168468676 podStartE2EDuration="18.770181873s" podCreationTimestamp="2026-04-16 18:31:34 +0000 UTC" firstStartedPulling="2026-04-16 18:31:34.88258482 +0000 UTC m=+21.007112461" lastFinishedPulling="2026-04-16 18:31:51.484298017 +0000 UTC m=+37.608825658" observedRunningTime="2026-04-16 18:31:52.749703304 +0000 UTC m=+38.874230967" watchObservedRunningTime="2026-04-16 18:31:52.770181873 +0000 UTC m=+38.894709535" Apr 16 18:31:52.770543 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.770511 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-jktqd" podStartSLOduration=2.1739602160000002 podStartE2EDuration="18.770501869s" podCreationTimestamp="2026-04-16 18:31:34 +0000 UTC" firstStartedPulling="2026-04-16 18:31:34.925329852 +0000 UTC m=+21.049857493" lastFinishedPulling="2026-04-16 18:31:51.521871505 +0000 UTC m=+37.646399146" observedRunningTime="2026-04-16 18:31:52.769344853 +0000 UTC m=+38.893872516" watchObservedRunningTime="2026-04-16 18:31:52.770501869 +0000 UTC m=+38.895029551" Apr 16 18:31:52.789345 ip-10-0-140-154 kubenswrapper[2576]: I0416 
18:31:52.789288 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-p4gk6" podStartSLOduration=2.2799007 podStartE2EDuration="18.789274722s" podCreationTimestamp="2026-04-16 18:31:34 +0000 UTC" firstStartedPulling="2026-04-16 18:31:34.97494028 +0000 UTC m=+21.099467937" lastFinishedPulling="2026-04-16 18:31:51.484314303 +0000 UTC m=+37.608841959" observedRunningTime="2026-04-16 18:31:52.788644101 +0000 UTC m=+38.913171765" watchObservedRunningTime="2026-04-16 18:31:52.789274722 +0000 UTC m=+38.913802384" Apr 16 18:31:52.916550 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.916426 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/07ae8592-c64d-4565-9efc-bc0241db5258-node-exporter-tls\") pod \"node-exporter-nsjd9\" (UID: \"07ae8592-c64d-4565-9efc-bc0241db5258\") " pod="openshift-monitoring/node-exporter-nsjd9" Apr 16 18:31:52.919544 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:52.919521 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/07ae8592-c64d-4565-9efc-bc0241db5258-node-exporter-tls\") pod \"node-exporter-nsjd9\" (UID: \"07ae8592-c64d-4565-9efc-bc0241db5258\") " pod="openshift-monitoring/node-exporter-nsjd9" Apr 16 18:31:53.167695 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:53.167530 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-nsjd9" Apr 16 18:31:53.177536 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:31:53.177477 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07ae8592_c64d_4565_9efc_bc0241db5258.slice/crio-99a8326cf3a923c8e656607359cdba09a00596e06b932bc89d3385c56b69ba37 WatchSource:0}: Error finding container 99a8326cf3a923c8e656607359cdba09a00596e06b932bc89d3385c56b69ba37: Status 404 returned error can't find the container with id 99a8326cf3a923c8e656607359cdba09a00596e06b932bc89d3385c56b69ba37 Apr 16 18:31:53.456101 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:53.455915 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T18:31:52.72014018Z","UUID":"9117f430-d619-472f-8d3b-a442ba1d5479","Handler":null,"Name":"","Endpoint":""} Apr 16 18:31:53.457823 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:53.457731 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 18:31:53.457823 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:53.457775 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 18:31:53.678838 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:53.678801 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn5nh" event={"ID":"89c089f0-109d-41b7-8411-4debfff14a91","Type":"ContainerStarted","Data":"7e727f7754fcc371b1627aac7e1878d0211ca6d09d1fa120a78be6b868d41cb3"} Apr 16 18:31:53.680771 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:53.680724 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/iptables-alerter-hpgn4" event={"ID":"605560b8-ea93-4ad3-9510-324082bdc13f","Type":"ContainerStarted","Data":"dbed7eea882ba9c9712180501f2ce096bb28c097ccda857c0726afdbf587e770"} Apr 16 18:31:53.682119 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:53.682064 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nsjd9" event={"ID":"07ae8592-c64d-4565-9efc-bc0241db5258","Type":"ContainerStarted","Data":"99a8326cf3a923c8e656607359cdba09a00596e06b932bc89d3385c56b69ba37"} Apr 16 18:31:53.698130 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:53.698059 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-hpgn4" podStartSLOduration=2.976744481 podStartE2EDuration="19.69803972s" podCreationTimestamp="2026-04-16 18:31:34 +0000 UTC" firstStartedPulling="2026-04-16 18:31:34.787822998 +0000 UTC m=+20.912350639" lastFinishedPulling="2026-04-16 18:31:51.509118222 +0000 UTC m=+37.633645878" observedRunningTime="2026-04-16 18:31:53.697783278 +0000 UTC m=+39.822310943" watchObservedRunningTime="2026-04-16 18:31:53.69803972 +0000 UTC m=+39.822567387" Apr 16 18:31:54.502379 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:54.502302 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q67qs" Apr 16 18:31:54.502546 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:54.502407 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-q67qs" podUID="5f3c3f50-8ee2-4775-a3de-64e723a55361" Apr 16 18:31:54.502546 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:54.502510 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cl74p" Apr 16 18:31:54.502655 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:54.502634 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cl74p" podUID="697ddfb3-adc9-4a63-b5ca-b4b871946a33" Apr 16 18:31:54.685898 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:54.685859 2576 generic.go:358] "Generic (PLEG): container finished" podID="07ae8592-c64d-4565-9efc-bc0241db5258" containerID="933d764704c9c3fd048e01a9d988f022bed3d2f160fe3cb7f1972194c253e869" exitCode=0 Apr 16 18:31:54.686634 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:54.685942 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nsjd9" event={"ID":"07ae8592-c64d-4565-9efc-bc0241db5258","Type":"ContainerDied","Data":"933d764704c9c3fd048e01a9d988f022bed3d2f160fe3cb7f1972194c253e869"} Apr 16 18:31:54.689286 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:54.689256 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78v92" event={"ID":"d9e01863-f0f0-4a5e-935f-50699365f569","Type":"ContainerStarted","Data":"f93b28b52b3558c47594a428046bbc23b4604fc2086e773539c61f6cab5bc6be"} Apr 16 18:31:54.691517 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:54.691486 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn5nh" 
event={"ID":"89c089f0-109d-41b7-8411-4debfff14a91","Type":"ContainerStarted","Data":"67386e278dc96985b4be366d1b4124ff7ecb38af3d3fab4403381333baab8069"} Apr 16 18:31:54.729088 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:54.729030 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn5nh" podStartSLOduration=2.118611504 podStartE2EDuration="20.72900935s" podCreationTimestamp="2026-04-16 18:31:34 +0000 UTC" firstStartedPulling="2026-04-16 18:31:34.977364648 +0000 UTC m=+21.101892290" lastFinishedPulling="2026-04-16 18:31:53.58776249 +0000 UTC m=+39.712290136" observedRunningTime="2026-04-16 18:31:54.728357506 +0000 UTC m=+40.852885192" watchObservedRunningTime="2026-04-16 18:31:54.72900935 +0000 UTC m=+40.853537014" Apr 16 18:31:54.965999 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:54.965963 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-p4gk6" Apr 16 18:31:54.966766 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:54.966724 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-p4gk6" Apr 16 18:31:55.695523 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:55.695480 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nsjd9" event={"ID":"07ae8592-c64d-4565-9efc-bc0241db5258","Type":"ContainerStarted","Data":"84ae18f342b6e124e2f47e575bc340a53ba6c08f5856fdbab9f04d1460899f56"} Apr 16 18:31:55.696013 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:55.695532 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nsjd9" event={"ID":"07ae8592-c64d-4565-9efc-bc0241db5258","Type":"ContainerStarted","Data":"fee6c8930925edd9aa0de245cd70744493c050c4b981d82ad0b7a5908f21c492"} Apr 16 18:31:55.696013 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:55.695687 2576 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-p4gk6" Apr 16 18:31:55.696337 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:55.696320 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-p4gk6" Apr 16 18:31:55.719293 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:55.719244 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-nsjd9" podStartSLOduration=2.883704452 podStartE2EDuration="3.719230458s" podCreationTimestamp="2026-04-16 18:31:52 +0000 UTC" firstStartedPulling="2026-04-16 18:31:53.179703955 +0000 UTC m=+39.304231600" lastFinishedPulling="2026-04-16 18:31:54.015229952 +0000 UTC m=+40.139757606" observedRunningTime="2026-04-16 18:31:55.719073702 +0000 UTC m=+41.843601366" watchObservedRunningTime="2026-04-16 18:31:55.719230458 +0000 UTC m=+41.843758122" Apr 16 18:31:56.501925 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:56.501677 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q67qs" Apr 16 18:31:56.502113 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:56.501771 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cl74p" Apr 16 18:31:56.502179 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:56.502016 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-q67qs" podUID="5f3c3f50-8ee2-4775-a3de-64e723a55361" Apr 16 18:31:56.502179 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:56.502152 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cl74p" podUID="697ddfb3-adc9-4a63-b5ca-b4b871946a33" Apr 16 18:31:57.700643 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:57.700613 2576 generic.go:358] "Generic (PLEG): container finished" podID="48a38ebc-2033-4b86-99f7-c22d4b6e6ccc" containerID="78c582223463a680c1dbe75cc1b8ed7e17f87dbd28e389f9f32402a1134117ef" exitCode=0 Apr 16 18:31:57.701212 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:57.700702 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wn2l4" event={"ID":"48a38ebc-2033-4b86-99f7-c22d4b6e6ccc","Type":"ContainerDied","Data":"78c582223463a680c1dbe75cc1b8ed7e17f87dbd28e389f9f32402a1134117ef"} Apr 16 18:31:57.707373 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:57.707343 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78v92" event={"ID":"d9e01863-f0f0-4a5e-935f-50699365f569","Type":"ContainerStarted","Data":"7e7372d197a2169bc512e9b94d9e700963eba997011c39aef97d471dace3ccf9"} Apr 16 18:31:57.707675 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:57.707652 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:57.707780 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:57.707679 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:31:57.707780 
ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:57.707691 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-78v92"
Apr 16 18:31:57.722478 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:57.722453 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-78v92"
Apr 16 18:31:57.726210 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:57.726194 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-78v92"
Apr 16 18:31:57.754962 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:57.754940 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:31:57.757876 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:57.757834 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-78v92" podStartSLOduration=6.885030281 podStartE2EDuration="23.757822673s" podCreationTimestamp="2026-04-16 18:31:34 +0000 UTC" firstStartedPulling="2026-04-16 18:31:34.87739673 +0000 UTC m=+21.001924372" lastFinishedPulling="2026-04-16 18:31:51.750189109 +0000 UTC m=+37.874716764" observedRunningTime="2026-04-16 18:31:57.756588556 +0000 UTC m=+43.881116219" watchObservedRunningTime="2026-04-16 18:31:57.757822673 +0000 UTC m=+43.882350335"
Apr 16 18:31:58.501157 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:58.500988 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cl74p"
Apr 16 18:31:58.501157 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:58.501014 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q67qs"
Apr 16 18:31:58.501288 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:58.501148 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cl74p" podUID="697ddfb3-adc9-4a63-b5ca-b4b871946a33"
Apr 16 18:31:58.501599 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:58.501572 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q67qs" podUID="5f3c3f50-8ee2-4775-a3de-64e723a55361"
Apr 16 18:31:58.656864 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:58.656834 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-q67qs"]
Apr 16 18:31:58.662377 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:58.662346 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cl74p"]
Apr 16 18:31:58.709017 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:58.708945 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q67qs"
Apr 16 18:31:58.709017 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:58.708954 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cl74p"
Apr 16 18:31:58.709581 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:58.709043 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q67qs" podUID="5f3c3f50-8ee2-4775-a3de-64e723a55361"
Apr 16 18:31:58.709581 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:31:58.709377 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cl74p" podUID="697ddfb3-adc9-4a63-b5ca-b4b871946a33"
Apr 16 18:31:59.712891 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:59.712855 2576 generic.go:358] "Generic (PLEG): container finished" podID="48a38ebc-2033-4b86-99f7-c22d4b6e6ccc" containerID="8d23974b87bd81ba7c6df19368e402179b7fdd6ac55cb7272c3cb653ca55805d" exitCode=0
Apr 16 18:31:59.713305 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:31:59.712941 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wn2l4" event={"ID":"48a38ebc-2033-4b86-99f7-c22d4b6e6ccc","Type":"ContainerDied","Data":"8d23974b87bd81ba7c6df19368e402179b7fdd6ac55cb7272c3cb653ca55805d"}
Apr 16 18:32:00.501567 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:00.501532 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cl74p"
Apr 16 18:32:00.501567 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:00.501554 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q67qs"
Apr 16 18:32:00.501808 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:32:00.501684 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cl74p" podUID="697ddfb3-adc9-4a63-b5ca-b4b871946a33"
Apr 16 18:32:00.501888 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:32:00.501853 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-q67qs" podUID="5f3c3f50-8ee2-4775-a3de-64e723a55361"
Apr 16 18:32:01.719457 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:01.719277 2576 generic.go:358] "Generic (PLEG): container finished" podID="48a38ebc-2033-4b86-99f7-c22d4b6e6ccc" containerID="d45fba4151503f40ab3f35c9b4141537d5c0887f9cc99b76f7a5e9389808488a" exitCode=0
Apr 16 18:32:01.719831 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:01.719361 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wn2l4" event={"ID":"48a38ebc-2033-4b86-99f7-c22d4b6e6ccc","Type":"ContainerDied","Data":"d45fba4151503f40ab3f35c9b4141537d5c0887f9cc99b76f7a5e9389808488a"}
Apr 16 18:32:02.501068 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:02.500990 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q67qs"
Apr 16 18:32:02.501905 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:32:02.501860 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q67qs" podUID="5f3c3f50-8ee2-4775-a3de-64e723a55361"
Apr 16 18:32:02.502274 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:02.502256 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cl74p"
Apr 16 18:32:02.502422 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:32:02.502402 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cl74p" podUID="697ddfb3-adc9-4a63-b5ca-b4b871946a33"
Apr 16 18:32:02.504905 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:02.501731 2576 scope.go:117] "RemoveContainer" containerID="284fd67504784abbd9ab23c1bd3a8ecf717c8e781eb54c699fc73d346f64a03d"
Apr 16 18:32:02.723520 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:02.723488 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal_28c07a342a30c0e354482d7284dcbb2c/kube-rbac-proxy-crio/2.log"
Apr 16 18:32:02.724074 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:02.724049 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal" event={"ID":"28c07a342a30c0e354482d7284dcbb2c","Type":"ContainerStarted","Data":"67b3ba668e3372509f41d3aed9c1b109c43ec047b10b5bf14138127e0ed3df21"}
Apr 16 18:32:02.742588 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:02.742530 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal" podStartSLOduration=14.742509243 podStartE2EDuration="14.742509243s" podCreationTimestamp="2026-04-16 18:31:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:32:02.741628148 +0000 UTC m=+48.866155810" watchObservedRunningTime="2026-04-16 18:32:02.742509243 +0000 UTC m=+48.867036908"
Apr 16 18:32:04.209193 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.209159 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-154.ec2.internal" event="NodeReady"
Apr 16 18:32:04.209630 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.209335 2576 kubelet_node_status.go:550] "Fast updating node status as it just became 
ready"
Apr 16 18:32:04.257775 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.257700 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-g54vn"]
Apr 16 18:32:04.279600 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.279572 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-8wfh2"]
Apr 16 18:32:04.279811 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.279791 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-g54vn"
Apr 16 18:32:04.282681 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.282652 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-mnpbz\""
Apr 16 18:32:04.282843 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.282701 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 18:32:04.282843 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.282652 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 18:32:04.300537 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.300493 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-g54vn"]
Apr 16 18:32:04.300537 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.300535 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8wfh2"]
Apr 16 18:32:04.300783 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.300549 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-fphbb"]
Apr 16 18:32:04.300783 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.300667 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8wfh2"
Apr 16 18:32:04.303789 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.303763 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-44gxs\""
Apr 16 18:32:04.304099 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.304055 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 18:32:04.304186 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.304098 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 18:32:04.308321 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.308035 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 18:32:04.326519 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.326422 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fphbb"]
Apr 16 18:32:04.326691 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.326584 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fphbb"
Apr 16 18:32:04.329615 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.329420 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 18:32:04.329615 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.329420 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 18:32:04.329615 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.329488 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-sjnps\""
Apr 16 18:32:04.329615 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.329527 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 18:32:04.330016 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.329794 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 18:32:04.406405 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.406372 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/06adbfd4-9e4a-4840-99c5-dcb8e483a0bd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fphbb\" (UID: \"06adbfd4-9e4a-4840-99c5-dcb8e483a0bd\") " pod="openshift-insights/insights-runtime-extractor-fphbb"
Apr 16 18:32:04.406405 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.406417 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/49dc18c9-317a-4547-b453-9328583c7dce-tmp-dir\") pod \"dns-default-g54vn\" (UID: \"49dc18c9-317a-4547-b453-9328583c7dce\") " 
pod="openshift-dns/dns-default-g54vn"
Apr 16 18:32:04.406643 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.406442 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/06adbfd4-9e4a-4840-99c5-dcb8e483a0bd-crio-socket\") pod \"insights-runtime-extractor-fphbb\" (UID: \"06adbfd4-9e4a-4840-99c5-dcb8e483a0bd\") " pod="openshift-insights/insights-runtime-extractor-fphbb"
Apr 16 18:32:04.406643 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.406506 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/49dc18c9-317a-4547-b453-9328583c7dce-config-volume\") pod \"dns-default-g54vn\" (UID: \"49dc18c9-317a-4547-b453-9328583c7dce\") " pod="openshift-dns/dns-default-g54vn"
Apr 16 18:32:04.406643 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.406549 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/49dc18c9-317a-4547-b453-9328583c7dce-metrics-tls\") pod \"dns-default-g54vn\" (UID: \"49dc18c9-317a-4547-b453-9328583c7dce\") " pod="openshift-dns/dns-default-g54vn"
Apr 16 18:32:04.406643 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.406574 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl22p\" (UniqueName: \"kubernetes.io/projected/49dc18c9-317a-4547-b453-9328583c7dce-kube-api-access-cl22p\") pod \"dns-default-g54vn\" (UID: \"49dc18c9-317a-4547-b453-9328583c7dce\") " pod="openshift-dns/dns-default-g54vn"
Apr 16 18:32:04.406643 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.406596 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: 
\"kubernetes.io/configmap/06adbfd4-9e4a-4840-99c5-dcb8e483a0bd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fphbb\" (UID: \"06adbfd4-9e4a-4840-99c5-dcb8e483a0bd\") " pod="openshift-insights/insights-runtime-extractor-fphbb"
Apr 16 18:32:04.406643 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.406637 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7xwx\" (UniqueName: \"kubernetes.io/projected/06adbfd4-9e4a-4840-99c5-dcb8e483a0bd-kube-api-access-q7xwx\") pod \"insights-runtime-extractor-fphbb\" (UID: \"06adbfd4-9e4a-4840-99c5-dcb8e483a0bd\") " pod="openshift-insights/insights-runtime-extractor-fphbb"
Apr 16 18:32:04.406946 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.406678 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4bfb19bf-a34c-40ab-8651-16b8064977ed-cert\") pod \"ingress-canary-8wfh2\" (UID: \"4bfb19bf-a34c-40ab-8651-16b8064977ed\") " pod="openshift-ingress-canary/ingress-canary-8wfh2"
Apr 16 18:32:04.406946 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.406736 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4w2r\" (UniqueName: \"kubernetes.io/projected/4bfb19bf-a34c-40ab-8651-16b8064977ed-kube-api-access-p4w2r\") pod \"ingress-canary-8wfh2\" (UID: \"4bfb19bf-a34c-40ab-8651-16b8064977ed\") " pod="openshift-ingress-canary/ingress-canary-8wfh2"
Apr 16 18:32:04.406946 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.406794 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/06adbfd4-9e4a-4840-99c5-dcb8e483a0bd-data-volume\") pod \"insights-runtime-extractor-fphbb\" (UID: \"06adbfd4-9e4a-4840-99c5-dcb8e483a0bd\") " pod="openshift-insights/insights-runtime-extractor-fphbb"
Apr 16 18:32:04.502228
ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.502189 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cl74p"
Apr 16 18:32:04.502485 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.502375 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q67qs"
Apr 16 18:32:04.505601 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.505558 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-2j6pd\""
Apr 16 18:32:04.505784 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.505760 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 18:32:04.506018 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.505999 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 18:32:04.506113 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.505999 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-ds8t7\""
Apr 16 18:32:04.506113 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.506075 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 18:32:04.507187 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.507169 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4bfb19bf-a34c-40ab-8651-16b8064977ed-cert\") pod \"ingress-canary-8wfh2\" (UID: \"4bfb19bf-a34c-40ab-8651-16b8064977ed\") " pod="openshift-ingress-canary/ingress-canary-8wfh2"
Apr 16 18:32:04.507288 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.507206 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p4w2r\" (UniqueName: \"kubernetes.io/projected/4bfb19bf-a34c-40ab-8651-16b8064977ed-kube-api-access-p4w2r\") pod \"ingress-canary-8wfh2\" (UID: \"4bfb19bf-a34c-40ab-8651-16b8064977ed\") " pod="openshift-ingress-canary/ingress-canary-8wfh2"
Apr 16 18:32:04.507288 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.507235 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/06adbfd4-9e4a-4840-99c5-dcb8e483a0bd-data-volume\") pod \"insights-runtime-extractor-fphbb\" (UID: \"06adbfd4-9e4a-4840-99c5-dcb8e483a0bd\") " pod="openshift-insights/insights-runtime-extractor-fphbb"
Apr 16 18:32:04.507404 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.507311 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/06adbfd4-9e4a-4840-99c5-dcb8e483a0bd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fphbb\" (UID: \"06adbfd4-9e4a-4840-99c5-dcb8e483a0bd\") " pod="openshift-insights/insights-runtime-extractor-fphbb"
Apr 16 18:32:04.507404 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.507340 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/49dc18c9-317a-4547-b453-9328583c7dce-tmp-dir\") pod \"dns-default-g54vn\" (UID: \"49dc18c9-317a-4547-b453-9328583c7dce\") " pod="openshift-dns/dns-default-g54vn"
Apr 16 18:32:04.507404 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.507364 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/06adbfd4-9e4a-4840-99c5-dcb8e483a0bd-crio-socket\") pod \"insights-runtime-extractor-fphbb\" (UID: \"06adbfd4-9e4a-4840-99c5-dcb8e483a0bd\") " 
pod="openshift-insights/insights-runtime-extractor-fphbb"
Apr 16 18:32:04.507404 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.507391 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/49dc18c9-317a-4547-b453-9328583c7dce-config-volume\") pod \"dns-default-g54vn\" (UID: \"49dc18c9-317a-4547-b453-9328583c7dce\") " pod="openshift-dns/dns-default-g54vn"
Apr 16 18:32:04.507596 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.507413 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/49dc18c9-317a-4547-b453-9328583c7dce-metrics-tls\") pod \"dns-default-g54vn\" (UID: \"49dc18c9-317a-4547-b453-9328583c7dce\") " pod="openshift-dns/dns-default-g54vn"
Apr 16 18:32:04.507596 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.507438 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cl22p\" (UniqueName: \"kubernetes.io/projected/49dc18c9-317a-4547-b453-9328583c7dce-kube-api-access-cl22p\") pod \"dns-default-g54vn\" (UID: \"49dc18c9-317a-4547-b453-9328583c7dce\") " pod="openshift-dns/dns-default-g54vn"
Apr 16 18:32:04.507596 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.507462 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/06adbfd4-9e4a-4840-99c5-dcb8e483a0bd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fphbb\" (UID: \"06adbfd4-9e4a-4840-99c5-dcb8e483a0bd\") " pod="openshift-insights/insights-runtime-extractor-fphbb"
Apr 16 18:32:04.507596 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.507505 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q7xwx\" (UniqueName: \"kubernetes.io/projected/06adbfd4-9e4a-4840-99c5-dcb8e483a0bd-kube-api-access-q7xwx\") pod \"insights-runtime-extractor-fphbb\" 
(UID: \"06adbfd4-9e4a-4840-99c5-dcb8e483a0bd\") " pod="openshift-insights/insights-runtime-extractor-fphbb"
Apr 16 18:32:04.507811 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.507640 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/06adbfd4-9e4a-4840-99c5-dcb8e483a0bd-data-volume\") pod \"insights-runtime-extractor-fphbb\" (UID: \"06adbfd4-9e4a-4840-99c5-dcb8e483a0bd\") " pod="openshift-insights/insights-runtime-extractor-fphbb"
Apr 16 18:32:04.508089 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.508056 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/49dc18c9-317a-4547-b453-9328583c7dce-tmp-dir\") pod \"dns-default-g54vn\" (UID: \"49dc18c9-317a-4547-b453-9328583c7dce\") " pod="openshift-dns/dns-default-g54vn"
Apr 16 18:32:04.508194 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.508056 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/06adbfd4-9e4a-4840-99c5-dcb8e483a0bd-crio-socket\") pod \"insights-runtime-extractor-fphbb\" (UID: \"06adbfd4-9e4a-4840-99c5-dcb8e483a0bd\") " pod="openshift-insights/insights-runtime-extractor-fphbb"
Apr 16 18:32:04.508350 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.508329 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/49dc18c9-317a-4547-b453-9328583c7dce-config-volume\") pod \"dns-default-g54vn\" (UID: \"49dc18c9-317a-4547-b453-9328583c7dce\") " pod="openshift-dns/dns-default-g54vn"
Apr 16 18:32:04.508424 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.508404 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/06adbfd4-9e4a-4840-99c5-dcb8e483a0bd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fphbb\" 
(UID: \"06adbfd4-9e4a-4840-99c5-dcb8e483a0bd\") " pod="openshift-insights/insights-runtime-extractor-fphbb"
Apr 16 18:32:04.512461 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.512374 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/49dc18c9-317a-4547-b453-9328583c7dce-metrics-tls\") pod \"dns-default-g54vn\" (UID: \"49dc18c9-317a-4547-b453-9328583c7dce\") " pod="openshift-dns/dns-default-g54vn"
Apr 16 18:32:04.512461 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.512377 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/06adbfd4-9e4a-4840-99c5-dcb8e483a0bd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fphbb\" (UID: \"06adbfd4-9e4a-4840-99c5-dcb8e483a0bd\") " pod="openshift-insights/insights-runtime-extractor-fphbb"
Apr 16 18:32:04.512642 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.512505 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4bfb19bf-a34c-40ab-8651-16b8064977ed-cert\") pod \"ingress-canary-8wfh2\" (UID: \"4bfb19bf-a34c-40ab-8651-16b8064977ed\") " pod="openshift-ingress-canary/ingress-canary-8wfh2"
Apr 16 18:32:04.516541 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.516514 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4w2r\" (UniqueName: \"kubernetes.io/projected/4bfb19bf-a34c-40ab-8651-16b8064977ed-kube-api-access-p4w2r\") pod \"ingress-canary-8wfh2\" (UID: \"4bfb19bf-a34c-40ab-8651-16b8064977ed\") " pod="openshift-ingress-canary/ingress-canary-8wfh2"
Apr 16 18:32:04.516852 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.516826 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7xwx\" (UniqueName: \"kubernetes.io/projected/06adbfd4-9e4a-4840-99c5-dcb8e483a0bd-kube-api-access-q7xwx\") 
pod \"insights-runtime-extractor-fphbb\" (UID: \"06adbfd4-9e4a-4840-99c5-dcb8e483a0bd\") " pod="openshift-insights/insights-runtime-extractor-fphbb"
Apr 16 18:32:04.519373 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.519352 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl22p\" (UniqueName: \"kubernetes.io/projected/49dc18c9-317a-4547-b453-9328583c7dce-kube-api-access-cl22p\") pod \"dns-default-g54vn\" (UID: \"49dc18c9-317a-4547-b453-9328583c7dce\") " pod="openshift-dns/dns-default-g54vn"
Apr 16 18:32:04.591394 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.591302 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-g54vn"
Apr 16 18:32:04.613344 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.613311 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8wfh2"
Apr 16 18:32:04.637095 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.636977 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fphbb"
Apr 16 18:32:04.817094 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.816896 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-g54vn"]
Apr 16 18:32:04.831308 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:32:04.831275 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49dc18c9_317a_4547_b453_9328583c7dce.slice/crio-0eeb8e5c3a199b59321f89625307fb3a9af765f1e08323ffc252379072c3b18c WatchSource:0}: Error finding container 0eeb8e5c3a199b59321f89625307fb3a9af765f1e08323ffc252379072c3b18c: Status 404 returned error can't find the container with id 0eeb8e5c3a199b59321f89625307fb3a9af765f1e08323ffc252379072c3b18c
Apr 16 18:32:04.837568 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.837541 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8wfh2"]
Apr 16 18:32:04.840663 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:04.840619 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fphbb"]
Apr 16 18:32:04.841473 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:32:04.841421 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bfb19bf_a34c_40ab_8651_16b8064977ed.slice/crio-af966733d67aaeba33df54f756a01be5cd90efb736b21ff47b93aedaf9a17492 WatchSource:0}: Error finding container af966733d67aaeba33df54f756a01be5cd90efb736b21ff47b93aedaf9a17492: Status 404 returned error can't find the container with id af966733d67aaeba33df54f756a01be5cd90efb736b21ff47b93aedaf9a17492
Apr 16 18:32:04.844830 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:32:04.844734 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06adbfd4_9e4a_4840_99c5_dcb8e483a0bd.slice/crio-1aea31b085cdb105092e10c3a577907c9c02b1df358bda34824228fe2833465f WatchSource:0}: Error finding container 1aea31b085cdb105092e10c3a577907c9c02b1df358bda34824228fe2833465f: Status 404 returned error can't find the container with id 1aea31b085cdb105092e10c3a577907c9c02b1df358bda34824228fe2833465f Apr 16 18:32:05.743820 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:05.743776 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fphbb" event={"ID":"06adbfd4-9e4a-4840-99c5-dcb8e483a0bd","Type":"ContainerStarted","Data":"7e65571c30daf7172934d20cd7a44ea3ae714974a2dc663aa9947df3ee559770"} Apr 16 18:32:05.743820 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:05.743828 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fphbb" event={"ID":"06adbfd4-9e4a-4840-99c5-dcb8e483a0bd","Type":"ContainerStarted","Data":"1aea31b085cdb105092e10c3a577907c9c02b1df358bda34824228fe2833465f"} Apr 16 18:32:05.745479 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:05.745444 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8wfh2" event={"ID":"4bfb19bf-a34c-40ab-8651-16b8064977ed","Type":"ContainerStarted","Data":"af966733d67aaeba33df54f756a01be5cd90efb736b21ff47b93aedaf9a17492"} Apr 16 18:32:05.746604 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:05.746573 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-g54vn" event={"ID":"49dc18c9-317a-4547-b453-9328583c7dce","Type":"ContainerStarted","Data":"0eeb8e5c3a199b59321f89625307fb3a9af765f1e08323ffc252379072c3b18c"} Apr 16 18:32:06.326937 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:06.326897 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/697ddfb3-adc9-4a63-b5ca-b4b871946a33-metrics-certs\") pod \"network-metrics-daemon-cl74p\" (UID: \"697ddfb3-adc9-4a63-b5ca-b4b871946a33\") " pod="openshift-multus/network-metrics-daemon-cl74p" Apr 16 18:32:06.331023 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:06.330979 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/697ddfb3-adc9-4a63-b5ca-b4b871946a33-metrics-certs\") pod \"network-metrics-daemon-cl74p\" (UID: \"697ddfb3-adc9-4a63-b5ca-b4b871946a33\") " pod="openshift-multus/network-metrics-daemon-cl74p" Apr 16 18:32:06.427671 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:06.427606 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wlkrx\" (UniqueName: \"kubernetes.io/projected/5f3c3f50-8ee2-4775-a3de-64e723a55361-kube-api-access-wlkrx\") pod \"network-check-target-q67qs\" (UID: \"5f3c3f50-8ee2-4775-a3de-64e723a55361\") " pod="openshift-network-diagnostics/network-check-target-q67qs" Apr 16 18:32:06.430370 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:06.430343 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlkrx\" (UniqueName: \"kubernetes.io/projected/5f3c3f50-8ee2-4775-a3de-64e723a55361-kube-api-access-wlkrx\") pod \"network-check-target-q67qs\" (UID: \"5f3c3f50-8ee2-4775-a3de-64e723a55361\") " pod="openshift-network-diagnostics/network-check-target-q67qs" Apr 16 18:32:06.631232 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:06.631141 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cl74p" Apr 16 18:32:06.638551 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:06.638513 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q67qs" Apr 16 18:32:09.233166 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:09.233000 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-q67qs"] Apr 16 18:32:09.242201 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:09.242070 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cl74p"] Apr 16 18:32:09.244589 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:32:09.242896 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f3c3f50_8ee2_4775_a3de_64e723a55361.slice/crio-d2bfeed8326fe6119ee3adeeef5a24ddf0612ad8bebcb711b7d3199ea2903b35 WatchSource:0}: Error finding container d2bfeed8326fe6119ee3adeeef5a24ddf0612ad8bebcb711b7d3199ea2903b35: Status 404 returned error can't find the container with id d2bfeed8326fe6119ee3adeeef5a24ddf0612ad8bebcb711b7d3199ea2903b35 Apr 16 18:32:09.250682 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:32:09.250651 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod697ddfb3_adc9_4a63_b5ca_b4b871946a33.slice/crio-632aab38a5d598eb91583f6acd1c5c08a8ae7186345196dab77eab87cc5e8815 WatchSource:0}: Error finding container 632aab38a5d598eb91583f6acd1c5c08a8ae7186345196dab77eab87cc5e8815: Status 404 returned error can't find the container with id 632aab38a5d598eb91583f6acd1c5c08a8ae7186345196dab77eab87cc5e8815 Apr 16 18:32:09.760749 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:09.760545 2576 generic.go:358] "Generic (PLEG): container finished" podID="48a38ebc-2033-4b86-99f7-c22d4b6e6ccc" containerID="ef745253685de6c27df3030331e642ca86a587d53f89a8fc74e3627927931e11" exitCode=0 Apr 16 18:32:09.760969 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:09.760640 2576 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-multus/multus-additional-cni-plugins-wn2l4" event={"ID":"48a38ebc-2033-4b86-99f7-c22d4b6e6ccc","Type":"ContainerDied","Data":"ef745253685de6c27df3030331e642ca86a587d53f89a8fc74e3627927931e11"} Apr 16 18:32:09.763359 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:09.763328 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-g54vn" event={"ID":"49dc18c9-317a-4547-b453-9328583c7dce","Type":"ContainerStarted","Data":"d6b3f1df2cb94367b307aa7b2b4b9693bbfa36c0912c2115a0f400810e012ba7"} Apr 16 18:32:09.763465 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:09.763370 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-g54vn" event={"ID":"49dc18c9-317a-4547-b453-9328583c7dce","Type":"ContainerStarted","Data":"c9242e963bde15746fd336a863641b15ce51c47e1e93df147c1ef7ffc0a5a550"} Apr 16 18:32:09.763650 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:09.763620 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-g54vn" Apr 16 18:32:09.766411 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:09.766363 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cl74p" event={"ID":"697ddfb3-adc9-4a63-b5ca-b4b871946a33","Type":"ContainerStarted","Data":"632aab38a5d598eb91583f6acd1c5c08a8ae7186345196dab77eab87cc5e8815"} Apr 16 18:32:09.768541 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:09.768512 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-q67qs" event={"ID":"5f3c3f50-8ee2-4775-a3de-64e723a55361","Type":"ContainerStarted","Data":"d2bfeed8326fe6119ee3adeeef5a24ddf0612ad8bebcb711b7d3199ea2903b35"} Apr 16 18:32:09.770876 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:09.770851 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fphbb" 
event={"ID":"06adbfd4-9e4a-4840-99c5-dcb8e483a0bd","Type":"ContainerStarted","Data":"12a90f1792999420053f8ff93b4bc9817c41a67d3cf1f75780ed0b26a0b067f5"} Apr 16 18:32:09.776839 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:09.776807 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8wfh2" event={"ID":"4bfb19bf-a34c-40ab-8651-16b8064977ed","Type":"ContainerStarted","Data":"7bb10569906f6912c4251d92d8644a92a8268c5f19059e957bf70e3fdf22cc19"} Apr 16 18:32:09.809434 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:09.809327 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-8wfh2" podStartSLOduration=1.58885074 podStartE2EDuration="5.809307223s" podCreationTimestamp="2026-04-16 18:32:04 +0000 UTC" firstStartedPulling="2026-04-16 18:32:04.844306164 +0000 UTC m=+50.968833811" lastFinishedPulling="2026-04-16 18:32:09.064762632 +0000 UTC m=+55.189290294" observedRunningTime="2026-04-16 18:32:09.808433765 +0000 UTC m=+55.932961413" watchObservedRunningTime="2026-04-16 18:32:09.809307223 +0000 UTC m=+55.933834886" Apr 16 18:32:10.783732 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:10.783696 2576 generic.go:358] "Generic (PLEG): container finished" podID="48a38ebc-2033-4b86-99f7-c22d4b6e6ccc" containerID="82b1e62cce4daac199c3741826d7ce2fa23fc4bde0f9eefff441c3384a852f20" exitCode=0 Apr 16 18:32:10.784379 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:10.783788 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wn2l4" event={"ID":"48a38ebc-2033-4b86-99f7-c22d4b6e6ccc","Type":"ContainerDied","Data":"82b1e62cce4daac199c3741826d7ce2fa23fc4bde0f9eefff441c3384a852f20"} Apr 16 18:32:10.812088 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:10.811390 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-g54vn" podStartSLOduration=2.582959134 
podStartE2EDuration="6.811370516s" podCreationTimestamp="2026-04-16 18:32:04 +0000 UTC" firstStartedPulling="2026-04-16 18:32:04.833543682 +0000 UTC m=+50.958071323" lastFinishedPulling="2026-04-16 18:32:09.061955044 +0000 UTC m=+55.186482705" observedRunningTime="2026-04-16 18:32:09.829354735 +0000 UTC m=+55.953882401" watchObservedRunningTime="2026-04-16 18:32:10.811370516 +0000 UTC m=+56.935898181" Apr 16 18:32:11.789011 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:11.788972 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cl74p" event={"ID":"697ddfb3-adc9-4a63-b5ca-b4b871946a33","Type":"ContainerStarted","Data":"ac6a495f2110b9833f6bae1cfb344f2d0a2637779ed9f85bd8443bba11421ca8"} Apr 16 18:32:11.790912 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:11.790884 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fphbb" event={"ID":"06adbfd4-9e4a-4840-99c5-dcb8e483a0bd","Type":"ContainerStarted","Data":"a33c733419ff0af022584b21b8e58d35275e510f66ecf81b726bfa9455af92b0"} Apr 16 18:32:11.794250 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:11.794225 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wn2l4" event={"ID":"48a38ebc-2033-4b86-99f7-c22d4b6e6ccc","Type":"ContainerStarted","Data":"8f19724ffae941296653dc186e94b576263697141ce52ecaab3603c2334090e2"} Apr 16 18:32:11.835680 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:11.835616 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-wn2l4" podStartSLOduration=3.661261299 podStartE2EDuration="37.835592944s" podCreationTimestamp="2026-04-16 18:31:34 +0000 UTC" firstStartedPulling="2026-04-16 18:31:34.816615659 +0000 UTC m=+20.941143299" lastFinishedPulling="2026-04-16 18:32:08.990947288 +0000 UTC m=+55.115474944" observedRunningTime="2026-04-16 18:32:11.835438421 +0000 UTC 
m=+57.959966085" watchObservedRunningTime="2026-04-16 18:32:11.835592944 +0000 UTC m=+57.960120617" Apr 16 18:32:11.835898 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:11.835863 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-fphbb" podStartSLOduration=1.683290155 podStartE2EDuration="7.835852123s" podCreationTimestamp="2026-04-16 18:32:04 +0000 UTC" firstStartedPulling="2026-04-16 18:32:04.932799385 +0000 UTC m=+51.057327040" lastFinishedPulling="2026-04-16 18:32:11.085361364 +0000 UTC m=+57.209889008" observedRunningTime="2026-04-16 18:32:11.811603537 +0000 UTC m=+57.936131201" watchObservedRunningTime="2026-04-16 18:32:11.835852123 +0000 UTC m=+57.960379786" Apr 16 18:32:12.798042 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:12.798005 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cl74p" event={"ID":"697ddfb3-adc9-4a63-b5ca-b4b871946a33","Type":"ContainerStarted","Data":"fd014b65019c73d71031d2aa764388bb50025718598635f826fad16c3509abe0"} Apr 16 18:32:12.799384 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:12.799356 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-q67qs" event={"ID":"5f3c3f50-8ee2-4775-a3de-64e723a55361","Type":"ContainerStarted","Data":"8879a53ca545b28705acc5fc5de05a4422924ae147c72cec6eab28468aba74a0"} Apr 16 18:32:12.799521 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:12.799409 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-q67qs" Apr 16 18:32:12.816773 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:12.816707 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-cl74p" podStartSLOduration=36.987675168 podStartE2EDuration="38.816692197s" podCreationTimestamp="2026-04-16 18:31:34 +0000 UTC" 
firstStartedPulling="2026-04-16 18:32:09.252806969 +0000 UTC m=+55.377334617" lastFinishedPulling="2026-04-16 18:32:11.081823999 +0000 UTC m=+57.206351646" observedRunningTime="2026-04-16 18:32:12.81518406 +0000 UTC m=+58.939711723" watchObservedRunningTime="2026-04-16 18:32:12.816692197 +0000 UTC m=+58.941219861" Apr 16 18:32:12.834956 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:12.834903 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-q67qs" podStartSLOduration=35.843744133 podStartE2EDuration="38.834887512s" podCreationTimestamp="2026-04-16 18:31:34 +0000 UTC" firstStartedPulling="2026-04-16 18:32:09.24606751 +0000 UTC m=+55.370595150" lastFinishedPulling="2026-04-16 18:32:12.237210886 +0000 UTC m=+58.361738529" observedRunningTime="2026-04-16 18:32:12.834047173 +0000 UTC m=+58.958574838" watchObservedRunningTime="2026-04-16 18:32:12.834887512 +0000 UTC m=+58.959415175" Apr 16 18:32:20.796775 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:20.796719 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-g54vn" Apr 16 18:32:29.732704 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:29.732675 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-78v92" Apr 16 18:32:43.807215 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:32:43.807086 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-q67qs" Apr 16 18:33:16.816208 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:33:16.816171 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-d85f58867-lj8hr"] Apr 16 18:33:16.818287 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:33:16.818265 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-d85f58867-lj8hr" Apr 16 18:33:16.821469 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:33:16.821443 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 16 18:33:16.821606 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:33:16.821546 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-ws25f\"" Apr 16 18:33:16.821606 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:33:16.821546 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 16 18:33:16.821606 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:33:16.821601 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 16 18:33:16.821794 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:33:16.821641 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 16 18:33:16.821794 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:33:16.821711 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 16 18:33:16.827364 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:33:16.827343 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 16 18:33:16.840140 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:33:16.840110 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-d85f58867-lj8hr"] Apr 16 18:33:16.886931 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:33:16.886899 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-x7k7k\" (UniqueName: \"kubernetes.io/projected/d9a09d8f-414f-4cdb-9b64-6912ca4cfff7-kube-api-access-x7k7k\") pod \"telemeter-client-d85f58867-lj8hr\" (UID: \"d9a09d8f-414f-4cdb-9b64-6912ca4cfff7\") " pod="openshift-monitoring/telemeter-client-d85f58867-lj8hr" Apr 16 18:33:16.887117 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:33:16.886948 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d9a09d8f-414f-4cdb-9b64-6912ca4cfff7-telemeter-client-tls\") pod \"telemeter-client-d85f58867-lj8hr\" (UID: \"d9a09d8f-414f-4cdb-9b64-6912ca4cfff7\") " pod="openshift-monitoring/telemeter-client-d85f58867-lj8hr" Apr 16 18:33:16.887117 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:33:16.886967 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/d9a09d8f-414f-4cdb-9b64-6912ca4cfff7-federate-client-tls\") pod \"telemeter-client-d85f58867-lj8hr\" (UID: \"d9a09d8f-414f-4cdb-9b64-6912ca4cfff7\") " pod="openshift-monitoring/telemeter-client-d85f58867-lj8hr" Apr 16 18:33:16.887117 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:33:16.887020 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/d9a09d8f-414f-4cdb-9b64-6912ca4cfff7-secret-telemeter-client\") pod \"telemeter-client-d85f58867-lj8hr\" (UID: \"d9a09d8f-414f-4cdb-9b64-6912ca4cfff7\") " pod="openshift-monitoring/telemeter-client-d85f58867-lj8hr" Apr 16 18:33:16.887117 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:33:16.887064 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9a09d8f-414f-4cdb-9b64-6912ca4cfff7-telemeter-trusted-ca-bundle\") pod 
\"telemeter-client-d85f58867-lj8hr\" (UID: \"d9a09d8f-414f-4cdb-9b64-6912ca4cfff7\") " pod="openshift-monitoring/telemeter-client-d85f58867-lj8hr" Apr 16 18:33:16.887117 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:33:16.887092 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9a09d8f-414f-4cdb-9b64-6912ca4cfff7-serving-certs-ca-bundle\") pod \"telemeter-client-d85f58867-lj8hr\" (UID: \"d9a09d8f-414f-4cdb-9b64-6912ca4cfff7\") " pod="openshift-monitoring/telemeter-client-d85f58867-lj8hr" Apr 16 18:33:16.887273 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:33:16.887122 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d9a09d8f-414f-4cdb-9b64-6912ca4cfff7-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-d85f58867-lj8hr\" (UID: \"d9a09d8f-414f-4cdb-9b64-6912ca4cfff7\") " pod="openshift-monitoring/telemeter-client-d85f58867-lj8hr" Apr 16 18:33:16.887273 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:33:16.887144 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d9a09d8f-414f-4cdb-9b64-6912ca4cfff7-metrics-client-ca\") pod \"telemeter-client-d85f58867-lj8hr\" (UID: \"d9a09d8f-414f-4cdb-9b64-6912ca4cfff7\") " pod="openshift-monitoring/telemeter-client-d85f58867-lj8hr" Apr 16 18:33:16.987470 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:33:16.987434 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x7k7k\" (UniqueName: \"kubernetes.io/projected/d9a09d8f-414f-4cdb-9b64-6912ca4cfff7-kube-api-access-x7k7k\") pod \"telemeter-client-d85f58867-lj8hr\" (UID: \"d9a09d8f-414f-4cdb-9b64-6912ca4cfff7\") " 
pod="openshift-monitoring/telemeter-client-d85f58867-lj8hr" Apr 16 18:33:16.987632 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:33:16.987487 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d9a09d8f-414f-4cdb-9b64-6912ca4cfff7-telemeter-client-tls\") pod \"telemeter-client-d85f58867-lj8hr\" (UID: \"d9a09d8f-414f-4cdb-9b64-6912ca4cfff7\") " pod="openshift-monitoring/telemeter-client-d85f58867-lj8hr" Apr 16 18:33:16.987632 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:33:16.987509 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/d9a09d8f-414f-4cdb-9b64-6912ca4cfff7-federate-client-tls\") pod \"telemeter-client-d85f58867-lj8hr\" (UID: \"d9a09d8f-414f-4cdb-9b64-6912ca4cfff7\") " pod="openshift-monitoring/telemeter-client-d85f58867-lj8hr" Apr 16 18:33:16.987632 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:33:16.987527 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/d9a09d8f-414f-4cdb-9b64-6912ca4cfff7-secret-telemeter-client\") pod \"telemeter-client-d85f58867-lj8hr\" (UID: \"d9a09d8f-414f-4cdb-9b64-6912ca4cfff7\") " pod="openshift-monitoring/telemeter-client-d85f58867-lj8hr" Apr 16 18:33:16.987632 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:33:16.987544 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9a09d8f-414f-4cdb-9b64-6912ca4cfff7-telemeter-trusted-ca-bundle\") pod \"telemeter-client-d85f58867-lj8hr\" (UID: \"d9a09d8f-414f-4cdb-9b64-6912ca4cfff7\") " pod="openshift-monitoring/telemeter-client-d85f58867-lj8hr" Apr 16 18:33:16.987632 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:33:16.987563 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9a09d8f-414f-4cdb-9b64-6912ca4cfff7-serving-certs-ca-bundle\") pod \"telemeter-client-d85f58867-lj8hr\" (UID: \"d9a09d8f-414f-4cdb-9b64-6912ca4cfff7\") " pod="openshift-monitoring/telemeter-client-d85f58867-lj8hr" Apr 16 18:33:16.987632 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:33:16.987603 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d9a09d8f-414f-4cdb-9b64-6912ca4cfff7-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-d85f58867-lj8hr\" (UID: \"d9a09d8f-414f-4cdb-9b64-6912ca4cfff7\") " pod="openshift-monitoring/telemeter-client-d85f58867-lj8hr" Apr 16 18:33:16.987940 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:33:16.987636 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d9a09d8f-414f-4cdb-9b64-6912ca4cfff7-metrics-client-ca\") pod \"telemeter-client-d85f58867-lj8hr\" (UID: \"d9a09d8f-414f-4cdb-9b64-6912ca4cfff7\") " pod="openshift-monitoring/telemeter-client-d85f58867-lj8hr" Apr 16 18:33:16.988505 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:33:16.988465 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9a09d8f-414f-4cdb-9b64-6912ca4cfff7-serving-certs-ca-bundle\") pod \"telemeter-client-d85f58867-lj8hr\" (UID: \"d9a09d8f-414f-4cdb-9b64-6912ca4cfff7\") " pod="openshift-monitoring/telemeter-client-d85f58867-lj8hr" Apr 16 18:33:16.988636 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:33:16.988532 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d9a09d8f-414f-4cdb-9b64-6912ca4cfff7-metrics-client-ca\") pod \"telemeter-client-d85f58867-lj8hr\" (UID: 
\"d9a09d8f-414f-4cdb-9b64-6912ca4cfff7\") " pod="openshift-monitoring/telemeter-client-d85f58867-lj8hr" Apr 16 18:33:16.988636 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:33:16.988571 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9a09d8f-414f-4cdb-9b64-6912ca4cfff7-telemeter-trusted-ca-bundle\") pod \"telemeter-client-d85f58867-lj8hr\" (UID: \"d9a09d8f-414f-4cdb-9b64-6912ca4cfff7\") " pod="openshift-monitoring/telemeter-client-d85f58867-lj8hr" Apr 16 18:33:16.991755 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:33:16.991713 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d9a09d8f-414f-4cdb-9b64-6912ca4cfff7-telemeter-client-tls\") pod \"telemeter-client-d85f58867-lj8hr\" (UID: \"d9a09d8f-414f-4cdb-9b64-6912ca4cfff7\") " pod="openshift-monitoring/telemeter-client-d85f58867-lj8hr" Apr 16 18:33:16.991867 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:33:16.991819 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/d9a09d8f-414f-4cdb-9b64-6912ca4cfff7-secret-telemeter-client\") pod \"telemeter-client-d85f58867-lj8hr\" (UID: \"d9a09d8f-414f-4cdb-9b64-6912ca4cfff7\") " pod="openshift-monitoring/telemeter-client-d85f58867-lj8hr" Apr 16 18:33:16.991867 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:33:16.991850 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/d9a09d8f-414f-4cdb-9b64-6912ca4cfff7-federate-client-tls\") pod \"telemeter-client-d85f58867-lj8hr\" (UID: \"d9a09d8f-414f-4cdb-9b64-6912ca4cfff7\") " pod="openshift-monitoring/telemeter-client-d85f58867-lj8hr" Apr 16 18:33:16.992165 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:33:16.992146 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d9a09d8f-414f-4cdb-9b64-6912ca4cfff7-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-d85f58867-lj8hr\" (UID: \"d9a09d8f-414f-4cdb-9b64-6912ca4cfff7\") " pod="openshift-monitoring/telemeter-client-d85f58867-lj8hr" Apr 16 18:33:16.998397 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:33:16.998365 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7k7k\" (UniqueName: \"kubernetes.io/projected/d9a09d8f-414f-4cdb-9b64-6912ca4cfff7-kube-api-access-x7k7k\") pod \"telemeter-client-d85f58867-lj8hr\" (UID: \"d9a09d8f-414f-4cdb-9b64-6912ca4cfff7\") " pod="openshift-monitoring/telemeter-client-d85f58867-lj8hr" Apr 16 18:33:17.127578 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:33:17.127473 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-d85f58867-lj8hr" Apr 16 18:33:17.260152 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:33:17.260124 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-d85f58867-lj8hr"] Apr 16 18:33:17.963969 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:33:17.963936 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-d85f58867-lj8hr" event={"ID":"d9a09d8f-414f-4cdb-9b64-6912ca4cfff7","Type":"ContainerStarted","Data":"6275215d7692e34cd6e593b45488f94bc5a7e307551046c77f435c915239fae3"} Apr 16 18:33:18.967389 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:33:18.967354 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-d85f58867-lj8hr" event={"ID":"d9a09d8f-414f-4cdb-9b64-6912ca4cfff7","Type":"ContainerStarted","Data":"55fc296d16a4d276ca1ba5c136f8889ecf87baf9c6724a885a9cdaffaf12ece2"} Apr 16 18:33:19.972700 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:33:19.972657 2576 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-monitoring/telemeter-client-d85f58867-lj8hr" event={"ID":"d9a09d8f-414f-4cdb-9b64-6912ca4cfff7","Type":"ContainerStarted","Data":"c8482afb64472cd57c5610ae7bcdbf14da3cb617787667ce0a8c15f115363f8f"}
Apr 16 18:33:19.972700 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:33:19.972694 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-d85f58867-lj8hr" event={"ID":"d9a09d8f-414f-4cdb-9b64-6912ca4cfff7","Type":"ContainerStarted","Data":"7406cc735a06d9d6353b59814625315e3e81370baa06c6674122e39564a400ab"}
Apr 16 18:33:19.999389 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:33:19.999280 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-d85f58867-lj8hr" podStartSLOduration=1.520673719 podStartE2EDuration="3.999265287s" podCreationTimestamp="2026-04-16 18:33:16 +0000 UTC" firstStartedPulling="2026-04-16 18:33:17.263786519 +0000 UTC m=+123.388314160" lastFinishedPulling="2026-04-16 18:33:19.742378087 +0000 UTC m=+125.866905728" observedRunningTime="2026-04-16 18:33:19.99729414 +0000 UTC m=+126.121821804" watchObservedRunningTime="2026-04-16 18:33:19.999265287 +0000 UTC m=+126.123792951"
Apr 16 18:34:13.487821 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:13.487787 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-gvvsb"]
Apr 16 18:34:13.490653 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:13.490636 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gvvsb"
Apr 16 18:34:13.493258 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:13.493241 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 18:34:13.501115 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:13.501090 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gvvsb"]
Apr 16 18:34:13.562041 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:13.562002 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a21bddd4-aa6a-4559-aa1a-16b654ad1b17-kubelet-config\") pod \"global-pull-secret-syncer-gvvsb\" (UID: \"a21bddd4-aa6a-4559-aa1a-16b654ad1b17\") " pod="kube-system/global-pull-secret-syncer-gvvsb"
Apr 16 18:34:13.562243 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:13.562078 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a21bddd4-aa6a-4559-aa1a-16b654ad1b17-original-pull-secret\") pod \"global-pull-secret-syncer-gvvsb\" (UID: \"a21bddd4-aa6a-4559-aa1a-16b654ad1b17\") " pod="kube-system/global-pull-secret-syncer-gvvsb"
Apr 16 18:34:13.562243 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:13.562138 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a21bddd4-aa6a-4559-aa1a-16b654ad1b17-dbus\") pod \"global-pull-secret-syncer-gvvsb\" (UID: \"a21bddd4-aa6a-4559-aa1a-16b654ad1b17\") " pod="kube-system/global-pull-secret-syncer-gvvsb"
Apr 16 18:34:13.663373 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:13.663322 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a21bddd4-aa6a-4559-aa1a-16b654ad1b17-kubelet-config\") pod \"global-pull-secret-syncer-gvvsb\" (UID: \"a21bddd4-aa6a-4559-aa1a-16b654ad1b17\") " pod="kube-system/global-pull-secret-syncer-gvvsb"
Apr 16 18:34:13.663532 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:13.663405 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a21bddd4-aa6a-4559-aa1a-16b654ad1b17-original-pull-secret\") pod \"global-pull-secret-syncer-gvvsb\" (UID: \"a21bddd4-aa6a-4559-aa1a-16b654ad1b17\") " pod="kube-system/global-pull-secret-syncer-gvvsb"
Apr 16 18:34:13.663532 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:13.663434 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a21bddd4-aa6a-4559-aa1a-16b654ad1b17-dbus\") pod \"global-pull-secret-syncer-gvvsb\" (UID: \"a21bddd4-aa6a-4559-aa1a-16b654ad1b17\") " pod="kube-system/global-pull-secret-syncer-gvvsb"
Apr 16 18:34:13.663532 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:13.663452 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a21bddd4-aa6a-4559-aa1a-16b654ad1b17-kubelet-config\") pod \"global-pull-secret-syncer-gvvsb\" (UID: \"a21bddd4-aa6a-4559-aa1a-16b654ad1b17\") " pod="kube-system/global-pull-secret-syncer-gvvsb"
Apr 16 18:34:13.663668 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:13.663584 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a21bddd4-aa6a-4559-aa1a-16b654ad1b17-dbus\") pod \"global-pull-secret-syncer-gvvsb\" (UID: \"a21bddd4-aa6a-4559-aa1a-16b654ad1b17\") " pod="kube-system/global-pull-secret-syncer-gvvsb"
Apr 16 18:34:13.665642 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:13.665624 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a21bddd4-aa6a-4559-aa1a-16b654ad1b17-original-pull-secret\") pod \"global-pull-secret-syncer-gvvsb\" (UID: \"a21bddd4-aa6a-4559-aa1a-16b654ad1b17\") " pod="kube-system/global-pull-secret-syncer-gvvsb"
Apr 16 18:34:13.799407 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:13.799307 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gvvsb"
Apr 16 18:34:13.922438 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:13.922407 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gvvsb"]
Apr 16 18:34:13.925290 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:34:13.925258 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda21bddd4_aa6a_4559_aa1a_16b654ad1b17.slice/crio-e86bacaae352f0d06755d4b54457460820beba5a99c2eea42e0ed62449d03a5f WatchSource:0}: Error finding container e86bacaae352f0d06755d4b54457460820beba5a99c2eea42e0ed62449d03a5f: Status 404 returned error can't find the container with id e86bacaae352f0d06755d4b54457460820beba5a99c2eea42e0ed62449d03a5f
Apr 16 18:34:14.106938 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:14.106848 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gvvsb" event={"ID":"a21bddd4-aa6a-4559-aa1a-16b654ad1b17","Type":"ContainerStarted","Data":"e86bacaae352f0d06755d4b54457460820beba5a99c2eea42e0ed62449d03a5f"}
Apr 16 18:34:18.120121 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:18.120035 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gvvsb" event={"ID":"a21bddd4-aa6a-4559-aa1a-16b654ad1b17","Type":"ContainerStarted","Data":"f71d49eea88fa76b003f26e320f9971fd7187139b38bea8e9ae802218964ebc9"}
Apr 16 18:34:18.140238 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:18.140180 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-gvvsb" podStartSLOduration=1.3213921339999999 podStartE2EDuration="5.140160663s" podCreationTimestamp="2026-04-16 18:34:13 +0000 UTC" firstStartedPulling="2026-04-16 18:34:13.926775874 +0000 UTC m=+180.051303518" lastFinishedPulling="2026-04-16 18:34:17.745544406 +0000 UTC m=+183.870072047" observedRunningTime="2026-04-16 18:34:18.13897418 +0000 UTC m=+184.263501844" watchObservedRunningTime="2026-04-16 18:34:18.140160663 +0000 UTC m=+184.264688373"
Apr 16 18:34:33.332704 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:33.332667 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4sjwh"]
Apr 16 18:34:33.335670 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:33.335655 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4sjwh"
Apr 16 18:34:33.338622 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:33.338598 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 18:34:33.338622 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:33.338618 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 18:34:33.340319 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:33.340295 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-kpx8z\""
Apr 16 18:34:33.345712 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:33.345690 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4sjwh"]
Apr 16 18:34:33.401998 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:33.401963 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d813931f-48b1-4483-94e0-bb33e6c3550c-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4sjwh\" (UID: \"d813931f-48b1-4483-94e0-bb33e6c3550c\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4sjwh"
Apr 16 18:34:33.401998 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:33.402002 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5smwx\" (UniqueName: \"kubernetes.io/projected/d813931f-48b1-4483-94e0-bb33e6c3550c-kube-api-access-5smwx\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4sjwh\" (UID: \"d813931f-48b1-4483-94e0-bb33e6c3550c\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4sjwh"
Apr 16 18:34:33.402221 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:33.402024 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d813931f-48b1-4483-94e0-bb33e6c3550c-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4sjwh\" (UID: \"d813931f-48b1-4483-94e0-bb33e6c3550c\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4sjwh"
Apr 16 18:34:33.502790 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:33.502729 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d813931f-48b1-4483-94e0-bb33e6c3550c-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4sjwh\" (UID: \"d813931f-48b1-4483-94e0-bb33e6c3550c\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4sjwh"
Apr 16 18:34:33.502790 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:33.502793 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5smwx\" (UniqueName: \"kubernetes.io/projected/d813931f-48b1-4483-94e0-bb33e6c3550c-kube-api-access-5smwx\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4sjwh\" (UID: \"d813931f-48b1-4483-94e0-bb33e6c3550c\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4sjwh"
Apr 16 18:34:33.502975 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:33.502815 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d813931f-48b1-4483-94e0-bb33e6c3550c-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4sjwh\" (UID: \"d813931f-48b1-4483-94e0-bb33e6c3550c\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4sjwh"
Apr 16 18:34:33.503090 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:33.503071 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d813931f-48b1-4483-94e0-bb33e6c3550c-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4sjwh\" (UID: \"d813931f-48b1-4483-94e0-bb33e6c3550c\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4sjwh"
Apr 16 18:34:33.503146 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:33.503098 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d813931f-48b1-4483-94e0-bb33e6c3550c-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4sjwh\" (UID: \"d813931f-48b1-4483-94e0-bb33e6c3550c\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4sjwh"
Apr 16 18:34:33.518356 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:33.518324 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5smwx\" (UniqueName: \"kubernetes.io/projected/d813931f-48b1-4483-94e0-bb33e6c3550c-kube-api-access-5smwx\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4sjwh\" (UID: \"d813931f-48b1-4483-94e0-bb33e6c3550c\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4sjwh"
Apr 16 18:34:33.644614 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:33.644521 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4sjwh"
Apr 16 18:34:33.771398 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:33.771371 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4sjwh"]
Apr 16 18:34:33.773165 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:34:33.773134 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd813931f_48b1_4483_94e0_bb33e6c3550c.slice/crio-1014ce9068de3b7e365c813579d0f4aa97f3b054460bc58ad59403f481d5470d WatchSource:0}: Error finding container 1014ce9068de3b7e365c813579d0f4aa97f3b054460bc58ad59403f481d5470d: Status 404 returned error can't find the container with id 1014ce9068de3b7e365c813579d0f4aa97f3b054460bc58ad59403f481d5470d
Apr 16 18:34:34.161340 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:34.161302 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4sjwh" event={"ID":"d813931f-48b1-4483-94e0-bb33e6c3550c","Type":"ContainerStarted","Data":"1014ce9068de3b7e365c813579d0f4aa97f3b054460bc58ad59403f481d5470d"}
Apr 16 18:34:39.175921 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:39.175881 2576 generic.go:358] "Generic (PLEG): container finished" podID="d813931f-48b1-4483-94e0-bb33e6c3550c" containerID="0897bf8037b48dad330e4dde05a0808ae9b885f12d75323794e7f3c944a0951a" exitCode=0
Apr 16 18:34:39.176314 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:39.175931 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4sjwh" event={"ID":"d813931f-48b1-4483-94e0-bb33e6c3550c","Type":"ContainerDied","Data":"0897bf8037b48dad330e4dde05a0808ae9b885f12d75323794e7f3c944a0951a"}
Apr 16 18:34:41.183672 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:41.183626 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4sjwh" event={"ID":"d813931f-48b1-4483-94e0-bb33e6c3550c","Type":"ContainerStarted","Data":"22b5843506784f8531b92e6edd9298406a205301f68d79062a4f43043e79d092"}
Apr 16 18:34:42.187460 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:42.187422 2576 generic.go:358] "Generic (PLEG): container finished" podID="d813931f-48b1-4483-94e0-bb33e6c3550c" containerID="22b5843506784f8531b92e6edd9298406a205301f68d79062a4f43043e79d092" exitCode=0
Apr 16 18:34:42.187865 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:42.187512 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4sjwh" event={"ID":"d813931f-48b1-4483-94e0-bb33e6c3550c","Type":"ContainerDied","Data":"22b5843506784f8531b92e6edd9298406a205301f68d79062a4f43043e79d092"}
Apr 16 18:34:48.207226 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:48.207183 2576 generic.go:358] "Generic (PLEG): container finished" podID="d813931f-48b1-4483-94e0-bb33e6c3550c" containerID="a5df6e5c5e62f7d85df77c9dfe8e34d068ff1c4f2c32da098bd3029479b695d6" exitCode=0
Apr 16 18:34:48.207618 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:48.207271 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4sjwh" event={"ID":"d813931f-48b1-4483-94e0-bb33e6c3550c","Type":"ContainerDied","Data":"a5df6e5c5e62f7d85df77c9dfe8e34d068ff1c4f2c32da098bd3029479b695d6"}
Apr 16 18:34:49.327612 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:49.327589 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4sjwh"
Apr 16 18:34:49.428410 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:49.428371 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5smwx\" (UniqueName: \"kubernetes.io/projected/d813931f-48b1-4483-94e0-bb33e6c3550c-kube-api-access-5smwx\") pod \"d813931f-48b1-4483-94e0-bb33e6c3550c\" (UID: \"d813931f-48b1-4483-94e0-bb33e6c3550c\") "
Apr 16 18:34:49.428588 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:49.428431 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d813931f-48b1-4483-94e0-bb33e6c3550c-bundle\") pod \"d813931f-48b1-4483-94e0-bb33e6c3550c\" (UID: \"d813931f-48b1-4483-94e0-bb33e6c3550c\") "
Apr 16 18:34:49.428588 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:49.428453 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d813931f-48b1-4483-94e0-bb33e6c3550c-util\") pod \"d813931f-48b1-4483-94e0-bb33e6c3550c\" (UID: \"d813931f-48b1-4483-94e0-bb33e6c3550c\") "
Apr 16 18:34:49.429014 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:49.428984 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d813931f-48b1-4483-94e0-bb33e6c3550c-bundle" (OuterVolumeSpecName: "bundle") pod "d813931f-48b1-4483-94e0-bb33e6c3550c" (UID: "d813931f-48b1-4483-94e0-bb33e6c3550c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:34:49.430597 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:49.430570 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d813931f-48b1-4483-94e0-bb33e6c3550c-kube-api-access-5smwx" (OuterVolumeSpecName: "kube-api-access-5smwx") pod "d813931f-48b1-4483-94e0-bb33e6c3550c" (UID: "d813931f-48b1-4483-94e0-bb33e6c3550c"). InnerVolumeSpecName "kube-api-access-5smwx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:34:49.432858 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:49.432838 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d813931f-48b1-4483-94e0-bb33e6c3550c-util" (OuterVolumeSpecName: "util") pod "d813931f-48b1-4483-94e0-bb33e6c3550c" (UID: "d813931f-48b1-4483-94e0-bb33e6c3550c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:34:49.529843 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:49.529736 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d813931f-48b1-4483-94e0-bb33e6c3550c-util\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:34:49.529843 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:49.529802 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5smwx\" (UniqueName: \"kubernetes.io/projected/d813931f-48b1-4483-94e0-bb33e6c3550c-kube-api-access-5smwx\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:34:49.529843 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:49.529813 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d813931f-48b1-4483-94e0-bb33e6c3550c-bundle\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:34:50.217322 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:50.217290 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4sjwh" event={"ID":"d813931f-48b1-4483-94e0-bb33e6c3550c","Type":"ContainerDied","Data":"1014ce9068de3b7e365c813579d0f4aa97f3b054460bc58ad59403f481d5470d"}
Apr 16 18:34:50.217322 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:50.217320 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1014ce9068de3b7e365c813579d0f4aa97f3b054460bc58ad59403f481d5470d"
Apr 16 18:34:50.217523 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:50.217361 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4sjwh"
Apr 16 18:34:55.664520 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:55.664482 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rcjrd"]
Apr 16 18:34:55.665021 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:55.664844 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d813931f-48b1-4483-94e0-bb33e6c3550c" containerName="extract"
Apr 16 18:34:55.665021 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:55.664861 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d813931f-48b1-4483-94e0-bb33e6c3550c" containerName="extract"
Apr 16 18:34:55.665021 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:55.664877 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d813931f-48b1-4483-94e0-bb33e6c3550c" containerName="util"
Apr 16 18:34:55.665021 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:55.664885 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d813931f-48b1-4483-94e0-bb33e6c3550c" containerName="util"
Apr 16 18:34:55.665021 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:55.664902 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d813931f-48b1-4483-94e0-bb33e6c3550c" containerName="pull"
Apr 16 18:34:55.665021 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:55.664910 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d813931f-48b1-4483-94e0-bb33e6c3550c" containerName="pull"
Apr 16 18:34:55.665021 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:55.664964 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d813931f-48b1-4483-94e0-bb33e6c3550c" containerName="extract"
Apr 16 18:34:55.671621 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:55.671597 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rcjrd"
Apr 16 18:34:55.676462 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:55.676436 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 16 18:34:55.677879 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:55.677519 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 16 18:34:55.678086 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:55.678067 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-wlhqf\""
Apr 16 18:34:55.680272 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:55.680256 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 16 18:34:55.716702 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:55.716664 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rcjrd"]
Apr 16 18:34:55.778576 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:55.778536 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/9b8cfe9e-d559-493e-b9ec-8e1315bcd3fd-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-rcjrd\" (UID: \"9b8cfe9e-d559-493e-b9ec-8e1315bcd3fd\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rcjrd"
Apr 16 18:34:55.778576 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:55.778579 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjn9w\" (UniqueName: \"kubernetes.io/projected/9b8cfe9e-d559-493e-b9ec-8e1315bcd3fd-kube-api-access-gjn9w\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-rcjrd\" (UID: \"9b8cfe9e-d559-493e-b9ec-8e1315bcd3fd\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rcjrd"
Apr 16 18:34:55.879169 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:55.879132 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gjn9w\" (UniqueName: \"kubernetes.io/projected/9b8cfe9e-d559-493e-b9ec-8e1315bcd3fd-kube-api-access-gjn9w\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-rcjrd\" (UID: \"9b8cfe9e-d559-493e-b9ec-8e1315bcd3fd\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rcjrd"
Apr 16 18:34:55.879338 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:55.879283 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/9b8cfe9e-d559-493e-b9ec-8e1315bcd3fd-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-rcjrd\" (UID: \"9b8cfe9e-d559-493e-b9ec-8e1315bcd3fd\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rcjrd"
Apr 16 18:34:55.881568 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:55.881547 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/9b8cfe9e-d559-493e-b9ec-8e1315bcd3fd-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-rcjrd\" (UID: \"9b8cfe9e-d559-493e-b9ec-8e1315bcd3fd\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rcjrd"
Apr 16 18:34:55.889935 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:55.889912 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjn9w\" (UniqueName: \"kubernetes.io/projected/9b8cfe9e-d559-493e-b9ec-8e1315bcd3fd-kube-api-access-gjn9w\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-rcjrd\" (UID: \"9b8cfe9e-d559-493e-b9ec-8e1315bcd3fd\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rcjrd"
Apr 16 18:34:55.981005 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:55.980977 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rcjrd"
Apr 16 18:34:56.112401 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:56.112365 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rcjrd"]
Apr 16 18:34:56.116347 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:34:56.116316 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b8cfe9e_d559_493e_b9ec_8e1315bcd3fd.slice/crio-042705b5d8d6b60864c38116ca450e40ddfdfdbc73903eb4c0929d6231b715d9 WatchSource:0}: Error finding container 042705b5d8d6b60864c38116ca450e40ddfdfdbc73903eb4c0929d6231b715d9: Status 404 returned error can't find the container with id 042705b5d8d6b60864c38116ca450e40ddfdfdbc73903eb4c0929d6231b715d9
Apr 16 18:34:56.234599 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:56.234519 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rcjrd" event={"ID":"9b8cfe9e-d559-493e-b9ec-8e1315bcd3fd","Type":"ContainerStarted","Data":"042705b5d8d6b60864c38116ca450e40ddfdfdbc73903eb4c0929d6231b715d9"}
Apr 16 18:34:59.244582 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:59.244550 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rcjrd" event={"ID":"9b8cfe9e-d559-493e-b9ec-8e1315bcd3fd","Type":"ContainerStarted","Data":"91247bfed9eeab9dd870fb27443aa29f20c77619664397b2098fea00b52231e5"}
Apr 16 18:34:59.712336 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:59.712300 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-5bwxf"]
Apr 16 18:34:59.715390 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:59.715371 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-5bwxf"
Apr 16 18:34:59.718134 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:59.718106 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 16 18:34:59.718262 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:59.718106 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 16 18:34:59.718321 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:59.718257 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-rtmwk\""
Apr 16 18:34:59.726567 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:59.726522 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-5bwxf"]
Apr 16 18:34:59.908155 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:59.908113 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mp5l\" (UniqueName: \"kubernetes.io/projected/ff6adbd4-111e-45a8-bf1b-c671a00e9faf-kube-api-access-2mp5l\") pod \"keda-operator-ffbb595cb-5bwxf\" (UID: \"ff6adbd4-111e-45a8-bf1b-c671a00e9faf\") " pod="openshift-keda/keda-operator-ffbb595cb-5bwxf"
Apr 16 18:34:59.908155 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:59.908157 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/ff6adbd4-111e-45a8-bf1b-c671a00e9faf-cabundle0\") pod \"keda-operator-ffbb595cb-5bwxf\" (UID: \"ff6adbd4-111e-45a8-bf1b-c671a00e9faf\") " pod="openshift-keda/keda-operator-ffbb595cb-5bwxf"
Apr 16 18:34:59.908385 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:34:59.908176 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ff6adbd4-111e-45a8-bf1b-c671a00e9faf-certificates\") pod \"keda-operator-ffbb595cb-5bwxf\" (UID: \"ff6adbd4-111e-45a8-bf1b-c671a00e9faf\") " pod="openshift-keda/keda-operator-ffbb595cb-5bwxf"
Apr 16 18:35:00.009432 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:00.009335 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mp5l\" (UniqueName: \"kubernetes.io/projected/ff6adbd4-111e-45a8-bf1b-c671a00e9faf-kube-api-access-2mp5l\") pod \"keda-operator-ffbb595cb-5bwxf\" (UID: \"ff6adbd4-111e-45a8-bf1b-c671a00e9faf\") " pod="openshift-keda/keda-operator-ffbb595cb-5bwxf"
Apr 16 18:35:00.009432 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:00.009392 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/ff6adbd4-111e-45a8-bf1b-c671a00e9faf-cabundle0\") pod \"keda-operator-ffbb595cb-5bwxf\" (UID: \"ff6adbd4-111e-45a8-bf1b-c671a00e9faf\") " pod="openshift-keda/keda-operator-ffbb595cb-5bwxf"
Apr 16 18:35:00.009432 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:00.009421 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ff6adbd4-111e-45a8-bf1b-c671a00e9faf-certificates\") pod \"keda-operator-ffbb595cb-5bwxf\" (UID: \"ff6adbd4-111e-45a8-bf1b-c671a00e9faf\") " pod="openshift-keda/keda-operator-ffbb595cb-5bwxf"
Apr 16 18:35:00.009729 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:35:00.009531 2576 secret.go:281] references non-existent secret key: ca.crt
Apr 16 18:35:00.009729 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:35:00.009547 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 18:35:00.009729 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:35:00.009558 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-5bwxf: references non-existent secret key: ca.crt
Apr 16 18:35:00.009729 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:35:00.009621 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff6adbd4-111e-45a8-bf1b-c671a00e9faf-certificates podName:ff6adbd4-111e-45a8-bf1b-c671a00e9faf nodeName:}" failed. No retries permitted until 2026-04-16 18:35:00.509601025 +0000 UTC m=+226.634128669 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ff6adbd4-111e-45a8-bf1b-c671a00e9faf-certificates") pod "keda-operator-ffbb595cb-5bwxf" (UID: "ff6adbd4-111e-45a8-bf1b-c671a00e9faf") : references non-existent secret key: ca.crt
Apr 16 18:35:00.010127 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:00.010104 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/ff6adbd4-111e-45a8-bf1b-c671a00e9faf-cabundle0\") pod \"keda-operator-ffbb595cb-5bwxf\" (UID: \"ff6adbd4-111e-45a8-bf1b-c671a00e9faf\") " pod="openshift-keda/keda-operator-ffbb595cb-5bwxf"
Apr 16 18:35:00.034305 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:00.034273 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mp5l\" (UniqueName: \"kubernetes.io/projected/ff6adbd4-111e-45a8-bf1b-c671a00e9faf-kube-api-access-2mp5l\") pod \"keda-operator-ffbb595cb-5bwxf\" (UID: \"ff6adbd4-111e-45a8-bf1b-c671a00e9faf\") " pod="openshift-keda/keda-operator-ffbb595cb-5bwxf"
Apr 16 18:35:00.054008 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:00.053975 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-kfsn6"]
Apr 16 18:35:00.057057 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:00.057039 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kfsn6"
Apr 16 18:35:00.061917 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:00.061894 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 16 18:35:00.079331 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:00.079297 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-kfsn6"]
Apr 16 18:35:00.109670 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:00.109630 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0fcfb34e-bcb5-4a05-bf01-17848e9a1c44-certificates\") pod \"keda-metrics-apiserver-7c9f485588-kfsn6\" (UID: \"0fcfb34e-bcb5-4a05-bf01-17848e9a1c44\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kfsn6"
Apr 16 18:35:00.109872 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:00.109704 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jndp7\" (UniqueName: \"kubernetes.io/projected/0fcfb34e-bcb5-4a05-bf01-17848e9a1c44-kube-api-access-jndp7\") pod \"keda-metrics-apiserver-7c9f485588-kfsn6\" (UID: \"0fcfb34e-bcb5-4a05-bf01-17848e9a1c44\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kfsn6"
Apr 16 18:35:00.109872 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:00.109774 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/0fcfb34e-bcb5-4a05-bf01-17848e9a1c44-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-kfsn6\" (UID: \"0fcfb34e-bcb5-4a05-bf01-17848e9a1c44\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kfsn6"
Apr 16 18:35:00.210884 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:00.210842 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jndp7\" (UniqueName: \"kubernetes.io/projected/0fcfb34e-bcb5-4a05-bf01-17848e9a1c44-kube-api-access-jndp7\") pod \"keda-metrics-apiserver-7c9f485588-kfsn6\" (UID: \"0fcfb34e-bcb5-4a05-bf01-17848e9a1c44\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kfsn6"
Apr 16 18:35:00.211054 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:00.210921 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/0fcfb34e-bcb5-4a05-bf01-17848e9a1c44-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-kfsn6\" (UID: \"0fcfb34e-bcb5-4a05-bf01-17848e9a1c44\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kfsn6"
Apr 16 18:35:00.211054 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:00.210962 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0fcfb34e-bcb5-4a05-bf01-17848e9a1c44-certificates\") pod \"keda-metrics-apiserver-7c9f485588-kfsn6\" (UID: \"0fcfb34e-bcb5-4a05-bf01-17848e9a1c44\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kfsn6"
Apr 16 18:35:00.211168 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:35:00.211150 2576 secret.go:281] references non-existent secret key: tls.crt
Apr 16 18:35:00.211218 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:35:00.211175 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 18:35:00.211218 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:35:00.211197 2576 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found
Apr 16 18:35:00.211313 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:35:00.211221 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-kfsn6: [references
non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 16 18:35:00.211313 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:35:00.211293 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0fcfb34e-bcb5-4a05-bf01-17848e9a1c44-certificates podName:0fcfb34e-bcb5-4a05-bf01-17848e9a1c44 nodeName:}" failed. No retries permitted until 2026-04-16 18:35:00.711272213 +0000 UTC m=+226.835799871 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/0fcfb34e-bcb5-4a05-bf01-17848e9a1c44-certificates") pod "keda-metrics-apiserver-7c9f485588-kfsn6" (UID: "0fcfb34e-bcb5-4a05-bf01-17848e9a1c44") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 16 18:35:00.211413 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:00.211323 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/0fcfb34e-bcb5-4a05-bf01-17848e9a1c44-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-kfsn6\" (UID: \"0fcfb34e-bcb5-4a05-bf01-17848e9a1c44\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kfsn6" Apr 16 18:35:00.220592 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:00.220564 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jndp7\" (UniqueName: \"kubernetes.io/projected/0fcfb34e-bcb5-4a05-bf01-17848e9a1c44-kube-api-access-jndp7\") pod \"keda-metrics-apiserver-7c9f485588-kfsn6\" (UID: \"0fcfb34e-bcb5-4a05-bf01-17848e9a1c44\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kfsn6" Apr 16 18:35:00.249688 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:00.249658 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rcjrd" Apr 16 18:35:00.275240 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:00.275135 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rcjrd" podStartSLOduration=2.212817453 podStartE2EDuration="5.275118661s" podCreationTimestamp="2026-04-16 18:34:55 +0000 UTC" firstStartedPulling="2026-04-16 18:34:56.118694003 +0000 UTC m=+222.243221644" lastFinishedPulling="2026-04-16 18:34:59.180995209 +0000 UTC m=+225.305522852" observedRunningTime="2026-04-16 18:35:00.274291707 +0000 UTC m=+226.398819371" watchObservedRunningTime="2026-04-16 18:35:00.275118661 +0000 UTC m=+226.399646321" Apr 16 18:35:00.396331 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:00.396298 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-zwcr6"] Apr 16 18:35:00.399755 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:00.399717 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-zwcr6" Apr 16 18:35:00.403093 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:00.403072 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 16 18:35:00.412995 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:00.412968 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/9ea28722-eae4-48ac-ae67-cea51bf90572-certificates\") pod \"keda-admission-cf49989db-zwcr6\" (UID: \"9ea28722-eae4-48ac-ae67-cea51bf90572\") " pod="openshift-keda/keda-admission-cf49989db-zwcr6" Apr 16 18:35:00.413160 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:00.413053 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjjvl\" (UniqueName: \"kubernetes.io/projected/9ea28722-eae4-48ac-ae67-cea51bf90572-kube-api-access-bjjvl\") pod \"keda-admission-cf49989db-zwcr6\" (UID: 
\"9ea28722-eae4-48ac-ae67-cea51bf90572\") " pod="openshift-keda/keda-admission-cf49989db-zwcr6" Apr 16 18:35:00.413160 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:00.413071 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-zwcr6"] Apr 16 18:35:00.513929 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:00.513892 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/9ea28722-eae4-48ac-ae67-cea51bf90572-certificates\") pod \"keda-admission-cf49989db-zwcr6\" (UID: \"9ea28722-eae4-48ac-ae67-cea51bf90572\") " pod="openshift-keda/keda-admission-cf49989db-zwcr6" Apr 16 18:35:00.514132 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:00.513957 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ff6adbd4-111e-45a8-bf1b-c671a00e9faf-certificates\") pod \"keda-operator-ffbb595cb-5bwxf\" (UID: \"ff6adbd4-111e-45a8-bf1b-c671a00e9faf\") " pod="openshift-keda/keda-operator-ffbb595cb-5bwxf" Apr 16 18:35:00.514132 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:00.513979 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjjvl\" (UniqueName: \"kubernetes.io/projected/9ea28722-eae4-48ac-ae67-cea51bf90572-kube-api-access-bjjvl\") pod \"keda-admission-cf49989db-zwcr6\" (UID: \"9ea28722-eae4-48ac-ae67-cea51bf90572\") " pod="openshift-keda/keda-admission-cf49989db-zwcr6" Apr 16 18:35:00.514132 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:35:00.514073 2576 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found Apr 16 18:35:00.514132 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:35:00.514101 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-zwcr6: secret 
"keda-admission-webhooks-certs" not found Apr 16 18:35:00.514338 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:35:00.514154 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9ea28722-eae4-48ac-ae67-cea51bf90572-certificates podName:9ea28722-eae4-48ac-ae67-cea51bf90572 nodeName:}" failed. No retries permitted until 2026-04-16 18:35:01.014135075 +0000 UTC m=+227.138662716 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/9ea28722-eae4-48ac-ae67-cea51bf90572-certificates") pod "keda-admission-cf49989db-zwcr6" (UID: "9ea28722-eae4-48ac-ae67-cea51bf90572") : secret "keda-admission-webhooks-certs" not found Apr 16 18:35:00.514338 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:35:00.514212 2576 secret.go:281] references non-existent secret key: ca.crt Apr 16 18:35:00.514338 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:35:00.514226 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 18:35:00.514338 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:35:00.514238 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-5bwxf: references non-existent secret key: ca.crt Apr 16 18:35:00.514338 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:35:00.514294 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff6adbd4-111e-45a8-bf1b-c671a00e9faf-certificates podName:ff6adbd4-111e-45a8-bf1b-c671a00e9faf nodeName:}" failed. No retries permitted until 2026-04-16 18:35:01.514277987 +0000 UTC m=+227.638805640 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ff6adbd4-111e-45a8-bf1b-c671a00e9faf-certificates") pod "keda-operator-ffbb595cb-5bwxf" (UID: "ff6adbd4-111e-45a8-bf1b-c671a00e9faf") : references non-existent secret key: ca.crt Apr 16 18:35:00.523559 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:00.523525 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjjvl\" (UniqueName: \"kubernetes.io/projected/9ea28722-eae4-48ac-ae67-cea51bf90572-kube-api-access-bjjvl\") pod \"keda-admission-cf49989db-zwcr6\" (UID: \"9ea28722-eae4-48ac-ae67-cea51bf90572\") " pod="openshift-keda/keda-admission-cf49989db-zwcr6" Apr 16 18:35:00.715049 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:00.715009 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0fcfb34e-bcb5-4a05-bf01-17848e9a1c44-certificates\") pod \"keda-metrics-apiserver-7c9f485588-kfsn6\" (UID: \"0fcfb34e-bcb5-4a05-bf01-17848e9a1c44\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kfsn6" Apr 16 18:35:00.715245 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:35:00.715145 2576 secret.go:281] references non-existent secret key: tls.crt Apr 16 18:35:00.715245 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:35:00.715165 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 18:35:00.715245 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:35:00.715187 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-kfsn6: references non-existent secret key: tls.crt Apr 16 18:35:00.715245 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:35:00.715242 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0fcfb34e-bcb5-4a05-bf01-17848e9a1c44-certificates 
podName:0fcfb34e-bcb5-4a05-bf01-17848e9a1c44 nodeName:}" failed. No retries permitted until 2026-04-16 18:35:01.715228337 +0000 UTC m=+227.839755978 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/0fcfb34e-bcb5-4a05-bf01-17848e9a1c44-certificates") pod "keda-metrics-apiserver-7c9f485588-kfsn6" (UID: "0fcfb34e-bcb5-4a05-bf01-17848e9a1c44") : references non-existent secret key: tls.crt Apr 16 18:35:01.016870 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:01.016771 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/9ea28722-eae4-48ac-ae67-cea51bf90572-certificates\") pod \"keda-admission-cf49989db-zwcr6\" (UID: \"9ea28722-eae4-48ac-ae67-cea51bf90572\") " pod="openshift-keda/keda-admission-cf49989db-zwcr6" Apr 16 18:35:01.019268 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:01.019235 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/9ea28722-eae4-48ac-ae67-cea51bf90572-certificates\") pod \"keda-admission-cf49989db-zwcr6\" (UID: \"9ea28722-eae4-48ac-ae67-cea51bf90572\") " pod="openshift-keda/keda-admission-cf49989db-zwcr6" Apr 16 18:35:01.310814 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:01.310709 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-zwcr6" Apr 16 18:35:01.446425 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:01.446388 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-zwcr6"] Apr 16 18:35:01.449849 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:35:01.449813 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ea28722_eae4_48ac_ae67_cea51bf90572.slice/crio-07d4001abeeee453a5db15685d755b77bca80def38f6a31894d0f4a409d0c69b WatchSource:0}: Error finding container 07d4001abeeee453a5db15685d755b77bca80def38f6a31894d0f4a409d0c69b: Status 404 returned error can't find the container with id 07d4001abeeee453a5db15685d755b77bca80def38f6a31894d0f4a409d0c69b Apr 16 18:35:01.520913 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:01.520882 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ff6adbd4-111e-45a8-bf1b-c671a00e9faf-certificates\") pod \"keda-operator-ffbb595cb-5bwxf\" (UID: \"ff6adbd4-111e-45a8-bf1b-c671a00e9faf\") " pod="openshift-keda/keda-operator-ffbb595cb-5bwxf" Apr 16 18:35:01.523301 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:01.523275 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ff6adbd4-111e-45a8-bf1b-c671a00e9faf-certificates\") pod \"keda-operator-ffbb595cb-5bwxf\" (UID: \"ff6adbd4-111e-45a8-bf1b-c671a00e9faf\") " pod="openshift-keda/keda-operator-ffbb595cb-5bwxf" Apr 16 18:35:01.532151 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:01.532118 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-5bwxf" Apr 16 18:35:01.652610 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:01.652577 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-5bwxf"] Apr 16 18:35:01.655811 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:35:01.655781 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff6adbd4_111e_45a8_bf1b_c671a00e9faf.slice/crio-cf22b182099f3a1a0379c1d31ab8965ed5e668eae5c781b741f4cfa13c9d3d94 WatchSource:0}: Error finding container cf22b182099f3a1a0379c1d31ab8965ed5e668eae5c781b741f4cfa13c9d3d94: Status 404 returned error can't find the container with id cf22b182099f3a1a0379c1d31ab8965ed5e668eae5c781b741f4cfa13c9d3d94 Apr 16 18:35:01.722550 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:01.722515 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0fcfb34e-bcb5-4a05-bf01-17848e9a1c44-certificates\") pod \"keda-metrics-apiserver-7c9f485588-kfsn6\" (UID: \"0fcfb34e-bcb5-4a05-bf01-17848e9a1c44\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kfsn6" Apr 16 18:35:01.725017 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:01.724985 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0fcfb34e-bcb5-4a05-bf01-17848e9a1c44-certificates\") pod \"keda-metrics-apiserver-7c9f485588-kfsn6\" (UID: \"0fcfb34e-bcb5-4a05-bf01-17848e9a1c44\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kfsn6" Apr 16 18:35:01.867590 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:01.867492 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kfsn6" Apr 16 18:35:01.992831 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:01.992791 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-kfsn6"] Apr 16 18:35:01.995912 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:35:01.995888 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fcfb34e_bcb5_4a05_bf01_17848e9a1c44.slice/crio-84ddf6cdddaff18f6403f2613a4a851a49d5ca81808bab8f1d366a6ba03bbaab WatchSource:0}: Error finding container 84ddf6cdddaff18f6403f2613a4a851a49d5ca81808bab8f1d366a6ba03bbaab: Status 404 returned error can't find the container with id 84ddf6cdddaff18f6403f2613a4a851a49d5ca81808bab8f1d366a6ba03bbaab Apr 16 18:35:02.255773 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:02.255712 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kfsn6" event={"ID":"0fcfb34e-bcb5-4a05-bf01-17848e9a1c44","Type":"ContainerStarted","Data":"84ddf6cdddaff18f6403f2613a4a851a49d5ca81808bab8f1d366a6ba03bbaab"} Apr 16 18:35:02.256842 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:02.256812 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-zwcr6" event={"ID":"9ea28722-eae4-48ac-ae67-cea51bf90572","Type":"ContainerStarted","Data":"07d4001abeeee453a5db15685d755b77bca80def38f6a31894d0f4a409d0c69b"} Apr 16 18:35:02.258109 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:02.258069 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-5bwxf" event={"ID":"ff6adbd4-111e-45a8-bf1b-c671a00e9faf","Type":"ContainerStarted","Data":"cf22b182099f3a1a0379c1d31ab8965ed5e668eae5c781b741f4cfa13c9d3d94"} Apr 16 18:35:03.262600 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:03.262558 2576 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-keda/keda-admission-cf49989db-zwcr6" event={"ID":"9ea28722-eae4-48ac-ae67-cea51bf90572","Type":"ContainerStarted","Data":"6800d3250992884e90cff008b2381ea0a450c89244155db5582f970c8c147d06"} Apr 16 18:35:03.263101 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:03.262772 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-zwcr6" Apr 16 18:35:03.282397 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:03.282340 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-zwcr6" podStartSLOduration=1.901036607 podStartE2EDuration="3.282320296s" podCreationTimestamp="2026-04-16 18:35:00 +0000 UTC" firstStartedPulling="2026-04-16 18:35:01.451057375 +0000 UTC m=+227.575585015" lastFinishedPulling="2026-04-16 18:35:02.832341049 +0000 UTC m=+228.956868704" observedRunningTime="2026-04-16 18:35:03.280499113 +0000 UTC m=+229.405026777" watchObservedRunningTime="2026-04-16 18:35:03.282320296 +0000 UTC m=+229.406847960" Apr 16 18:35:06.272781 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:06.272726 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kfsn6" event={"ID":"0fcfb34e-bcb5-4a05-bf01-17848e9a1c44","Type":"ContainerStarted","Data":"6a6941f71993d983b640c6c2c909e6cc3e5c57a175b686c62d0ae74004af4a1d"} Apr 16 18:35:06.273223 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:06.272881 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kfsn6" Apr 16 18:35:06.274047 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:06.274023 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-5bwxf" event={"ID":"ff6adbd4-111e-45a8-bf1b-c671a00e9faf","Type":"ContainerStarted","Data":"c166c2882dce71a1855195bdf5c23f05c19f72bc5cbf6cdbe7a88a68b5f2bd5e"} Apr 16 
18:35:06.274203 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:06.274187 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-5bwxf" Apr 16 18:35:06.291485 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:06.291431 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kfsn6" podStartSLOduration=2.670594667 podStartE2EDuration="6.29141708s" podCreationTimestamp="2026-04-16 18:35:00 +0000 UTC" firstStartedPulling="2026-04-16 18:35:01.99735133 +0000 UTC m=+228.121878970" lastFinishedPulling="2026-04-16 18:35:05.618173732 +0000 UTC m=+231.742701383" observedRunningTime="2026-04-16 18:35:06.28978545 +0000 UTC m=+232.414313112" watchObservedRunningTime="2026-04-16 18:35:06.29141708 +0000 UTC m=+232.415944742" Apr 16 18:35:06.307119 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:06.307070 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-5bwxf" podStartSLOduration=3.3462611669999998 podStartE2EDuration="7.307051258s" podCreationTimestamp="2026-04-16 18:34:59 +0000 UTC" firstStartedPulling="2026-04-16 18:35:01.657345698 +0000 UTC m=+227.781873348" lastFinishedPulling="2026-04-16 18:35:05.618135799 +0000 UTC m=+231.742663439" observedRunningTime="2026-04-16 18:35:06.306035037 +0000 UTC m=+232.430562698" watchObservedRunningTime="2026-04-16 18:35:06.307051258 +0000 UTC m=+232.431578920" Apr 16 18:35:17.281576 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:17.281541 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kfsn6" Apr 16 18:35:21.254162 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:21.254123 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rcjrd" Apr 16 18:35:24.268711 ip-10-0-140-154 
kubenswrapper[2576]: I0416 18:35:24.268681 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-zwcr6" Apr 16 18:35:27.279014 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:27.278980 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-5bwxf" Apr 16 18:35:53.624037 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:53.623957 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mvn2s"] Apr 16 18:35:53.632295 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:53.632268 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mvn2s" Apr 16 18:35:53.635086 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:53.635062 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 18:35:53.636255 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:53.636235 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-kpx8z\"" Apr 16 18:35:53.636535 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:53.636324 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 18:35:53.638130 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:53.638104 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mvn2s"] Apr 16 18:35:53.686141 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:53.686102 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/96aac2e4-8cfa-4a7b-8805-9f62c01b9401-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mvn2s\" (UID: \"96aac2e4-8cfa-4a7b-8805-9f62c01b9401\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mvn2s" Apr 16 18:35:53.686141 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:53.686150 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/96aac2e4-8cfa-4a7b-8805-9f62c01b9401-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mvn2s\" (UID: \"96aac2e4-8cfa-4a7b-8805-9f62c01b9401\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mvn2s" Apr 16 18:35:53.686356 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:53.686195 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wlmm\" (UniqueName: \"kubernetes.io/projected/96aac2e4-8cfa-4a7b-8805-9f62c01b9401-kube-api-access-9wlmm\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mvn2s\" (UID: \"96aac2e4-8cfa-4a7b-8805-9f62c01b9401\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mvn2s" Apr 16 18:35:53.787190 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:53.787154 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9wlmm\" (UniqueName: \"kubernetes.io/projected/96aac2e4-8cfa-4a7b-8805-9f62c01b9401-kube-api-access-9wlmm\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mvn2s\" (UID: \"96aac2e4-8cfa-4a7b-8805-9f62c01b9401\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mvn2s" Apr 16 18:35:53.787349 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:53.787214 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/96aac2e4-8cfa-4a7b-8805-9f62c01b9401-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mvn2s\" (UID: \"96aac2e4-8cfa-4a7b-8805-9f62c01b9401\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mvn2s" Apr 16 18:35:53.787349 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:53.787246 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/96aac2e4-8cfa-4a7b-8805-9f62c01b9401-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mvn2s\" (UID: \"96aac2e4-8cfa-4a7b-8805-9f62c01b9401\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mvn2s" Apr 16 18:35:53.787575 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:53.787556 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/96aac2e4-8cfa-4a7b-8805-9f62c01b9401-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mvn2s\" (UID: \"96aac2e4-8cfa-4a7b-8805-9f62c01b9401\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mvn2s" Apr 16 18:35:53.787633 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:53.787616 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/96aac2e4-8cfa-4a7b-8805-9f62c01b9401-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mvn2s\" (UID: \"96aac2e4-8cfa-4a7b-8805-9f62c01b9401\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mvn2s" Apr 16 18:35:53.797399 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:53.797371 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wlmm\" (UniqueName: \"kubernetes.io/projected/96aac2e4-8cfa-4a7b-8805-9f62c01b9401-kube-api-access-9wlmm\") pod 
\"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mvn2s\" (UID: \"96aac2e4-8cfa-4a7b-8805-9f62c01b9401\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mvn2s" Apr 16 18:35:53.943037 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:53.943004 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mvn2s" Apr 16 18:35:54.084923 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:54.084887 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mvn2s"] Apr 16 18:35:54.088404 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:35:54.088375 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96aac2e4_8cfa_4a7b_8805_9f62c01b9401.slice/crio-e0799fce2ac39895b624cf495a54fe0dbba9d432560c8cf60179e4280d10ebdf WatchSource:0}: Error finding container e0799fce2ac39895b624cf495a54fe0dbba9d432560c8cf60179e4280d10ebdf: Status 404 returned error can't find the container with id e0799fce2ac39895b624cf495a54fe0dbba9d432560c8cf60179e4280d10ebdf Apr 16 18:35:54.402421 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:54.402331 2576 generic.go:358] "Generic (PLEG): container finished" podID="96aac2e4-8cfa-4a7b-8805-9f62c01b9401" containerID="d3577e2aab6f47d2d4a8ad171b0b48192347c3d4930de4394532f45b14dc20ea" exitCode=0 Apr 16 18:35:54.402421 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:54.402381 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mvn2s" event={"ID":"96aac2e4-8cfa-4a7b-8805-9f62c01b9401","Type":"ContainerDied","Data":"d3577e2aab6f47d2d4a8ad171b0b48192347c3d4930de4394532f45b14dc20ea"} Apr 16 18:35:54.402421 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:54.402403 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mvn2s" event={"ID":"96aac2e4-8cfa-4a7b-8805-9f62c01b9401","Type":"ContainerStarted","Data":"e0799fce2ac39895b624cf495a54fe0dbba9d432560c8cf60179e4280d10ebdf"} Apr 16 18:35:55.406722 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:55.406687 2576 generic.go:358] "Generic (PLEG): container finished" podID="96aac2e4-8cfa-4a7b-8805-9f62c01b9401" containerID="6cd1d826defe71adc6a4d0cd3b993b1295ceecc21c6dc652e892230e2067ec1b" exitCode=0 Apr 16 18:35:55.407192 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:55.406782 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mvn2s" event={"ID":"96aac2e4-8cfa-4a7b-8805-9f62c01b9401","Type":"ContainerDied","Data":"6cd1d826defe71adc6a4d0cd3b993b1295ceecc21c6dc652e892230e2067ec1b"} Apr 16 18:35:56.411464 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:56.411421 2576 generic.go:358] "Generic (PLEG): container finished" podID="96aac2e4-8cfa-4a7b-8805-9f62c01b9401" containerID="d2704ac43b6d5b6157209cdd38165e0409dcc29d2588e70acb24cbe0a5f687ee" exitCode=0 Apr 16 18:35:56.411884 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:56.411533 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mvn2s" event={"ID":"96aac2e4-8cfa-4a7b-8805-9f62c01b9401","Type":"ContainerDied","Data":"d2704ac43b6d5b6157209cdd38165e0409dcc29d2588e70acb24cbe0a5f687ee"} Apr 16 18:35:57.535147 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:57.535123 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mvn2s" Apr 16 18:35:57.615735 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:57.615691 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wlmm\" (UniqueName: \"kubernetes.io/projected/96aac2e4-8cfa-4a7b-8805-9f62c01b9401-kube-api-access-9wlmm\") pod \"96aac2e4-8cfa-4a7b-8805-9f62c01b9401\" (UID: \"96aac2e4-8cfa-4a7b-8805-9f62c01b9401\") " Apr 16 18:35:57.615899 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:57.615798 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/96aac2e4-8cfa-4a7b-8805-9f62c01b9401-bundle\") pod \"96aac2e4-8cfa-4a7b-8805-9f62c01b9401\" (UID: \"96aac2e4-8cfa-4a7b-8805-9f62c01b9401\") " Apr 16 18:35:57.615899 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:57.615827 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/96aac2e4-8cfa-4a7b-8805-9f62c01b9401-util\") pod \"96aac2e4-8cfa-4a7b-8805-9f62c01b9401\" (UID: \"96aac2e4-8cfa-4a7b-8805-9f62c01b9401\") " Apr 16 18:35:57.616400 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:57.616366 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96aac2e4-8cfa-4a7b-8805-9f62c01b9401-bundle" (OuterVolumeSpecName: "bundle") pod "96aac2e4-8cfa-4a7b-8805-9f62c01b9401" (UID: "96aac2e4-8cfa-4a7b-8805-9f62c01b9401"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:35:57.617805 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:57.617777 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96aac2e4-8cfa-4a7b-8805-9f62c01b9401-kube-api-access-9wlmm" (OuterVolumeSpecName: "kube-api-access-9wlmm") pod "96aac2e4-8cfa-4a7b-8805-9f62c01b9401" (UID: "96aac2e4-8cfa-4a7b-8805-9f62c01b9401"). InnerVolumeSpecName "kube-api-access-9wlmm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:35:57.621323 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:57.621298 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96aac2e4-8cfa-4a7b-8805-9f62c01b9401-util" (OuterVolumeSpecName: "util") pod "96aac2e4-8cfa-4a7b-8805-9f62c01b9401" (UID: "96aac2e4-8cfa-4a7b-8805-9f62c01b9401"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:35:57.716351 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:57.716311 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/96aac2e4-8cfa-4a7b-8805-9f62c01b9401-bundle\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:35:57.716351 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:57.716336 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/96aac2e4-8cfa-4a7b-8805-9f62c01b9401-util\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:35:57.716351 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:57.716346 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9wlmm\" (UniqueName: \"kubernetes.io/projected/96aac2e4-8cfa-4a7b-8805-9f62c01b9401-kube-api-access-9wlmm\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:35:58.418271 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:58.418222 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mvn2s" event={"ID":"96aac2e4-8cfa-4a7b-8805-9f62c01b9401","Type":"ContainerDied","Data":"e0799fce2ac39895b624cf495a54fe0dbba9d432560c8cf60179e4280d10ebdf"} Apr 16 18:35:58.418271 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:58.418267 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0799fce2ac39895b624cf495a54fe0dbba9d432560c8cf60179e4280d10ebdf" Apr 16 18:35:58.418473 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:35:58.418286 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mvn2s" Apr 16 18:36:01.049323 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:01.049290 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nlgf9"] Apr 16 18:36:01.049694 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:01.049553 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="96aac2e4-8cfa-4a7b-8805-9f62c01b9401" containerName="extract" Apr 16 18:36:01.049694 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:01.049564 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="96aac2e4-8cfa-4a7b-8805-9f62c01b9401" containerName="extract" Apr 16 18:36:01.049694 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:01.049578 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="96aac2e4-8cfa-4a7b-8805-9f62c01b9401" containerName="pull" Apr 16 18:36:01.049694 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:01.049583 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="96aac2e4-8cfa-4a7b-8805-9f62c01b9401" containerName="pull" Apr 16 18:36:01.049694 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:01.049591 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="96aac2e4-8cfa-4a7b-8805-9f62c01b9401" containerName="util" Apr 16 18:36:01.049694 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:01.049596 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="96aac2e4-8cfa-4a7b-8805-9f62c01b9401" containerName="util" Apr 16 18:36:01.049694 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:01.049636 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="96aac2e4-8cfa-4a7b-8805-9f62c01b9401" containerName="extract" Apr 16 18:36:01.053658 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:01.053639 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nlgf9" Apr 16 18:36:01.056686 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:01.056662 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-cfbhp\"" Apr 16 18:36:01.056848 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:01.056831 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:36:01.057094 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:01.057082 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 16 18:36:01.067157 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:01.067132 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nlgf9"] Apr 16 18:36:01.140131 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:01.140083 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klc9r\" (UniqueName: \"kubernetes.io/projected/7dcb6ca3-3e71-4efe-a358-644abf051155-kube-api-access-klc9r\") pod 
\"cert-manager-operator-controller-manager-7ccfb878b5-nlgf9\" (UID: \"7dcb6ca3-3e71-4efe-a358-644abf051155\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nlgf9" Apr 16 18:36:01.140320 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:01.140183 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7dcb6ca3-3e71-4efe-a358-644abf051155-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-nlgf9\" (UID: \"7dcb6ca3-3e71-4efe-a358-644abf051155\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nlgf9" Apr 16 18:36:01.241414 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:01.241366 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-klc9r\" (UniqueName: \"kubernetes.io/projected/7dcb6ca3-3e71-4efe-a358-644abf051155-kube-api-access-klc9r\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-nlgf9\" (UID: \"7dcb6ca3-3e71-4efe-a358-644abf051155\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nlgf9" Apr 16 18:36:01.241414 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:01.241424 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7dcb6ca3-3e71-4efe-a358-644abf051155-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-nlgf9\" (UID: \"7dcb6ca3-3e71-4efe-a358-644abf051155\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nlgf9" Apr 16 18:36:01.241786 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:01.241766 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7dcb6ca3-3e71-4efe-a358-644abf051155-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-nlgf9\" (UID: \"7dcb6ca3-3e71-4efe-a358-644abf051155\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nlgf9" Apr 16 18:36:01.250083 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:01.250056 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-klc9r\" (UniqueName: \"kubernetes.io/projected/7dcb6ca3-3e71-4efe-a358-644abf051155-kube-api-access-klc9r\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-nlgf9\" (UID: \"7dcb6ca3-3e71-4efe-a358-644abf051155\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nlgf9" Apr 16 18:36:01.362690 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:01.362593 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nlgf9" Apr 16 18:36:01.503129 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:01.502979 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nlgf9"] Apr 16 18:36:01.506246 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:36:01.506220 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7dcb6ca3_3e71_4efe_a358_644abf051155.slice/crio-fed40fb26814b79a2d4137e091098f54832c83f83471aaa88b28c8e4a07e5d3c WatchSource:0}: Error finding container fed40fb26814b79a2d4137e091098f54832c83f83471aaa88b28c8e4a07e5d3c: Status 404 returned error can't find the container with id fed40fb26814b79a2d4137e091098f54832c83f83471aaa88b28c8e4a07e5d3c Apr 16 18:36:02.431486 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:02.431447 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nlgf9" event={"ID":"7dcb6ca3-3e71-4efe-a358-644abf051155","Type":"ContainerStarted","Data":"fed40fb26814b79a2d4137e091098f54832c83f83471aaa88b28c8e4a07e5d3c"} Apr 16 18:36:03.437336 
ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:03.437302 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nlgf9" event={"ID":"7dcb6ca3-3e71-4efe-a358-644abf051155","Type":"ContainerStarted","Data":"057cc090f55f24a3bac53e114f287bafaf2e955477031cd2c8265807825164b1"} Apr 16 18:36:03.469113 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:03.469063 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nlgf9" podStartSLOduration=0.734484377 podStartE2EDuration="2.469047921s" podCreationTimestamp="2026-04-16 18:36:01 +0000 UTC" firstStartedPulling="2026-04-16 18:36:01.508516469 +0000 UTC m=+287.633044112" lastFinishedPulling="2026-04-16 18:36:03.243080015 +0000 UTC m=+289.367607656" observedRunningTime="2026-04-16 18:36:03.467909876 +0000 UTC m=+289.592437555" watchObservedRunningTime="2026-04-16 18:36:03.469047921 +0000 UTC m=+289.593575584" Apr 16 18:36:05.328004 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:05.327966 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fw5csp"] Apr 16 18:36:05.330321 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:05.330299 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fw5csp" Apr 16 18:36:05.334688 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:05.333405 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 18:36:05.334688 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:05.333439 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-kpx8z\"" Apr 16 18:36:05.334688 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:05.333937 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 18:36:05.341130 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:05.341103 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fw5csp"] Apr 16 18:36:05.474668 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:05.474629 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fw5csp\" (UID: \"2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fw5csp" Apr 16 18:36:05.474882 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:05.474700 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7htx\" (UniqueName: \"kubernetes.io/projected/2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03-kube-api-access-s7htx\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fw5csp\" (UID: \"2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03\") " 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fw5csp" Apr 16 18:36:05.474882 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:05.474796 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fw5csp\" (UID: \"2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fw5csp" Apr 16 18:36:05.575884 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:05.575849 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fw5csp\" (UID: \"2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fw5csp" Apr 16 18:36:05.576037 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:05.575926 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7htx\" (UniqueName: \"kubernetes.io/projected/2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03-kube-api-access-s7htx\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fw5csp\" (UID: \"2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fw5csp" Apr 16 18:36:05.576037 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:05.575959 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fw5csp\" (UID: \"2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03\") " 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fw5csp" Apr 16 18:36:05.576239 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:05.576220 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fw5csp\" (UID: \"2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fw5csp" Apr 16 18:36:05.576306 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:05.576286 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fw5csp\" (UID: \"2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fw5csp" Apr 16 18:36:05.584906 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:05.584839 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7htx\" (UniqueName: \"kubernetes.io/projected/2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03-kube-api-access-s7htx\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fw5csp\" (UID: \"2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fw5csp" Apr 16 18:36:05.641573 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:05.641532 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fw5csp" Apr 16 18:36:05.764431 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:05.764397 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fw5csp"] Apr 16 18:36:05.767433 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:36:05.767408 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2057d92a_bf18_4f3f_aeb7_ec19bb9d1d03.slice/crio-819fa8375fe996f1b6fc9eec2da1bd04ea24d5d4761322543d87a4371a035894 WatchSource:0}: Error finding container 819fa8375fe996f1b6fc9eec2da1bd04ea24d5d4761322543d87a4371a035894: Status 404 returned error can't find the container with id 819fa8375fe996f1b6fc9eec2da1bd04ea24d5d4761322543d87a4371a035894 Apr 16 18:36:06.447989 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:06.447957 2576 generic.go:358] "Generic (PLEG): container finished" podID="2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03" containerID="ffceecb10a7d09a807acbac0b723783f04089e0eaa898e632d80d6fd7c3ddc38" exitCode=0 Apr 16 18:36:06.448368 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:06.448046 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fw5csp" event={"ID":"2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03","Type":"ContainerDied","Data":"ffceecb10a7d09a807acbac0b723783f04089e0eaa898e632d80d6fd7c3ddc38"} Apr 16 18:36:06.448368 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:06.448078 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fw5csp" event={"ID":"2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03","Type":"ContainerStarted","Data":"819fa8375fe996f1b6fc9eec2da1bd04ea24d5d4761322543d87a4371a035894"} Apr 16 18:36:06.950665 ip-10-0-140-154 kubenswrapper[2576]: 
I0416 18:36:06.950630 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-pclnl"] Apr 16 18:36:06.952754 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:06.952729 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-pclnl" Apr 16 18:36:06.955424 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:06.955402 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 16 18:36:06.956555 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:06.956539 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-49swz\"" Apr 16 18:36:06.956555 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:06.956552 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 16 18:36:06.962606 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:06.962577 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-pclnl"] Apr 16 18:36:07.088231 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:07.088195 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5rkn\" (UniqueName: \"kubernetes.io/projected/bbaaa2e3-49e8-4146-a9db-b80bb4bb2d60-kube-api-access-r5rkn\") pod \"cert-manager-webhook-597b96b99b-pclnl\" (UID: \"bbaaa2e3-49e8-4146-a9db-b80bb4bb2d60\") " pod="cert-manager/cert-manager-webhook-597b96b99b-pclnl" Apr 16 18:36:07.088422 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:07.088255 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bbaaa2e3-49e8-4146-a9db-b80bb4bb2d60-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-pclnl\" (UID: 
\"bbaaa2e3-49e8-4146-a9db-b80bb4bb2d60\") " pod="cert-manager/cert-manager-webhook-597b96b99b-pclnl" Apr 16 18:36:07.189238 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:07.189198 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5rkn\" (UniqueName: \"kubernetes.io/projected/bbaaa2e3-49e8-4146-a9db-b80bb4bb2d60-kube-api-access-r5rkn\") pod \"cert-manager-webhook-597b96b99b-pclnl\" (UID: \"bbaaa2e3-49e8-4146-a9db-b80bb4bb2d60\") " pod="cert-manager/cert-manager-webhook-597b96b99b-pclnl" Apr 16 18:36:07.189238 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:07.189248 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bbaaa2e3-49e8-4146-a9db-b80bb4bb2d60-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-pclnl\" (UID: \"bbaaa2e3-49e8-4146-a9db-b80bb4bb2d60\") " pod="cert-manager/cert-manager-webhook-597b96b99b-pclnl" Apr 16 18:36:07.198710 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:07.198665 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5rkn\" (UniqueName: \"kubernetes.io/projected/bbaaa2e3-49e8-4146-a9db-b80bb4bb2d60-kube-api-access-r5rkn\") pod \"cert-manager-webhook-597b96b99b-pclnl\" (UID: \"bbaaa2e3-49e8-4146-a9db-b80bb4bb2d60\") " pod="cert-manager/cert-manager-webhook-597b96b99b-pclnl" Apr 16 18:36:07.198977 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:07.198947 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bbaaa2e3-49e8-4146-a9db-b80bb4bb2d60-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-pclnl\" (UID: \"bbaaa2e3-49e8-4146-a9db-b80bb4bb2d60\") " pod="cert-manager/cert-manager-webhook-597b96b99b-pclnl" Apr 16 18:36:07.275373 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:07.275282 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-pclnl" Apr 16 18:36:07.416353 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:07.416317 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-pclnl"] Apr 16 18:36:07.420161 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:36:07.420131 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbaaa2e3_49e8_4146_a9db_b80bb4bb2d60.slice/crio-863dba15879147f2e80c51e097083ebe6d538968d45fbb7213b0487fdfb52092 WatchSource:0}: Error finding container 863dba15879147f2e80c51e097083ebe6d538968d45fbb7213b0487fdfb52092: Status 404 returned error can't find the container with id 863dba15879147f2e80c51e097083ebe6d538968d45fbb7213b0487fdfb52092 Apr 16 18:36:07.452453 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:07.452404 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-pclnl" event={"ID":"bbaaa2e3-49e8-4146-a9db-b80bb4bb2d60","Type":"ContainerStarted","Data":"863dba15879147f2e80c51e097083ebe6d538968d45fbb7213b0487fdfb52092"} Apr 16 18:36:09.461280 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:09.461237 2576 generic.go:358] "Generic (PLEG): container finished" podID="2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03" containerID="e455df46bf2bd05d30fc800e0ddc6d424f1cd793735d60e3ec0ab2f93a3f2f6e" exitCode=0 Apr 16 18:36:09.461729 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:09.461312 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fw5csp" event={"ID":"2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03","Type":"ContainerDied","Data":"e455df46bf2bd05d30fc800e0ddc6d424f1cd793735d60e3ec0ab2f93a3f2f6e"} Apr 16 18:36:10.468998 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:10.468964 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fw5csp" event={"ID":"2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03","Type":"ContainerStarted","Data":"d58b44b38da23c4d31df4c0753a73f6f107581b7789c606836b55b3ddbd9bfa0"}
Apr 16 18:36:10.493307 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:10.493255 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fw5csp" podStartSLOduration=3.541696503 podStartE2EDuration="5.493238456s" podCreationTimestamp="2026-04-16 18:36:05 +0000 UTC" firstStartedPulling="2026-04-16 18:36:06.449024747 +0000 UTC m=+292.573552388" lastFinishedPulling="2026-04-16 18:36:08.400566691 +0000 UTC m=+294.525094341" observedRunningTime="2026-04-16 18:36:10.490639278 +0000 UTC m=+296.615166940" watchObservedRunningTime="2026-04-16 18:36:10.493238456 +0000 UTC m=+296.617766118"
Apr 16 18:36:11.473802 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:11.473767 2576 generic.go:358] "Generic (PLEG): container finished" podID="2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03" containerID="d58b44b38da23c4d31df4c0753a73f6f107581b7789c606836b55b3ddbd9bfa0" exitCode=0
Apr 16 18:36:11.474216 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:11.473847 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fw5csp" event={"ID":"2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03","Type":"ContainerDied","Data":"d58b44b38da23c4d31df4c0753a73f6f107581b7789c606836b55b3ddbd9bfa0"}
Apr 16 18:36:11.475078 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:11.475056 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-pclnl" event={"ID":"bbaaa2e3-49e8-4146-a9db-b80bb4bb2d60","Type":"ContainerStarted","Data":"f2faf1c224915a22ea107bce9eb205eb6f9e1072405b110a4af415ae7af07aa6"}
Apr 16 18:36:11.475179 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:11.475167 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-pclnl"
Apr 16 18:36:11.544557 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:11.544502 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-pclnl" podStartSLOduration=2.428993168 podStartE2EDuration="5.544486452s" podCreationTimestamp="2026-04-16 18:36:06 +0000 UTC" firstStartedPulling="2026-04-16 18:36:07.422425685 +0000 UTC m=+293.546953330" lastFinishedPulling="2026-04-16 18:36:10.537918968 +0000 UTC m=+296.662446614" observedRunningTime="2026-04-16 18:36:11.543804061 +0000 UTC m=+297.668331724" watchObservedRunningTime="2026-04-16 18:36:11.544486452 +0000 UTC m=+297.669014112"
Apr 16 18:36:12.600103 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:12.600080 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fw5csp"
Apr 16 18:36:12.736557 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:12.736469 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03-bundle\") pod \"2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03\" (UID: \"2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03\") "
Apr 16 18:36:12.736702 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:12.736576 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7htx\" (UniqueName: \"kubernetes.io/projected/2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03-kube-api-access-s7htx\") pod \"2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03\" (UID: \"2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03\") "
Apr 16 18:36:12.736702 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:12.736611 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03-util\") pod \"2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03\" (UID: \"2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03\") "
Apr 16 18:36:12.737007 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:12.736983 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03-bundle" (OuterVolumeSpecName: "bundle") pod "2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03" (UID: "2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:36:12.738650 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:12.738620 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03-kube-api-access-s7htx" (OuterVolumeSpecName: "kube-api-access-s7htx") pod "2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03" (UID: "2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03"). InnerVolumeSpecName "kube-api-access-s7htx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:36:12.741192 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:12.741165 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03-util" (OuterVolumeSpecName: "util") pod "2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03" (UID: "2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:36:12.837356 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:12.837305 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s7htx\" (UniqueName: \"kubernetes.io/projected/2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03-kube-api-access-s7htx\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:36:12.837356 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:12.837348 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03-util\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:36:12.837356 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:12.837360 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03-bundle\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:36:13.483510 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:13.483476 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fw5csp" event={"ID":"2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03","Type":"ContainerDied","Data":"819fa8375fe996f1b6fc9eec2da1bd04ea24d5d4761322543d87a4371a035894"}
Apr 16 18:36:13.483510 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:13.483508 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="819fa8375fe996f1b6fc9eec2da1bd04ea24d5d4761322543d87a4371a035894"
Apr 16 18:36:13.483510 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:13.483515 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fw5csp"
Apr 16 18:36:14.393308 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:14.393276 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal_28c07a342a30c0e354482d7284dcbb2c/kube-rbac-proxy-crio/2.log"
Apr 16 18:36:14.393783 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:14.393729 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal_28c07a342a30c0e354482d7284dcbb2c/kube-rbac-proxy-crio/2.log"
Apr 16 18:36:14.395690 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:14.395674 2576 kubelet.go:1628] "Image garbage collection succeeded"
Apr 16 18:36:17.307309 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:17.307270 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-g2q6r"]
Apr 16 18:36:17.310318 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:17.307531 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03" containerName="util"
Apr 16 18:36:17.310318 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:17.307542 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03" containerName="util"
Apr 16 18:36:17.310318 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:17.307549 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03" containerName="extract"
Apr 16 18:36:17.310318 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:17.307556 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03" containerName="extract"
Apr 16 18:36:17.310318 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:17.307565 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03" containerName="pull"
Apr 16 18:36:17.310318 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:17.307571 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03" containerName="pull"
Apr 16 18:36:17.310318 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:17.307612 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2057d92a-bf18-4f3f-aeb7-ec19bb9d1d03" containerName="extract"
Apr 16 18:36:17.310855 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:17.310839 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-g2q6r"
Apr 16 18:36:17.313535 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:17.313511 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-4qhgx\""
Apr 16 18:36:17.323803 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:17.323773 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-g2q6r"]
Apr 16 18:36:17.469829 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:17.469776 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klvj2\" (UniqueName: \"kubernetes.io/projected/7952f3c3-c731-4614-8184-616a2f4249e5-kube-api-access-klvj2\") pod \"cert-manager-759f64656b-g2q6r\" (UID: \"7952f3c3-c731-4614-8184-616a2f4249e5\") " pod="cert-manager/cert-manager-759f64656b-g2q6r"
Apr 16 18:36:17.469829 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:17.469824 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7952f3c3-c731-4614-8184-616a2f4249e5-bound-sa-token\") pod \"cert-manager-759f64656b-g2q6r\" (UID: \"7952f3c3-c731-4614-8184-616a2f4249e5\") " pod="cert-manager/cert-manager-759f64656b-g2q6r"
Apr 16 18:36:17.480592 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:17.480567 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-pclnl"
Apr 16 18:36:17.570686 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:17.570592 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-klvj2\" (UniqueName: \"kubernetes.io/projected/7952f3c3-c731-4614-8184-616a2f4249e5-kube-api-access-klvj2\") pod \"cert-manager-759f64656b-g2q6r\" (UID: \"7952f3c3-c731-4614-8184-616a2f4249e5\") " pod="cert-manager/cert-manager-759f64656b-g2q6r"
Apr 16 18:36:17.570686 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:17.570630 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7952f3c3-c731-4614-8184-616a2f4249e5-bound-sa-token\") pod \"cert-manager-759f64656b-g2q6r\" (UID: \"7952f3c3-c731-4614-8184-616a2f4249e5\") " pod="cert-manager/cert-manager-759f64656b-g2q6r"
Apr 16 18:36:17.588203 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:17.588174 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7952f3c3-c731-4614-8184-616a2f4249e5-bound-sa-token\") pod \"cert-manager-759f64656b-g2q6r\" (UID: \"7952f3c3-c731-4614-8184-616a2f4249e5\") " pod="cert-manager/cert-manager-759f64656b-g2q6r"
Apr 16 18:36:17.588460 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:17.588437 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-klvj2\" (UniqueName: \"kubernetes.io/projected/7952f3c3-c731-4614-8184-616a2f4249e5-kube-api-access-klvj2\") pod \"cert-manager-759f64656b-g2q6r\" (UID: \"7952f3c3-c731-4614-8184-616a2f4249e5\") " pod="cert-manager/cert-manager-759f64656b-g2q6r"
Apr 16 18:36:17.619430 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:17.619389 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-g2q6r"
Apr 16 18:36:17.760952 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:17.760927 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-g2q6r"]
Apr 16 18:36:17.763476 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:36:17.763450 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7952f3c3_c731_4614_8184_616a2f4249e5.slice/crio-9d663cd9591b40adec22d604a29c9e1b134b0fc520d0001919342144296ec308 WatchSource:0}: Error finding container 9d663cd9591b40adec22d604a29c9e1b134b0fc520d0001919342144296ec308: Status 404 returned error can't find the container with id 9d663cd9591b40adec22d604a29c9e1b134b0fc520d0001919342144296ec308
Apr 16 18:36:17.765531 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:17.765512 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:36:18.499057 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:18.499024 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-g2q6r" event={"ID":"7952f3c3-c731-4614-8184-616a2f4249e5","Type":"ContainerStarted","Data":"1f688e5120350f9434ea8d5c6887ce2ff0ec6478c5fd5c552d9872deb01f1cd9"}
Apr 16 18:36:18.499057 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:18.499059 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-g2q6r" event={"ID":"7952f3c3-c731-4614-8184-616a2f4249e5","Type":"ContainerStarted","Data":"9d663cd9591b40adec22d604a29c9e1b134b0fc520d0001919342144296ec308"}
Apr 16 18:36:18.519907 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:18.519837 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-g2q6r" podStartSLOduration=1.519818834 podStartE2EDuration="1.519818834s" podCreationTimestamp="2026-04-16 18:36:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:36:18.517945499 +0000 UTC m=+304.642473176" watchObservedRunningTime="2026-04-16 18:36:18.519818834 +0000 UTC m=+304.644346496"
Apr 16 18:36:20.532597 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:20.532563 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-2fs8s"]
Apr 16 18:36:20.587810 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:20.587735 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-2fs8s"]
Apr 16 18:36:20.587971 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:20.587889 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-2fs8s"
Apr 16 18:36:20.590813 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:20.590791 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-fm9vf\""
Apr 16 18:36:20.591917 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:20.591898 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:36:20.591983 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:20.591905 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 16 18:36:20.692685 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:20.692646 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a3102384-41f4-424e-969d-4e9dc666a9cf-tmp\") pod \"openshift-lws-operator-bfc7f696d-2fs8s\" (UID: \"a3102384-41f4-424e-969d-4e9dc666a9cf\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-2fs8s"
Apr 16 18:36:20.692875 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:20.692693 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m828w\" (UniqueName: \"kubernetes.io/projected/a3102384-41f4-424e-969d-4e9dc666a9cf-kube-api-access-m828w\") pod \"openshift-lws-operator-bfc7f696d-2fs8s\" (UID: \"a3102384-41f4-424e-969d-4e9dc666a9cf\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-2fs8s"
Apr 16 18:36:20.793453 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:20.793356 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a3102384-41f4-424e-969d-4e9dc666a9cf-tmp\") pod \"openshift-lws-operator-bfc7f696d-2fs8s\" (UID: \"a3102384-41f4-424e-969d-4e9dc666a9cf\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-2fs8s"
Apr 16 18:36:20.793453 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:20.793396 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m828w\" (UniqueName: \"kubernetes.io/projected/a3102384-41f4-424e-969d-4e9dc666a9cf-kube-api-access-m828w\") pod \"openshift-lws-operator-bfc7f696d-2fs8s\" (UID: \"a3102384-41f4-424e-969d-4e9dc666a9cf\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-2fs8s"
Apr 16 18:36:20.793836 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:20.793816 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a3102384-41f4-424e-969d-4e9dc666a9cf-tmp\") pod \"openshift-lws-operator-bfc7f696d-2fs8s\" (UID: \"a3102384-41f4-424e-969d-4e9dc666a9cf\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-2fs8s"
Apr 16 18:36:20.802072 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:20.802039 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m828w\" (UniqueName: \"kubernetes.io/projected/a3102384-41f4-424e-969d-4e9dc666a9cf-kube-api-access-m828w\") pod \"openshift-lws-operator-bfc7f696d-2fs8s\" (UID: \"a3102384-41f4-424e-969d-4e9dc666a9cf\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-2fs8s"
Apr 16 18:36:20.897265 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:20.897222 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-2fs8s"
Apr 16 18:36:21.037674 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:21.037637 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-2fs8s"]
Apr 16 18:36:21.046355 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:36:21.046322 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3102384_41f4_424e_969d_4e9dc666a9cf.slice/crio-8b2f69fde72fae1276217e8284f4460b3d191fb705bffad807e728456788a839 WatchSource:0}: Error finding container 8b2f69fde72fae1276217e8284f4460b3d191fb705bffad807e728456788a839: Status 404 returned error can't find the container with id 8b2f69fde72fae1276217e8284f4460b3d191fb705bffad807e728456788a839
Apr 16 18:36:21.510256 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:21.510212 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-2fs8s" event={"ID":"a3102384-41f4-424e-969d-4e9dc666a9cf","Type":"ContainerStarted","Data":"8b2f69fde72fae1276217e8284f4460b3d191fb705bffad807e728456788a839"}
Apr 16 18:36:23.517891 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:23.517850 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-2fs8s" event={"ID":"a3102384-41f4-424e-969d-4e9dc666a9cf","Type":"ContainerStarted","Data":"02034a7207519afa4220813b5ff3b79d286c01896e0398d7bf87ab998f7e63fb"}
Apr 16 18:36:23.537191 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:23.537139 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-2fs8s" podStartSLOduration=1.672959304 podStartE2EDuration="3.537123637s" podCreationTimestamp="2026-04-16 18:36:20 +0000 UTC" firstStartedPulling="2026-04-16 18:36:21.047777307 +0000 UTC m=+307.172304949" lastFinishedPulling="2026-04-16 18:36:22.911941638 +0000 UTC m=+309.036469282" observedRunningTime="2026-04-16 18:36:23.535253935 +0000 UTC m=+309.659781600" watchObservedRunningTime="2026-04-16 18:36:23.537123637 +0000 UTC m=+309.661651303"
Apr 16 18:36:33.293376 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:33.293341 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352qp9q"]
Apr 16 18:36:33.296602 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:33.296558 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352qp9q"
Apr 16 18:36:33.299231 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:33.299203 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 18:36:33.299376 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:33.299245 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 18:36:33.300237 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:33.300221 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-kpx8z\""
Apr 16 18:36:33.305062 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:33.305039 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352qp9q"]
Apr 16 18:36:33.389345 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:33.389310 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4d5db2d0-9176-43a7-9a13-c8e46299b744-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352qp9q\" (UID: \"4d5db2d0-9176-43a7-9a13-c8e46299b744\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352qp9q"
Apr 16 18:36:33.389345 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:33.389352 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glc67\" (UniqueName: \"kubernetes.io/projected/4d5db2d0-9176-43a7-9a13-c8e46299b744-kube-api-access-glc67\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352qp9q\" (UID: \"4d5db2d0-9176-43a7-9a13-c8e46299b744\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352qp9q"
Apr 16 18:36:33.389565 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:33.389379 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4d5db2d0-9176-43a7-9a13-c8e46299b744-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352qp9q\" (UID: \"4d5db2d0-9176-43a7-9a13-c8e46299b744\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352qp9q"
Apr 16 18:36:33.490027 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:33.489992 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4d5db2d0-9176-43a7-9a13-c8e46299b744-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352qp9q\" (UID: \"4d5db2d0-9176-43a7-9a13-c8e46299b744\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352qp9q"
Apr 16 18:36:33.490027 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:33.490028 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-glc67\" (UniqueName: \"kubernetes.io/projected/4d5db2d0-9176-43a7-9a13-c8e46299b744-kube-api-access-glc67\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352qp9q\" (UID: \"4d5db2d0-9176-43a7-9a13-c8e46299b744\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352qp9q"
Apr 16 18:36:33.490284 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:33.490055 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4d5db2d0-9176-43a7-9a13-c8e46299b744-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352qp9q\" (UID: \"4d5db2d0-9176-43a7-9a13-c8e46299b744\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352qp9q"
Apr 16 18:36:33.490414 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:33.490395 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4d5db2d0-9176-43a7-9a13-c8e46299b744-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352qp9q\" (UID: \"4d5db2d0-9176-43a7-9a13-c8e46299b744\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352qp9q"
Apr 16 18:36:33.490476 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:33.490454 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4d5db2d0-9176-43a7-9a13-c8e46299b744-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352qp9q\" (UID: \"4d5db2d0-9176-43a7-9a13-c8e46299b744\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352qp9q"
Apr 16 18:36:33.498198 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:33.498170 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-glc67\" (UniqueName: \"kubernetes.io/projected/4d5db2d0-9176-43a7-9a13-c8e46299b744-kube-api-access-glc67\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352qp9q\" (UID: \"4d5db2d0-9176-43a7-9a13-c8e46299b744\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352qp9q"
Apr 16 18:36:33.607276 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:33.607182 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352qp9q"
Apr 16 18:36:33.734822 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:33.734786 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352qp9q"]
Apr 16 18:36:33.737462 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:36:33.737433 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d5db2d0_9176_43a7_9a13_c8e46299b744.slice/crio-0336046b7c956ad852a95b063a086878b9593672b3b6f9ebbeb742d190191a72 WatchSource:0}: Error finding container 0336046b7c956ad852a95b063a086878b9593672b3b6f9ebbeb742d190191a72: Status 404 returned error can't find the container with id 0336046b7c956ad852a95b063a086878b9593672b3b6f9ebbeb742d190191a72
Apr 16 18:36:34.552694 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:34.552648 2576 generic.go:358] "Generic (PLEG): container finished" podID="4d5db2d0-9176-43a7-9a13-c8e46299b744" containerID="1b6bb327785713da283f541b733143f8d03eb79e932962d9a19e024153dec09d" exitCode=0
Apr 16 18:36:34.553092 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:34.552731 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352qp9q" event={"ID":"4d5db2d0-9176-43a7-9a13-c8e46299b744","Type":"ContainerDied","Data":"1b6bb327785713da283f541b733143f8d03eb79e932962d9a19e024153dec09d"}
Apr 16 18:36:34.553092 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:34.552791 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352qp9q" event={"ID":"4d5db2d0-9176-43a7-9a13-c8e46299b744","Type":"ContainerStarted","Data":"0336046b7c956ad852a95b063a086878b9593672b3b6f9ebbeb742d190191a72"}
Apr 16 18:36:35.557797 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:35.557686 2576 generic.go:358] "Generic (PLEG): container finished" podID="4d5db2d0-9176-43a7-9a13-c8e46299b744" containerID="def2404fd9882cbe89ac61d55c6deab1fa44670033ec039568bcbfe9e4344670" exitCode=0
Apr 16 18:36:35.558146 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:35.557792 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352qp9q" event={"ID":"4d5db2d0-9176-43a7-9a13-c8e46299b744","Type":"ContainerDied","Data":"def2404fd9882cbe89ac61d55c6deab1fa44670033ec039568bcbfe9e4344670"}
Apr 16 18:36:36.566584 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:36.566549 2576 generic.go:358] "Generic (PLEG): container finished" podID="4d5db2d0-9176-43a7-9a13-c8e46299b744" containerID="27c92b51beb9b7754a800232e8da606ec05b3f7743257accf4d03e74831fc5ae" exitCode=0
Apr 16 18:36:36.566984 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:36.566595 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352qp9q" event={"ID":"4d5db2d0-9176-43a7-9a13-c8e46299b744","Type":"ContainerDied","Data":"27c92b51beb9b7754a800232e8da606ec05b3f7743257accf4d03e74831fc5ae"}
Apr 16 18:36:37.689511 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:37.689488 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352qp9q"
Apr 16 18:36:37.823056 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:37.822959 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4d5db2d0-9176-43a7-9a13-c8e46299b744-bundle\") pod \"4d5db2d0-9176-43a7-9a13-c8e46299b744\" (UID: \"4d5db2d0-9176-43a7-9a13-c8e46299b744\") "
Apr 16 18:36:37.823056 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:37.823035 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4d5db2d0-9176-43a7-9a13-c8e46299b744-util\") pod \"4d5db2d0-9176-43a7-9a13-c8e46299b744\" (UID: \"4d5db2d0-9176-43a7-9a13-c8e46299b744\") "
Apr 16 18:36:37.823277 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:37.823065 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glc67\" (UniqueName: \"kubernetes.io/projected/4d5db2d0-9176-43a7-9a13-c8e46299b744-kube-api-access-glc67\") pod \"4d5db2d0-9176-43a7-9a13-c8e46299b744\" (UID: \"4d5db2d0-9176-43a7-9a13-c8e46299b744\") "
Apr 16 18:36:37.823845 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:37.823822 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d5db2d0-9176-43a7-9a13-c8e46299b744-bundle" (OuterVolumeSpecName: "bundle") pod "4d5db2d0-9176-43a7-9a13-c8e46299b744" (UID: "4d5db2d0-9176-43a7-9a13-c8e46299b744"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:36:37.825153 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:37.825131 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d5db2d0-9176-43a7-9a13-c8e46299b744-kube-api-access-glc67" (OuterVolumeSpecName: "kube-api-access-glc67") pod "4d5db2d0-9176-43a7-9a13-c8e46299b744" (UID: "4d5db2d0-9176-43a7-9a13-c8e46299b744"). InnerVolumeSpecName "kube-api-access-glc67". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:36:37.828520 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:37.828494 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d5db2d0-9176-43a7-9a13-c8e46299b744-util" (OuterVolumeSpecName: "util") pod "4d5db2d0-9176-43a7-9a13-c8e46299b744" (UID: "4d5db2d0-9176-43a7-9a13-c8e46299b744"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:36:37.923906 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:37.923872 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4d5db2d0-9176-43a7-9a13-c8e46299b744-bundle\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:36:37.923906 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:37.923900 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4d5db2d0-9176-43a7-9a13-c8e46299b744-util\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:36:37.923906 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:37.923912 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-glc67\" (UniqueName: \"kubernetes.io/projected/4d5db2d0-9176-43a7-9a13-c8e46299b744-kube-api-access-glc67\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:36:38.574342 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:38.574309 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352qp9q" event={"ID":"4d5db2d0-9176-43a7-9a13-c8e46299b744","Type":"ContainerDied","Data":"0336046b7c956ad852a95b063a086878b9593672b3b6f9ebbeb742d190191a72"}
Apr 16 18:36:38.574342 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:38.574347 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0336046b7c956ad852a95b063a086878b9593672b3b6f9ebbeb742d190191a72"
Apr 16 18:36:38.574545 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:38.574317 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352qp9q"
Apr 16 18:36:48.056726 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:48.056682 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qhdzs"]
Apr 16 18:36:48.057319 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:48.057113 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d5db2d0-9176-43a7-9a13-c8e46299b744" containerName="util"
Apr 16 18:36:48.057319 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:48.057130 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d5db2d0-9176-43a7-9a13-c8e46299b744" containerName="util"
Apr 16 18:36:48.057319 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:48.057149 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d5db2d0-9176-43a7-9a13-c8e46299b744" containerName="extract"
Apr 16 18:36:48.057319 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:48.057156 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d5db2d0-9176-43a7-9a13-c8e46299b744" containerName="extract"
Apr 16 18:36:48.057319 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:48.057181 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d5db2d0-9176-43a7-9a13-c8e46299b744" containerName="pull"
Apr 16 18:36:48.057319 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:48.057189 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d5db2d0-9176-43a7-9a13-c8e46299b744" containerName="pull"
Apr 16 18:36:48.057319 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:48.057249 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="4d5db2d0-9176-43a7-9a13-c8e46299b744" containerName="extract"
Apr 16 18:36:48.061486 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:48.061463 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qhdzs"
Apr 16 18:36:48.080305 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:48.080274 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 18:36:48.080681 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:48.080303 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 18:36:48.080823 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:48.080447 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-kpx8z\""
Apr 16 18:36:48.082323 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:48.082297 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qhdzs"]
Apr 16 18:36:48.095630 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:48.095584 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/95795c59-25e5-4169-a69d-cc4a65fd6619-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qhdzs\" (UID: \"95795c59-25e5-4169-a69d-cc4a65fd6619\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qhdzs"
Apr 16 18:36:48.095819 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:48.095681 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/95795c59-25e5-4169-a69d-cc4a65fd6619-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qhdzs\" (UID: \"95795c59-25e5-4169-a69d-cc4a65fd6619\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qhdzs"
Apr 16 18:36:48.095819 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:48.095725 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrhz8\" (UniqueName: \"kubernetes.io/projected/95795c59-25e5-4169-a69d-cc4a65fd6619-kube-api-access-vrhz8\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qhdzs\" (UID: \"95795c59-25e5-4169-a69d-cc4a65fd6619\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qhdzs"
Apr 16 18:36:48.196417 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:48.196372 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/95795c59-25e5-4169-a69d-cc4a65fd6619-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qhdzs\" (UID: \"95795c59-25e5-4169-a69d-cc4a65fd6619\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qhdzs"
Apr 16 18:36:48.196614 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:48.196434 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vrhz8\" (UniqueName: \"kubernetes.io/projected/95795c59-25e5-4169-a69d-cc4a65fd6619-kube-api-access-vrhz8\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qhdzs\" (UID:
\"95795c59-25e5-4169-a69d-cc4a65fd6619\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qhdzs" Apr 16 18:36:48.196614 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:48.196480 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/95795c59-25e5-4169-a69d-cc4a65fd6619-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qhdzs\" (UID: \"95795c59-25e5-4169-a69d-cc4a65fd6619\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qhdzs" Apr 16 18:36:48.196766 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:48.196731 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/95795c59-25e5-4169-a69d-cc4a65fd6619-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qhdzs\" (UID: \"95795c59-25e5-4169-a69d-cc4a65fd6619\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qhdzs" Apr 16 18:36:48.196879 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:48.196861 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/95795c59-25e5-4169-a69d-cc4a65fd6619-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qhdzs\" (UID: \"95795c59-25e5-4169-a69d-cc4a65fd6619\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qhdzs" Apr 16 18:36:48.218057 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:48.218020 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrhz8\" (UniqueName: \"kubernetes.io/projected/95795c59-25e5-4169-a69d-cc4a65fd6619-kube-api-access-vrhz8\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qhdzs\" (UID: \"95795c59-25e5-4169-a69d-cc4a65fd6619\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qhdzs" Apr 16 18:36:48.370865 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:48.370777 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qhdzs" Apr 16 18:36:48.511073 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:48.511043 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qhdzs"] Apr 16 18:36:48.515595 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:36:48.515567 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95795c59_25e5_4169_a69d_cc4a65fd6619.slice/crio-64d30d54f6f1d79ec0f12784de9d64c53afed98a7f3b628c5576ba08ffcddcfc WatchSource:0}: Error finding container 64d30d54f6f1d79ec0f12784de9d64c53afed98a7f3b628c5576ba08ffcddcfc: Status 404 returned error can't find the container with id 64d30d54f6f1d79ec0f12784de9d64c53afed98a7f3b628c5576ba08ffcddcfc Apr 16 18:36:48.607959 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:48.607919 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qhdzs" event={"ID":"95795c59-25e5-4169-a69d-cc4a65fd6619","Type":"ContainerStarted","Data":"a74ed02854748b5e4012784a7fa34799b68de059f38dbfc4260df844d216d525"} Apr 16 18:36:48.608150 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:48.607968 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qhdzs" event={"ID":"95795c59-25e5-4169-a69d-cc4a65fd6619","Type":"ContainerStarted","Data":"64d30d54f6f1d79ec0f12784de9d64c53afed98a7f3b628c5576ba08ffcddcfc"} Apr 16 18:36:49.612423 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:49.612337 2576 generic.go:358] 
"Generic (PLEG): container finished" podID="95795c59-25e5-4169-a69d-cc4a65fd6619" containerID="a74ed02854748b5e4012784a7fa34799b68de059f38dbfc4260df844d216d525" exitCode=0 Apr 16 18:36:49.612858 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:49.612416 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qhdzs" event={"ID":"95795c59-25e5-4169-a69d-cc4a65fd6619","Type":"ContainerDied","Data":"a74ed02854748b5e4012784a7fa34799b68de059f38dbfc4260df844d216d525"} Apr 16 18:36:50.506658 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:50.506627 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-tlzc7"] Apr 16 18:36:50.509644 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:50.509625 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-tlzc7" Apr 16 18:36:50.515682 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:50.515649 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-lqc7m\"" Apr 16 18:36:50.515893 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:50.515769 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 16 18:36:50.515893 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:50.515787 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 16 18:36:50.527214 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:50.527179 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-tlzc7"] Apr 16 18:36:50.613082 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:50.613042 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-rjzmk\" (UniqueName: \"kubernetes.io/projected/f2d32c7b-bc27-44a8-9390-5214268d376a-kube-api-access-rjzmk\") pod \"servicemesh-operator3-55f49c5f94-tlzc7\" (UID: \"f2d32c7b-bc27-44a8-9390-5214268d376a\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-tlzc7" Apr 16 18:36:50.613475 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:50.613128 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/f2d32c7b-bc27-44a8-9390-5214268d376a-operator-config\") pod \"servicemesh-operator3-55f49c5f94-tlzc7\" (UID: \"f2d32c7b-bc27-44a8-9390-5214268d376a\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-tlzc7" Apr 16 18:36:50.713693 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:50.713659 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rjzmk\" (UniqueName: \"kubernetes.io/projected/f2d32c7b-bc27-44a8-9390-5214268d376a-kube-api-access-rjzmk\") pod \"servicemesh-operator3-55f49c5f94-tlzc7\" (UID: \"f2d32c7b-bc27-44a8-9390-5214268d376a\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-tlzc7" Apr 16 18:36:50.713880 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:50.713718 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/f2d32c7b-bc27-44a8-9390-5214268d376a-operator-config\") pod \"servicemesh-operator3-55f49c5f94-tlzc7\" (UID: \"f2d32c7b-bc27-44a8-9390-5214268d376a\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-tlzc7" Apr 16 18:36:50.716122 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:50.716103 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/f2d32c7b-bc27-44a8-9390-5214268d376a-operator-config\") pod \"servicemesh-operator3-55f49c5f94-tlzc7\" (UID: 
\"f2d32c7b-bc27-44a8-9390-5214268d376a\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-tlzc7" Apr 16 18:36:50.723015 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:50.722986 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjzmk\" (UniqueName: \"kubernetes.io/projected/f2d32c7b-bc27-44a8-9390-5214268d376a-kube-api-access-rjzmk\") pod \"servicemesh-operator3-55f49c5f94-tlzc7\" (UID: \"f2d32c7b-bc27-44a8-9390-5214268d376a\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-tlzc7" Apr 16 18:36:50.818915 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:50.818841 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-tlzc7" Apr 16 18:36:51.021970 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:51.021785 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-tlzc7"] Apr 16 18:36:51.025209 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:36:51.025182 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2d32c7b_bc27_44a8_9390_5214268d376a.slice/crio-0236504cf349fda13900aaa9c51bdceaee30136d30a82cf60efc357b4668ab10 WatchSource:0}: Error finding container 0236504cf349fda13900aaa9c51bdceaee30136d30a82cf60efc357b4668ab10: Status 404 returned error can't find the container with id 0236504cf349fda13900aaa9c51bdceaee30136d30a82cf60efc357b4668ab10 Apr 16 18:36:51.620761 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:51.620708 2576 generic.go:358] "Generic (PLEG): container finished" podID="95795c59-25e5-4169-a69d-cc4a65fd6619" containerID="2b78d91e759ca4f2ddcb91e6af9c197dc52101516bfa11b678749047e2c1b0e0" exitCode=0 Apr 16 18:36:51.621214 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:51.620789 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qhdzs" event={"ID":"95795c59-25e5-4169-a69d-cc4a65fd6619","Type":"ContainerDied","Data":"2b78d91e759ca4f2ddcb91e6af9c197dc52101516bfa11b678749047e2c1b0e0"} Apr 16 18:36:51.622148 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:51.622120 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-tlzc7" event={"ID":"f2d32c7b-bc27-44a8-9390-5214268d376a","Type":"ContainerStarted","Data":"0236504cf349fda13900aaa9c51bdceaee30136d30a82cf60efc357b4668ab10"} Apr 16 18:36:52.627880 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:52.627839 2576 generic.go:358] "Generic (PLEG): container finished" podID="95795c59-25e5-4169-a69d-cc4a65fd6619" containerID="ca2f4549a8a75a7ed710d5e6e840c34c42c81b77bd6ea3da40d7733662b82ce4" exitCode=0 Apr 16 18:36:52.628343 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:52.627936 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qhdzs" event={"ID":"95795c59-25e5-4169-a69d-cc4a65fd6619","Type":"ContainerDied","Data":"ca2f4549a8a75a7ed710d5e6e840c34c42c81b77bd6ea3da40d7733662b82ce4"} Apr 16 18:36:53.949362 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:53.949338 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qhdzs" Apr 16 18:36:54.039146 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:54.039113 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/95795c59-25e5-4169-a69d-cc4a65fd6619-bundle\") pod \"95795c59-25e5-4169-a69d-cc4a65fd6619\" (UID: \"95795c59-25e5-4169-a69d-cc4a65fd6619\") " Apr 16 18:36:54.039309 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:54.039187 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/95795c59-25e5-4169-a69d-cc4a65fd6619-util\") pod \"95795c59-25e5-4169-a69d-cc4a65fd6619\" (UID: \"95795c59-25e5-4169-a69d-cc4a65fd6619\") " Apr 16 18:36:54.039309 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:54.039257 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrhz8\" (UniqueName: \"kubernetes.io/projected/95795c59-25e5-4169-a69d-cc4a65fd6619-kube-api-access-vrhz8\") pod \"95795c59-25e5-4169-a69d-cc4a65fd6619\" (UID: \"95795c59-25e5-4169-a69d-cc4a65fd6619\") " Apr 16 18:36:54.040052 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:54.040026 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95795c59-25e5-4169-a69d-cc4a65fd6619-bundle" (OuterVolumeSpecName: "bundle") pod "95795c59-25e5-4169-a69d-cc4a65fd6619" (UID: "95795c59-25e5-4169-a69d-cc4a65fd6619"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:36:54.041283 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:54.041245 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95795c59-25e5-4169-a69d-cc4a65fd6619-kube-api-access-vrhz8" (OuterVolumeSpecName: "kube-api-access-vrhz8") pod "95795c59-25e5-4169-a69d-cc4a65fd6619" (UID: "95795c59-25e5-4169-a69d-cc4a65fd6619"). InnerVolumeSpecName "kube-api-access-vrhz8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:36:54.140833 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:54.140761 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vrhz8\" (UniqueName: \"kubernetes.io/projected/95795c59-25e5-4169-a69d-cc4a65fd6619-kube-api-access-vrhz8\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:36:54.140833 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:54.140804 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/95795c59-25e5-4169-a69d-cc4a65fd6619-bundle\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:36:54.493089 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:54.493049 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95795c59-25e5-4169-a69d-cc4a65fd6619-util" (OuterVolumeSpecName: "util") pod "95795c59-25e5-4169-a69d-cc4a65fd6619" (UID: "95795c59-25e5-4169-a69d-cc4a65fd6619"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:36:54.543589 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:54.543551 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/95795c59-25e5-4169-a69d-cc4a65fd6619-util\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:36:54.636983 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:54.636947 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qhdzs" event={"ID":"95795c59-25e5-4169-a69d-cc4a65fd6619","Type":"ContainerDied","Data":"64d30d54f6f1d79ec0f12784de9d64c53afed98a7f3b628c5576ba08ffcddcfc"} Apr 16 18:36:54.636983 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:54.636979 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64d30d54f6f1d79ec0f12784de9d64c53afed98a7f3b628c5576ba08ffcddcfc" Apr 16 18:36:54.636983 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:54.636983 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2qhdzs" Apr 16 18:36:54.638758 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:54.638666 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-tlzc7" event={"ID":"f2d32c7b-bc27-44a8-9390-5214268d376a","Type":"ContainerStarted","Data":"d454e71430b19112f7ac308b7c231d1390104936d7e290db2ba6fdbabcffa03d"} Apr 16 18:36:54.638915 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:54.638810 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-tlzc7" Apr 16 18:36:54.676385 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:36:54.676333 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-tlzc7" podStartSLOduration=1.7255719360000001 podStartE2EDuration="4.676315411s" podCreationTimestamp="2026-04-16 18:36:50 +0000 UTC" firstStartedPulling="2026-04-16 18:36:51.028067447 +0000 UTC m=+337.152595092" lastFinishedPulling="2026-04-16 18:36:53.978810912 +0000 UTC m=+340.103338567" observedRunningTime="2026-04-16 18:36:54.673186623 +0000 UTC m=+340.797714288" watchObservedRunningTime="2026-04-16 18:36:54.676315411 +0000 UTC m=+340.800843071" Apr 16 18:37:04.712791 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:04.712687 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-87vdn"] Apr 16 18:37:04.713163 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:04.712968 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="95795c59-25e5-4169-a69d-cc4a65fd6619" containerName="pull" Apr 16 18:37:04.713163 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:04.712979 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="95795c59-25e5-4169-a69d-cc4a65fd6619" containerName="pull" Apr 16 
18:37:04.713163 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:04.712991 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="95795c59-25e5-4169-a69d-cc4a65fd6619" containerName="util" Apr 16 18:37:04.713163 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:04.712996 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="95795c59-25e5-4169-a69d-cc4a65fd6619" containerName="util" Apr 16 18:37:04.713163 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:04.713008 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="95795c59-25e5-4169-a69d-cc4a65fd6619" containerName="extract" Apr 16 18:37:04.713163 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:04.713014 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="95795c59-25e5-4169-a69d-cc4a65fd6619" containerName="extract" Apr 16 18:37:04.713163 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:04.713059 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="95795c59-25e5-4169-a69d-cc4a65fd6619" containerName="extract" Apr 16 18:37:04.715118 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:04.715101 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-87vdn" Apr 16 18:37:04.719261 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:04.719223 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 16 18:37:04.719419 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:04.719334 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 18:37:04.719419 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:04.719368 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 16 18:37:04.719535 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:04.719458 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 16 18:37:04.720605 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:04.720583 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-djg2l\"" Apr 16 18:37:04.720767 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:04.720623 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 18:37:04.720767 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:04.720583 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 16 18:37:04.733018 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:04.732985 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-87vdn"] Apr 16 18:37:04.814876 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:04.814847 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv6pl\" (UniqueName: 
\"kubernetes.io/projected/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6-kube-api-access-zv6pl\") pod \"istiod-openshift-gateway-7cd77c7ffd-87vdn\" (UID: \"5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-87vdn" Apr 16 18:37:04.815093 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:04.814886 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-87vdn\" (UID: \"5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-87vdn" Apr 16 18:37:04.815093 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:04.814910 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-87vdn\" (UID: \"5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-87vdn" Apr 16 18:37:04.815093 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:04.814975 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-87vdn\" (UID: \"5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-87vdn" Apr 16 18:37:04.815093 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:04.815040 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-87vdn\" (UID: 
\"5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-87vdn" Apr 16 18:37:04.815093 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:04.815060 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-87vdn\" (UID: \"5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-87vdn" Apr 16 18:37:04.815093 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:04.815074 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-87vdn\" (UID: \"5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-87vdn" Apr 16 18:37:04.915415 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:04.915357 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zv6pl\" (UniqueName: \"kubernetes.io/projected/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6-kube-api-access-zv6pl\") pod \"istiod-openshift-gateway-7cd77c7ffd-87vdn\" (UID: \"5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-87vdn" Apr 16 18:37:04.915415 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:04.915408 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-87vdn\" (UID: \"5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-87vdn" Apr 16 18:37:04.915811 ip-10-0-140-154 
kubenswrapper[2576]: I0416 18:37:04.915429 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-87vdn\" (UID: \"5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-87vdn" Apr 16 18:37:04.915811 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:04.915473 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-87vdn\" (UID: \"5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-87vdn" Apr 16 18:37:04.915811 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:04.915528 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-87vdn\" (UID: \"5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-87vdn" Apr 16 18:37:04.915811 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:04.915555 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-87vdn\" (UID: \"5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-87vdn" Apr 16 18:37:04.915811 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:04.915582 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6-istio-csr-ca-configmap\") 
pod \"istiod-openshift-gateway-7cd77c7ffd-87vdn\" (UID: \"5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-87vdn" Apr 16 18:37:04.916309 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:04.916284 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-87vdn\" (UID: \"5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-87vdn" Apr 16 18:37:04.918509 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:04.918483 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-87vdn\" (UID: \"5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-87vdn" Apr 16 18:37:04.918618 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:04.918485 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-87vdn\" (UID: \"5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-87vdn" Apr 16 18:37:04.918688 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:04.918661 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-87vdn\" (UID: \"5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-87vdn" Apr 16 18:37:04.918688 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:04.918675 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-87vdn\" (UID: \"5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-87vdn" Apr 16 18:37:04.924157 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:04.924123 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-87vdn\" (UID: \"5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-87vdn" Apr 16 18:37:04.924280 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:04.924220 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv6pl\" (UniqueName: \"kubernetes.io/projected/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6-kube-api-access-zv6pl\") pod \"istiod-openshift-gateway-7cd77c7ffd-87vdn\" (UID: \"5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-87vdn" Apr 16 18:37:05.024928 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:05.024839 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-87vdn" Apr 16 18:37:05.167722 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:05.167680 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-87vdn"] Apr 16 18:37:05.169981 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:37:05.169948 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ec70e0b_ff33_4453_b5eb_a3a49a9c6aa6.slice/crio-e8b0c33f4c0af4632509304b49538151e66a91e408d55466ed480a27c7096880 WatchSource:0}: Error finding container e8b0c33f4c0af4632509304b49538151e66a91e408d55466ed480a27c7096880: Status 404 returned error can't find the container with id e8b0c33f4c0af4632509304b49538151e66a91e408d55466ed480a27c7096880 Apr 16 18:37:05.645707 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:05.645639 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-tlzc7" Apr 16 18:37:05.678723 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:05.678687 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-87vdn" event={"ID":"5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6","Type":"ContainerStarted","Data":"e8b0c33f4c0af4632509304b49538151e66a91e408d55466ed480a27c7096880"} Apr 16 18:37:07.584561 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:07.584531 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 18:37:07.584829 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:07.584604 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 18:37:07.687468 ip-10-0-140-154 
kubenswrapper[2576]: I0416 18:37:07.687419 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-87vdn" event={"ID":"5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6","Type":"ContainerStarted","Data":"dd4dece8118e846c81dbe9c98e047117c8adfc7184f3091ebbb8e4a83ba54d1a"} Apr 16 18:37:07.687662 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:07.687551 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-87vdn" Apr 16 18:37:08.693231 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:08.693206 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-87vdn" Apr 16 18:37:08.716379 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:08.716326 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-87vdn" podStartSLOduration=2.30384459 podStartE2EDuration="4.716309496s" podCreationTimestamp="2026-04-16 18:37:04 +0000 UTC" firstStartedPulling="2026-04-16 18:37:05.171843831 +0000 UTC m=+351.296371473" lastFinishedPulling="2026-04-16 18:37:07.584308738 +0000 UTC m=+353.708836379" observedRunningTime="2026-04-16 18:37:07.716448965 +0000 UTC m=+353.840976639" watchObservedRunningTime="2026-04-16 18:37:08.716309496 +0000 UTC m=+354.840837159" Apr 16 18:37:10.998933 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:10.998892 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fhvkb"] Apr 16 18:37:11.001449 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:11.001431 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fhvkb" Apr 16 18:37:11.004484 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:11.004460 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"openshift-ai-inference-openshift-default-dockercfg-9h5fv\"" Apr 16 18:37:11.021243 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:11.021208 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fhvkb"] Apr 16 18:37:11.077283 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:11.077253 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/406e9ecb-57cc-43df-aaa2-16fd037884da-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fhvkb\" (UID: \"406e9ecb-57cc-43df-aaa2-16fd037884da\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fhvkb" Apr 16 18:37:11.077486 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:11.077288 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vfs4\" (UniqueName: \"kubernetes.io/projected/406e9ecb-57cc-43df-aaa2-16fd037884da-kube-api-access-7vfs4\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fhvkb\" (UID: \"406e9ecb-57cc-43df-aaa2-16fd037884da\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fhvkb" Apr 16 18:37:11.077486 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:11.077393 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/406e9ecb-57cc-43df-aaa2-16fd037884da-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fhvkb\" (UID: \"406e9ecb-57cc-43df-aaa2-16fd037884da\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fhvkb" Apr 16 18:37:11.077486 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:11.077446 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/406e9ecb-57cc-43df-aaa2-16fd037884da-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fhvkb\" (UID: \"406e9ecb-57cc-43df-aaa2-16fd037884da\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fhvkb" Apr 16 18:37:11.077486 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:11.077477 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/406e9ecb-57cc-43df-aaa2-16fd037884da-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fhvkb\" (UID: \"406e9ecb-57cc-43df-aaa2-16fd037884da\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fhvkb" Apr 16 18:37:11.077664 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:11.077502 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/406e9ecb-57cc-43df-aaa2-16fd037884da-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fhvkb\" (UID: \"406e9ecb-57cc-43df-aaa2-16fd037884da\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fhvkb" Apr 16 18:37:11.077664 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:11.077540 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/406e9ecb-57cc-43df-aaa2-16fd037884da-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fhvkb\" (UID: \"406e9ecb-57cc-43df-aaa2-16fd037884da\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fhvkb" Apr 16 18:37:11.077664 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:11.077598 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/406e9ecb-57cc-43df-aaa2-16fd037884da-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fhvkb\" (UID: \"406e9ecb-57cc-43df-aaa2-16fd037884da\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fhvkb" Apr 16 18:37:11.077664 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:11.077641 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/406e9ecb-57cc-43df-aaa2-16fd037884da-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fhvkb\" (UID: \"406e9ecb-57cc-43df-aaa2-16fd037884da\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fhvkb" Apr 16 18:37:11.178781 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:11.178723 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/406e9ecb-57cc-43df-aaa2-16fd037884da-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fhvkb\" (UID: \"406e9ecb-57cc-43df-aaa2-16fd037884da\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fhvkb" Apr 16 18:37:11.178781 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:11.178781 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/406e9ecb-57cc-43df-aaa2-16fd037884da-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fhvkb\" (UID: \"406e9ecb-57cc-43df-aaa2-16fd037884da\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fhvkb" Apr 16 18:37:11.179032 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:11.178809 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/406e9ecb-57cc-43df-aaa2-16fd037884da-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fhvkb\" (UID: \"406e9ecb-57cc-43df-aaa2-16fd037884da\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fhvkb" Apr 16 18:37:11.179032 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:11.178836 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/406e9ecb-57cc-43df-aaa2-16fd037884da-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fhvkb\" (UID: \"406e9ecb-57cc-43df-aaa2-16fd037884da\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fhvkb" Apr 16 18:37:11.179032 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:11.178876 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/406e9ecb-57cc-43df-aaa2-16fd037884da-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fhvkb\" (UID: \"406e9ecb-57cc-43df-aaa2-16fd037884da\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fhvkb" Apr 16 18:37:11.179032 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:11.178915 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/406e9ecb-57cc-43df-aaa2-16fd037884da-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fhvkb\" (UID: \"406e9ecb-57cc-43df-aaa2-16fd037884da\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fhvkb" Apr 16 
18:37:11.179032 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:11.178953 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/406e9ecb-57cc-43df-aaa2-16fd037884da-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fhvkb\" (UID: \"406e9ecb-57cc-43df-aaa2-16fd037884da\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fhvkb" Apr 16 18:37:11.179032 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:11.178977 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7vfs4\" (UniqueName: \"kubernetes.io/projected/406e9ecb-57cc-43df-aaa2-16fd037884da-kube-api-access-7vfs4\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fhvkb\" (UID: \"406e9ecb-57cc-43df-aaa2-16fd037884da\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fhvkb" Apr 16 18:37:11.179280 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:11.179036 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/406e9ecb-57cc-43df-aaa2-16fd037884da-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fhvkb\" (UID: \"406e9ecb-57cc-43df-aaa2-16fd037884da\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fhvkb" Apr 16 18:37:11.179335 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:11.179297 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/406e9ecb-57cc-43df-aaa2-16fd037884da-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fhvkb\" (UID: \"406e9ecb-57cc-43df-aaa2-16fd037884da\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fhvkb" Apr 16 18:37:11.179335 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:11.179322 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/406e9ecb-57cc-43df-aaa2-16fd037884da-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fhvkb\" (UID: \"406e9ecb-57cc-43df-aaa2-16fd037884da\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fhvkb" Apr 16 18:37:11.179442 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:11.179401 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/406e9ecb-57cc-43df-aaa2-16fd037884da-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fhvkb\" (UID: \"406e9ecb-57cc-43df-aaa2-16fd037884da\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fhvkb" Apr 16 18:37:11.179523 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:11.179496 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/406e9ecb-57cc-43df-aaa2-16fd037884da-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fhvkb\" (UID: \"406e9ecb-57cc-43df-aaa2-16fd037884da\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fhvkb" Apr 16 18:37:11.179645 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:11.179623 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/406e9ecb-57cc-43df-aaa2-16fd037884da-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fhvkb\" (UID: \"406e9ecb-57cc-43df-aaa2-16fd037884da\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fhvkb" Apr 16 18:37:11.181199 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:11.181174 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: 
\"kubernetes.io/empty-dir/406e9ecb-57cc-43df-aaa2-16fd037884da-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fhvkb\" (UID: \"406e9ecb-57cc-43df-aaa2-16fd037884da\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fhvkb" Apr 16 18:37:11.181526 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:11.181507 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/406e9ecb-57cc-43df-aaa2-16fd037884da-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fhvkb\" (UID: \"406e9ecb-57cc-43df-aaa2-16fd037884da\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fhvkb" Apr 16 18:37:11.187734 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:11.187710 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/406e9ecb-57cc-43df-aaa2-16fd037884da-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fhvkb\" (UID: \"406e9ecb-57cc-43df-aaa2-16fd037884da\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fhvkb" Apr 16 18:37:11.188013 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:11.187991 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vfs4\" (UniqueName: \"kubernetes.io/projected/406e9ecb-57cc-43df-aaa2-16fd037884da-kube-api-access-7vfs4\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-fhvkb\" (UID: \"406e9ecb-57cc-43df-aaa2-16fd037884da\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fhvkb" Apr 16 18:37:11.314424 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:11.314324 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fhvkb" Apr 16 18:37:11.451241 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:11.451216 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fhvkb"] Apr 16 18:37:11.453973 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:37:11.453942 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod406e9ecb_57cc_43df_aaa2_16fd037884da.slice/crio-9cb2432e36cc417871e349a8d28d780790d813a6151e321f25573d08b2432ebc WatchSource:0}: Error finding container 9cb2432e36cc417871e349a8d28d780790d813a6151e321f25573d08b2432ebc: Status 404 returned error can't find the container with id 9cb2432e36cc417871e349a8d28d780790d813a6151e321f25573d08b2432ebc Apr 16 18:37:11.702467 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:11.702436 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fhvkb" event={"ID":"406e9ecb-57cc-43df-aaa2-16fd037884da","Type":"ContainerStarted","Data":"9cb2432e36cc417871e349a8d28d780790d813a6151e321f25573d08b2432ebc"} Apr 16 18:37:13.685355 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:13.685321 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 18:37:13.685662 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:13.685405 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 18:37:13.685662 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:13.685439 2576 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 18:37:14.713936 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:14.713898 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fhvkb" event={"ID":"406e9ecb-57cc-43df-aaa2-16fd037884da","Type":"ContainerStarted","Data":"10ae6167180443ae8d2cd6273b0d5a7b7775ec8604aac36681e12c4ab3273edf"} Apr 16 18:37:14.741222 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:14.741119 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fhvkb" podStartSLOduration=2.51238344 podStartE2EDuration="4.741103231s" podCreationTimestamp="2026-04-16 18:37:10 +0000 UTC" firstStartedPulling="2026-04-16 18:37:11.456365315 +0000 UTC m=+357.580892956" lastFinishedPulling="2026-04-16 18:37:13.685085107 +0000 UTC m=+359.809612747" observedRunningTime="2026-04-16 18:37:14.739785766 +0000 UTC m=+360.864313468" watchObservedRunningTime="2026-04-16 18:37:14.741103231 +0000 UTC m=+360.865630894" Apr 16 18:37:15.314630 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:15.314587 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fhvkb" Apr 16 18:37:15.319331 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:15.319304 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fhvkb" Apr 16 18:37:15.717349 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:15.717301 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fhvkb" Apr 16 18:37:15.718421 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:15.718402 2576 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-fhvkb" Apr 16 18:37:20.070185 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.070149 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c309ss46"] Apr 16 18:37:20.073274 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.073255 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c309ss46" Apr 16 18:37:20.075969 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.075946 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-kpx8z\"" Apr 16 18:37:20.076102 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.075967 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 18:37:20.077077 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.077056 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 18:37:20.086023 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.085988 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c309ss46"] Apr 16 18:37:20.159148 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.159105 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c713e6d4-7c7c-4393-acab-1fcb409e0bc4-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c309ss46\" (UID: \"c713e6d4-7c7c-4393-acab-1fcb409e0bc4\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c309ss46" Apr 
16 18:37:20.159327 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.159190 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c713e6d4-7c7c-4393-acab-1fcb409e0bc4-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c309ss46\" (UID: \"c713e6d4-7c7c-4393-acab-1fcb409e0bc4\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c309ss46" Apr 16 18:37:20.159327 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.159242 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc4nl\" (UniqueName: \"kubernetes.io/projected/c713e6d4-7c7c-4393-acab-1fcb409e0bc4-kube-api-access-xc4nl\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c309ss46\" (UID: \"c713e6d4-7c7c-4393-acab-1fcb409e0bc4\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c309ss46" Apr 16 18:37:20.181265 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.181229 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec884xcql"] Apr 16 18:37:20.183498 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.183482 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec884xcql" Apr 16 18:37:20.193636 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.193607 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec884xcql"] Apr 16 18:37:20.260442 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.260407 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xc4nl\" (UniqueName: \"kubernetes.io/projected/c713e6d4-7c7c-4393-acab-1fcb409e0bc4-kube-api-access-xc4nl\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c309ss46\" (UID: \"c713e6d4-7c7c-4393-acab-1fcb409e0bc4\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c309ss46" Apr 16 18:37:20.260614 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.260453 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8lcd\" (UniqueName: \"kubernetes.io/projected/1c32b6f5-868e-4ebb-b788-94950f8b8a85-kube-api-access-d8lcd\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec884xcql\" (UID: \"1c32b6f5-868e-4ebb-b788-94950f8b8a85\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec884xcql" Apr 16 18:37:20.260614 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.260504 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1c32b6f5-868e-4ebb-b788-94950f8b8a85-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec884xcql\" (UID: \"1c32b6f5-868e-4ebb-b788-94950f8b8a85\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec884xcql" Apr 16 18:37:20.260614 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.260530 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1c32b6f5-868e-4ebb-b788-94950f8b8a85-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec884xcql\" (UID: \"1c32b6f5-868e-4ebb-b788-94950f8b8a85\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec884xcql" Apr 16 18:37:20.260614 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.260558 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c713e6d4-7c7c-4393-acab-1fcb409e0bc4-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c309ss46\" (UID: \"c713e6d4-7c7c-4393-acab-1fcb409e0bc4\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c309ss46" Apr 16 18:37:20.260851 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.260663 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c713e6d4-7c7c-4393-acab-1fcb409e0bc4-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c309ss46\" (UID: \"c713e6d4-7c7c-4393-acab-1fcb409e0bc4\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c309ss46" Apr 16 18:37:20.260989 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.260974 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c713e6d4-7c7c-4393-acab-1fcb409e0bc4-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c309ss46\" (UID: \"c713e6d4-7c7c-4393-acab-1fcb409e0bc4\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c309ss46" Apr 16 18:37:20.261034 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.261015 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/c713e6d4-7c7c-4393-acab-1fcb409e0bc4-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c309ss46\" (UID: \"c713e6d4-7c7c-4393-acab-1fcb409e0bc4\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c309ss46"
Apr 16 18:37:20.269839 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.269813 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc4nl\" (UniqueName: \"kubernetes.io/projected/c713e6d4-7c7c-4393-acab-1fcb409e0bc4-kube-api-access-xc4nl\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c309ss46\" (UID: \"c713e6d4-7c7c-4393-acab-1fcb409e0bc4\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c309ss46"
Apr 16 18:37:20.272730 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.272703 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503577lg"]
Apr 16 18:37:20.275469 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.275450 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503577lg"
Apr 16 18:37:20.285436 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.285410 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503577lg"]
Apr 16 18:37:20.361325 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.361225 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1c32b6f5-868e-4ebb-b788-94950f8b8a85-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec884xcql\" (UID: \"1c32b6f5-868e-4ebb-b788-94950f8b8a85\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec884xcql"
Apr 16 18:37:20.361325 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.361266 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1c32b6f5-868e-4ebb-b788-94950f8b8a85-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec884xcql\" (UID: \"1c32b6f5-868e-4ebb-b788-94950f8b8a85\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec884xcql"
Apr 16 18:37:20.361325 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.361300 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d4480696-6297-4c24-8cd4-f87074079f87-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503577lg\" (UID: \"d4480696-6297-4c24-8cd4-f87074079f87\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503577lg"
Apr 16 18:37:20.361590 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.361344 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8sh2\" (UniqueName: \"kubernetes.io/projected/d4480696-6297-4c24-8cd4-f87074079f87-kube-api-access-p8sh2\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503577lg\" (UID: \"d4480696-6297-4c24-8cd4-f87074079f87\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503577lg"
Apr 16 18:37:20.361590 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.361376 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d4480696-6297-4c24-8cd4-f87074079f87-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503577lg\" (UID: \"d4480696-6297-4c24-8cd4-f87074079f87\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503577lg"
Apr 16 18:37:20.361590 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.361504 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d8lcd\" (UniqueName: \"kubernetes.io/projected/1c32b6f5-868e-4ebb-b788-94950f8b8a85-kube-api-access-d8lcd\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec884xcql\" (UID: \"1c32b6f5-868e-4ebb-b788-94950f8b8a85\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec884xcql"
Apr 16 18:37:20.361696 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.361681 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1c32b6f5-868e-4ebb-b788-94950f8b8a85-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec884xcql\" (UID: \"1c32b6f5-868e-4ebb-b788-94950f8b8a85\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec884xcql"
Apr 16 18:37:20.361811 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.361707 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1c32b6f5-868e-4ebb-b788-94950f8b8a85-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec884xcql\" (UID: \"1c32b6f5-868e-4ebb-b788-94950f8b8a85\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec884xcql"
Apr 16 18:37:20.372798 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.372755 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brsw46"]
Apr 16 18:37:20.375388 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.375368 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brsw46"
Apr 16 18:37:20.375910 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.375886 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8lcd\" (UniqueName: \"kubernetes.io/projected/1c32b6f5-868e-4ebb-b788-94950f8b8a85-kube-api-access-d8lcd\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec884xcql\" (UID: \"1c32b6f5-868e-4ebb-b788-94950f8b8a85\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec884xcql"
Apr 16 18:37:20.381719 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.381701 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c309ss46"
Apr 16 18:37:20.389688 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.389657 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brsw46"]
Apr 16 18:37:20.462236 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.462163 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdxrh\" (UniqueName: \"kubernetes.io/projected/7619f875-db70-4dd2-97cf-bafbc56c1292-kube-api-access-xdxrh\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brsw46\" (UID: \"7619f875-db70-4dd2-97cf-bafbc56c1292\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brsw46"
Apr 16 18:37:20.462236 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.462216 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p8sh2\" (UniqueName: \"kubernetes.io/projected/d4480696-6297-4c24-8cd4-f87074079f87-kube-api-access-p8sh2\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503577lg\" (UID: \"d4480696-6297-4c24-8cd4-f87074079f87\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503577lg"
Apr 16 18:37:20.462404 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.462310 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d4480696-6297-4c24-8cd4-f87074079f87-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503577lg\" (UID: \"d4480696-6297-4c24-8cd4-f87074079f87\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503577lg"
Apr 16 18:37:20.462404 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.462382 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7619f875-db70-4dd2-97cf-bafbc56c1292-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brsw46\" (UID: \"7619f875-db70-4dd2-97cf-bafbc56c1292\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brsw46"
Apr 16 18:37:20.462497 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.462410 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7619f875-db70-4dd2-97cf-bafbc56c1292-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brsw46\" (UID: \"7619f875-db70-4dd2-97cf-bafbc56c1292\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brsw46"
Apr 16 18:37:20.462497 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.462478 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d4480696-6297-4c24-8cd4-f87074079f87-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503577lg\" (UID: \"d4480696-6297-4c24-8cd4-f87074079f87\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503577lg"
Apr 16 18:37:20.462822 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.462797 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d4480696-6297-4c24-8cd4-f87074079f87-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503577lg\" (UID: \"d4480696-6297-4c24-8cd4-f87074079f87\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503577lg"
Apr 16 18:37:20.463086 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.462959 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d4480696-6297-4c24-8cd4-f87074079f87-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503577lg\" (UID: \"d4480696-6297-4c24-8cd4-f87074079f87\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503577lg"
Apr 16 18:37:20.475003 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.474969 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8sh2\" (UniqueName: \"kubernetes.io/projected/d4480696-6297-4c24-8cd4-f87074079f87-kube-api-access-p8sh2\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503577lg\" (UID: \"d4480696-6297-4c24-8cd4-f87074079f87\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503577lg"
Apr 16 18:37:20.493533 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.493503 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec884xcql"
Apr 16 18:37:20.512152 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.512128 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c309ss46"]
Apr 16 18:37:20.514029 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:37:20.514005 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc713e6d4_7c7c_4393_acab_1fcb409e0bc4.slice/crio-19a4b4cea97abd855bb6be752f3b64f6754d54c5dbe03fe53e794556a751bab5 WatchSource:0}: Error finding container 19a4b4cea97abd855bb6be752f3b64f6754d54c5dbe03fe53e794556a751bab5: Status 404 returned error can't find the container with id 19a4b4cea97abd855bb6be752f3b64f6754d54c5dbe03fe53e794556a751bab5
Apr 16 18:37:20.563205 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.563032 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xdxrh\" (UniqueName: \"kubernetes.io/projected/7619f875-db70-4dd2-97cf-bafbc56c1292-kube-api-access-xdxrh\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brsw46\" (UID: \"7619f875-db70-4dd2-97cf-bafbc56c1292\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brsw46"
Apr 16 18:37:20.563205 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.563105 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7619f875-db70-4dd2-97cf-bafbc56c1292-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brsw46\" (UID: \"7619f875-db70-4dd2-97cf-bafbc56c1292\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brsw46"
Apr 16 18:37:20.563205 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.563141 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7619f875-db70-4dd2-97cf-bafbc56c1292-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brsw46\" (UID: \"7619f875-db70-4dd2-97cf-bafbc56c1292\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brsw46"
Apr 16 18:37:20.563658 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.563633 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7619f875-db70-4dd2-97cf-bafbc56c1292-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brsw46\" (UID: \"7619f875-db70-4dd2-97cf-bafbc56c1292\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brsw46"
Apr 16 18:37:20.563658 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.563653 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7619f875-db70-4dd2-97cf-bafbc56c1292-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brsw46\" (UID: \"7619f875-db70-4dd2-97cf-bafbc56c1292\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brsw46"
Apr 16 18:37:20.572529 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.572489 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdxrh\" (UniqueName: \"kubernetes.io/projected/7619f875-db70-4dd2-97cf-bafbc56c1292-kube-api-access-xdxrh\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brsw46\" (UID: \"7619f875-db70-4dd2-97cf-bafbc56c1292\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brsw46"
Apr 16 18:37:20.585311 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.585252 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503577lg"
Apr 16 18:37:20.630427 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.630212 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec884xcql"]
Apr 16 18:37:20.653803 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:37:20.653736 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c32b6f5_868e_4ebb_b788_94950f8b8a85.slice/crio-e324ab27e407cd242104356903c9be22d3b96d72665559726578794e0c77e8ef WatchSource:0}: Error finding container e324ab27e407cd242104356903c9be22d3b96d72665559726578794e0c77e8ef: Status 404 returned error can't find the container with id e324ab27e407cd242104356903c9be22d3b96d72665559726578794e0c77e8ef
Apr 16 18:37:20.704807 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.704781 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brsw46"
Apr 16 18:37:20.724991 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.724965 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503577lg"]
Apr 16 18:37:20.726502 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:37:20.726462 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4480696_6297_4c24_8cd4_f87074079f87.slice/crio-ab24326c45811ded69df319b9845baa92afa79edfcbc49f52e8641398a9ddb51 WatchSource:0}: Error finding container ab24326c45811ded69df319b9845baa92afa79edfcbc49f52e8641398a9ddb51: Status 404 returned error can't find the container with id ab24326c45811ded69df319b9845baa92afa79edfcbc49f52e8641398a9ddb51
Apr 16 18:37:20.736519 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.736478 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503577lg" event={"ID":"d4480696-6297-4c24-8cd4-f87074079f87","Type":"ContainerStarted","Data":"ab24326c45811ded69df319b9845baa92afa79edfcbc49f52e8641398a9ddb51"}
Apr 16 18:37:20.738079 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.738000 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec884xcql" event={"ID":"1c32b6f5-868e-4ebb-b788-94950f8b8a85","Type":"ContainerStarted","Data":"0a9f57c9f93b08debb30a46cb6607d06d8294e23b0bb6335adb03619f228322c"}
Apr 16 18:37:20.738079 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.738052 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec884xcql" event={"ID":"1c32b6f5-868e-4ebb-b788-94950f8b8a85","Type":"ContainerStarted","Data":"e324ab27e407cd242104356903c9be22d3b96d72665559726578794e0c77e8ef"}
Apr 16 18:37:20.739519 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.739487 2576 generic.go:358] "Generic (PLEG): container finished" podID="c713e6d4-7c7c-4393-acab-1fcb409e0bc4" containerID="8851de2da3d2c96d954304d4d6b9ad4453b1566848e67fcab5061e4982ef156a" exitCode=0
Apr 16 18:37:20.739635 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.739571 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c309ss46" event={"ID":"c713e6d4-7c7c-4393-acab-1fcb409e0bc4","Type":"ContainerDied","Data":"8851de2da3d2c96d954304d4d6b9ad4453b1566848e67fcab5061e4982ef156a"}
Apr 16 18:37:20.739635 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.739615 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c309ss46" event={"ID":"c713e6d4-7c7c-4393-acab-1fcb409e0bc4","Type":"ContainerStarted","Data":"19a4b4cea97abd855bb6be752f3b64f6754d54c5dbe03fe53e794556a751bab5"}
Apr 16 18:37:20.842237 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:20.842097 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brsw46"]
Apr 16 18:37:20.844811 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:37:20.844782 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7619f875_db70_4dd2_97cf_bafbc56c1292.slice/crio-53099e5f5acd1f2b763d9048638081f18dfe7f62edf31339ad28945239b802bf WatchSource:0}: Error finding container 53099e5f5acd1f2b763d9048638081f18dfe7f62edf31339ad28945239b802bf: Status 404 returned error can't find the container with id 53099e5f5acd1f2b763d9048638081f18dfe7f62edf31339ad28945239b802bf
Apr 16 18:37:21.744129 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:21.744093 2576 generic.go:358] "Generic (PLEG): container finished" podID="d4480696-6297-4c24-8cd4-f87074079f87" containerID="d97a58a1840b385cbfe6f54ee8925af4a8827ac27cb1427490faa9bfc04652a9" exitCode=0
Apr 16 18:37:21.744588 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:21.744181 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503577lg" event={"ID":"d4480696-6297-4c24-8cd4-f87074079f87","Type":"ContainerDied","Data":"d97a58a1840b385cbfe6f54ee8925af4a8827ac27cb1427490faa9bfc04652a9"}
Apr 16 18:37:21.745641 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:21.745618 2576 generic.go:358] "Generic (PLEG): container finished" podID="1c32b6f5-868e-4ebb-b788-94950f8b8a85" containerID="0a9f57c9f93b08debb30a46cb6607d06d8294e23b0bb6335adb03619f228322c" exitCode=0
Apr 16 18:37:21.745702 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:21.745686 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec884xcql" event={"ID":"1c32b6f5-868e-4ebb-b788-94950f8b8a85","Type":"ContainerDied","Data":"0a9f57c9f93b08debb30a46cb6607d06d8294e23b0bb6335adb03619f228322c"}
Apr 16 18:37:21.747361 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:21.747337 2576 generic.go:358] "Generic (PLEG): container finished" podID="c713e6d4-7c7c-4393-acab-1fcb409e0bc4" containerID="eb88f013890d92cbae6d5aae7a653f9357519d08375772956537f38cc16c802b" exitCode=0
Apr 16 18:37:21.747442 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:21.747419 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c309ss46" event={"ID":"c713e6d4-7c7c-4393-acab-1fcb409e0bc4","Type":"ContainerDied","Data":"eb88f013890d92cbae6d5aae7a653f9357519d08375772956537f38cc16c802b"}
Apr 16 18:37:21.748986 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:21.748959 2576 generic.go:358] "Generic (PLEG): container finished" podID="7619f875-db70-4dd2-97cf-bafbc56c1292" containerID="d0a8df4227f5f6fa828e0611e6d09014903b757fcbb9870fb057bf40b0d666f3" exitCode=0
Apr 16 18:37:21.749131 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:21.748991 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brsw46" event={"ID":"7619f875-db70-4dd2-97cf-bafbc56c1292","Type":"ContainerDied","Data":"d0a8df4227f5f6fa828e0611e6d09014903b757fcbb9870fb057bf40b0d666f3"}
Apr 16 18:37:21.749131 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:21.749025 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brsw46" event={"ID":"7619f875-db70-4dd2-97cf-bafbc56c1292","Type":"ContainerStarted","Data":"53099e5f5acd1f2b763d9048638081f18dfe7f62edf31339ad28945239b802bf"}
Apr 16 18:37:22.754038 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:22.753949 2576 generic.go:358] "Generic (PLEG): container finished" podID="1c32b6f5-868e-4ebb-b788-94950f8b8a85" containerID="91e3d6e9f70a8c8c50fd690d1119f0f912900705ebdaa50401b1ac8ed4160ed8" exitCode=0
Apr 16 18:37:22.754455 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:22.754037 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec884xcql" event={"ID":"1c32b6f5-868e-4ebb-b788-94950f8b8a85","Type":"ContainerDied","Data":"91e3d6e9f70a8c8c50fd690d1119f0f912900705ebdaa50401b1ac8ed4160ed8"}
Apr 16 18:37:22.755966 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:22.755945 2576 generic.go:358] "Generic (PLEG): container finished" podID="c713e6d4-7c7c-4393-acab-1fcb409e0bc4" containerID="39e032e339204db75e17900ae4302081ae2edae9c4863b2f91cbf68f05d4fc2d" exitCode=0
Apr 16 18:37:22.756054 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:22.756026 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c309ss46" event={"ID":"c713e6d4-7c7c-4393-acab-1fcb409e0bc4","Type":"ContainerDied","Data":"39e032e339204db75e17900ae4302081ae2edae9c4863b2f91cbf68f05d4fc2d"}
Apr 16 18:37:22.757551 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:22.757531 2576 generic.go:358] "Generic (PLEG): container finished" podID="7619f875-db70-4dd2-97cf-bafbc56c1292" containerID="1ac7e3afc415e22717a812089eee03c47c1215c554d06b219e7489671867cf20" exitCode=0
Apr 16 18:37:22.757654 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:22.757636 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brsw46" event={"ID":"7619f875-db70-4dd2-97cf-bafbc56c1292","Type":"ContainerDied","Data":"1ac7e3afc415e22717a812089eee03c47c1215c554d06b219e7489671867cf20"}
Apr 16 18:37:22.759351 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:22.759329 2576 generic.go:358] "Generic (PLEG): container finished" podID="d4480696-6297-4c24-8cd4-f87074079f87" containerID="9a3495dc0fa31c109c521d82775c06e6c8da0bc124d89095e2a72a0231e05129" exitCode=0
Apr 16 18:37:22.759454 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:22.759361 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503577lg" event={"ID":"d4480696-6297-4c24-8cd4-f87074079f87","Type":"ContainerDied","Data":"9a3495dc0fa31c109c521d82775c06e6c8da0bc124d89095e2a72a0231e05129"}
Apr 16 18:37:23.765491 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:23.765455 2576 generic.go:358] "Generic (PLEG): container finished" podID="7619f875-db70-4dd2-97cf-bafbc56c1292" containerID="37820fb84b6899f04943ed0521419d002fedc0eee22cce9ce1644183ce805744" exitCode=0
Apr 16 18:37:23.765875 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:23.765539 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brsw46" event={"ID":"7619f875-db70-4dd2-97cf-bafbc56c1292","Type":"ContainerDied","Data":"37820fb84b6899f04943ed0521419d002fedc0eee22cce9ce1644183ce805744"}
Apr 16 18:37:23.767426 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:23.767402 2576 generic.go:358] "Generic (PLEG): container finished" podID="d4480696-6297-4c24-8cd4-f87074079f87" containerID="aa3dcf1cc768a11975de34e36d565c665d149bad42206aa98e5426f1cb6bf6b2" exitCode=0
Apr 16 18:37:23.767527 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:23.767484 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503577lg" event={"ID":"d4480696-6297-4c24-8cd4-f87074079f87","Type":"ContainerDied","Data":"aa3dcf1cc768a11975de34e36d565c665d149bad42206aa98e5426f1cb6bf6b2"}
Apr 16 18:37:23.769208 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:23.769188 2576 generic.go:358] "Generic (PLEG): container finished" podID="1c32b6f5-868e-4ebb-b788-94950f8b8a85" containerID="afe808f01c6c2caf5738b01febbafc8fca2c8cda6c3e617483cc4fd630ab5507" exitCode=0
Apr 16 18:37:23.769310 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:23.769264 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec884xcql" event={"ID":"1c32b6f5-868e-4ebb-b788-94950f8b8a85","Type":"ContainerDied","Data":"afe808f01c6c2caf5738b01febbafc8fca2c8cda6c3e617483cc4fd630ab5507"}
Apr 16 18:37:23.896752 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:23.896720 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c309ss46"
Apr 16 18:37:23.991889 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:23.991854 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c713e6d4-7c7c-4393-acab-1fcb409e0bc4-bundle\") pod \"c713e6d4-7c7c-4393-acab-1fcb409e0bc4\" (UID: \"c713e6d4-7c7c-4393-acab-1fcb409e0bc4\") "
Apr 16 18:37:23.992084 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:23.991949 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c713e6d4-7c7c-4393-acab-1fcb409e0bc4-util\") pod \"c713e6d4-7c7c-4393-acab-1fcb409e0bc4\" (UID: \"c713e6d4-7c7c-4393-acab-1fcb409e0bc4\") "
Apr 16 18:37:23.992084 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:23.992023 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xc4nl\" (UniqueName: \"kubernetes.io/projected/c713e6d4-7c7c-4393-acab-1fcb409e0bc4-kube-api-access-xc4nl\") pod \"c713e6d4-7c7c-4393-acab-1fcb409e0bc4\" (UID: \"c713e6d4-7c7c-4393-acab-1fcb409e0bc4\") "
Apr 16 18:37:23.992464 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:23.992439 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c713e6d4-7c7c-4393-acab-1fcb409e0bc4-bundle" (OuterVolumeSpecName: "bundle") pod "c713e6d4-7c7c-4393-acab-1fcb409e0bc4" (UID: "c713e6d4-7c7c-4393-acab-1fcb409e0bc4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:37:23.994119 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:23.994089 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c713e6d4-7c7c-4393-acab-1fcb409e0bc4-kube-api-access-xc4nl" (OuterVolumeSpecName: "kube-api-access-xc4nl") pod "c713e6d4-7c7c-4393-acab-1fcb409e0bc4" (UID: "c713e6d4-7c7c-4393-acab-1fcb409e0bc4"). InnerVolumeSpecName "kube-api-access-xc4nl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:37:23.999204 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:23.999177 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c713e6d4-7c7c-4393-acab-1fcb409e0bc4-util" (OuterVolumeSpecName: "util") pod "c713e6d4-7c7c-4393-acab-1fcb409e0bc4" (UID: "c713e6d4-7c7c-4393-acab-1fcb409e0bc4"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:37:24.093641 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:24.093555 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c713e6d4-7c7c-4393-acab-1fcb409e0bc4-util\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:37:24.093641 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:24.093586 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xc4nl\" (UniqueName: \"kubernetes.io/projected/c713e6d4-7c7c-4393-acab-1fcb409e0bc4-kube-api-access-xc4nl\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:37:24.093641 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:24.093597 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c713e6d4-7c7c-4393-acab-1fcb409e0bc4-bundle\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:37:24.774371 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:24.774277 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c309ss46"
Apr 16 18:37:24.774371 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:24.774292 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c309ss46" event={"ID":"c713e6d4-7c7c-4393-acab-1fcb409e0bc4","Type":"ContainerDied","Data":"19a4b4cea97abd855bb6be752f3b64f6754d54c5dbe03fe53e794556a751bab5"}
Apr 16 18:37:24.774371 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:24.774343 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19a4b4cea97abd855bb6be752f3b64f6754d54c5dbe03fe53e794556a751bab5"
Apr 16 18:37:24.904861 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:24.904835 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503577lg"
Apr 16 18:37:24.961975 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:24.961945 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brsw46"
Apr 16 18:37:24.964876 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:24.964855 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec884xcql"
Apr 16 18:37:25.000785 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:25.000730 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d4480696-6297-4c24-8cd4-f87074079f87-bundle\") pod \"d4480696-6297-4c24-8cd4-f87074079f87\" (UID: \"d4480696-6297-4c24-8cd4-f87074079f87\") "
Apr 16 18:37:25.000972 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:25.000847 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d4480696-6297-4c24-8cd4-f87074079f87-util\") pod \"d4480696-6297-4c24-8cd4-f87074079f87\" (UID: \"d4480696-6297-4c24-8cd4-f87074079f87\") "
Apr 16 18:37:25.000972 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:25.000917 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8sh2\" (UniqueName: \"kubernetes.io/projected/d4480696-6297-4c24-8cd4-f87074079f87-kube-api-access-p8sh2\") pod \"d4480696-6297-4c24-8cd4-f87074079f87\" (UID: \"d4480696-6297-4c24-8cd4-f87074079f87\") "
Apr 16 18:37:25.001345 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:25.001318 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4480696-6297-4c24-8cd4-f87074079f87-bundle" (OuterVolumeSpecName: "bundle") pod "d4480696-6297-4c24-8cd4-f87074079f87" (UID: "d4480696-6297-4c24-8cd4-f87074079f87"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:37:25.003665 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:25.003231 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4480696-6297-4c24-8cd4-f87074079f87-kube-api-access-p8sh2" (OuterVolumeSpecName: "kube-api-access-p8sh2") pod "d4480696-6297-4c24-8cd4-f87074079f87" (UID: "d4480696-6297-4c24-8cd4-f87074079f87"). InnerVolumeSpecName "kube-api-access-p8sh2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:37:25.007228 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:25.007201 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4480696-6297-4c24-8cd4-f87074079f87-util" (OuterVolumeSpecName: "util") pod "d4480696-6297-4c24-8cd4-f87074079f87" (UID: "d4480696-6297-4c24-8cd4-f87074079f87"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:37:25.102009 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:25.101911 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdxrh\" (UniqueName: \"kubernetes.io/projected/7619f875-db70-4dd2-97cf-bafbc56c1292-kube-api-access-xdxrh\") pod \"7619f875-db70-4dd2-97cf-bafbc56c1292\" (UID: \"7619f875-db70-4dd2-97cf-bafbc56c1292\") "
Apr 16 18:37:25.102009 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:25.101957 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1c32b6f5-868e-4ebb-b788-94950f8b8a85-util\") pod \"1c32b6f5-868e-4ebb-b788-94950f8b8a85\" (UID: \"1c32b6f5-868e-4ebb-b788-94950f8b8a85\") "
Apr 16 18:37:25.102009 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:25.101993 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1c32b6f5-868e-4ebb-b788-94950f8b8a85-bundle\") pod \"1c32b6f5-868e-4ebb-b788-94950f8b8a85\" (UID: \"1c32b6f5-868e-4ebb-b788-94950f8b8a85\") "
Apr 16 18:37:25.102295 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:25.102022 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7619f875-db70-4dd2-97cf-bafbc56c1292-bundle\") pod \"7619f875-db70-4dd2-97cf-bafbc56c1292\" (UID: \"7619f875-db70-4dd2-97cf-bafbc56c1292\") "
Apr 16 18:37:25.102295 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:25.102060 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8lcd\" (UniqueName: \"kubernetes.io/projected/1c32b6f5-868e-4ebb-b788-94950f8b8a85-kube-api-access-d8lcd\") pod \"1c32b6f5-868e-4ebb-b788-94950f8b8a85\" (UID: \"1c32b6f5-868e-4ebb-b788-94950f8b8a85\") "
Apr 16 18:37:25.102295 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:25.102090 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7619f875-db70-4dd2-97cf-bafbc56c1292-util\") pod \"7619f875-db70-4dd2-97cf-bafbc56c1292\" (UID: \"7619f875-db70-4dd2-97cf-bafbc56c1292\") "
Apr 16 18:37:25.102295 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:25.102286 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d4480696-6297-4c24-8cd4-f87074079f87-util\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:37:25.102490 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:25.102305 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p8sh2\" (UniqueName: \"kubernetes.io/projected/d4480696-6297-4c24-8cd4-f87074079f87-kube-api-access-p8sh2\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:37:25.102490 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:25.102322 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d4480696-6297-4c24-8cd4-f87074079f87-bundle\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:37:25.102582 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:25.102526 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7619f875-db70-4dd2-97cf-bafbc56c1292-bundle" (OuterVolumeSpecName: "bundle") pod "7619f875-db70-4dd2-97cf-bafbc56c1292" (UID: "7619f875-db70-4dd2-97cf-bafbc56c1292"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:37:25.102640 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:25.102583 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c32b6f5-868e-4ebb-b788-94950f8b8a85-bundle" (OuterVolumeSpecName: "bundle") pod "1c32b6f5-868e-4ebb-b788-94950f8b8a85" (UID: "1c32b6f5-868e-4ebb-b788-94950f8b8a85"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:37:25.104167 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:25.104134 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7619f875-db70-4dd2-97cf-bafbc56c1292-kube-api-access-xdxrh" (OuterVolumeSpecName: "kube-api-access-xdxrh") pod "7619f875-db70-4dd2-97cf-bafbc56c1292" (UID: "7619f875-db70-4dd2-97cf-bafbc56c1292"). InnerVolumeSpecName "kube-api-access-xdxrh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:37:25.104283 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:25.104224 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c32b6f5-868e-4ebb-b788-94950f8b8a85-kube-api-access-d8lcd" (OuterVolumeSpecName: "kube-api-access-d8lcd") pod "1c32b6f5-868e-4ebb-b788-94950f8b8a85" (UID: "1c32b6f5-868e-4ebb-b788-94950f8b8a85"). InnerVolumeSpecName "kube-api-access-d8lcd".
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:37:25.107781 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:25.107752 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c32b6f5-868e-4ebb-b788-94950f8b8a85-util" (OuterVolumeSpecName: "util") pod "1c32b6f5-868e-4ebb-b788-94950f8b8a85" (UID: "1c32b6f5-868e-4ebb-b788-94950f8b8a85"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:37:25.107907 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:25.107884 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7619f875-db70-4dd2-97cf-bafbc56c1292-util" (OuterVolumeSpecName: "util") pod "7619f875-db70-4dd2-97cf-bafbc56c1292" (UID: "7619f875-db70-4dd2-97cf-bafbc56c1292"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:37:25.203386 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:25.203338 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xdxrh\" (UniqueName: \"kubernetes.io/projected/7619f875-db70-4dd2-97cf-bafbc56c1292-kube-api-access-xdxrh\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:37:25.203386 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:25.203376 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1c32b6f5-868e-4ebb-b788-94950f8b8a85-util\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:37:25.203386 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:25.203386 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1c32b6f5-868e-4ebb-b788-94950f8b8a85-bundle\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:37:25.203386 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:25.203395 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/7619f875-db70-4dd2-97cf-bafbc56c1292-bundle\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:37:25.203650 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:25.203404 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d8lcd\" (UniqueName: \"kubernetes.io/projected/1c32b6f5-868e-4ebb-b788-94950f8b8a85-kube-api-access-d8lcd\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:37:25.203650 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:25.203415 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7619f875-db70-4dd2-97cf-bafbc56c1292-util\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:37:25.780446 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:25.780414 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec884xcql" event={"ID":"1c32b6f5-868e-4ebb-b788-94950f8b8a85","Type":"ContainerDied","Data":"e324ab27e407cd242104356903c9be22d3b96d72665559726578794e0c77e8ef"} Apr 16 18:37:25.780446 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:25.780445 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e324ab27e407cd242104356903c9be22d3b96d72665559726578794e0c77e8ef" Apr 16 18:37:25.780942 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:25.780451 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec884xcql" Apr 16 18:37:25.782417 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:25.782384 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brsw46" event={"ID":"7619f875-db70-4dd2-97cf-bafbc56c1292","Type":"ContainerDied","Data":"53099e5f5acd1f2b763d9048638081f18dfe7f62edf31339ad28945239b802bf"} Apr 16 18:37:25.782581 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:25.782426 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53099e5f5acd1f2b763d9048638081f18dfe7f62edf31339ad28945239b802bf" Apr 16 18:37:25.782581 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:25.782399 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767brsw46" Apr 16 18:37:25.784135 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:25.784106 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503577lg" event={"ID":"d4480696-6297-4c24-8cd4-f87074079f87","Type":"ContainerDied","Data":"ab24326c45811ded69df319b9845baa92afa79edfcbc49f52e8641398a9ddb51"} Apr 16 18:37:25.784267 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:25.784137 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab24326c45811ded69df319b9845baa92afa79edfcbc49f52e8641398a9ddb51" Apr 16 18:37:25.784267 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:25.784155 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503577lg" Apr 16 18:37:30.651063 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:30.651031 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-fqp2f"] Apr 16 18:37:30.651445 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:30.651324 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1c32b6f5-868e-4ebb-b788-94950f8b8a85" containerName="util" Apr 16 18:37:30.651445 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:30.651336 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c32b6f5-868e-4ebb-b788-94950f8b8a85" containerName="util" Apr 16 18:37:30.651445 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:30.651344 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c713e6d4-7c7c-4393-acab-1fcb409e0bc4" containerName="extract" Apr 16 18:37:30.651445 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:30.651350 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c713e6d4-7c7c-4393-acab-1fcb409e0bc4" containerName="extract" Apr 16 18:37:30.651445 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:30.651358 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7619f875-db70-4dd2-97cf-bafbc56c1292" containerName="extract" Apr 16 18:37:30.651445 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:30.651364 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7619f875-db70-4dd2-97cf-bafbc56c1292" containerName="extract" Apr 16 18:37:30.651445 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:30.651373 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d4480696-6297-4c24-8cd4-f87074079f87" containerName="util" Apr 16 18:37:30.651445 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:30.651377 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d4480696-6297-4c24-8cd4-f87074079f87" containerName="util" Apr 16 18:37:30.651445 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:30.651387 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7619f875-db70-4dd2-97cf-bafbc56c1292" containerName="util" Apr 16 18:37:30.651445 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:30.651399 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7619f875-db70-4dd2-97cf-bafbc56c1292" containerName="util" Apr 16 18:37:30.651445 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:30.651406 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1c32b6f5-868e-4ebb-b788-94950f8b8a85" containerName="extract" Apr 16 18:37:30.651445 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:30.651410 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c32b6f5-868e-4ebb-b788-94950f8b8a85" containerName="extract" Apr 16 18:37:30.651445 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:30.651418 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c713e6d4-7c7c-4393-acab-1fcb409e0bc4" containerName="util" Apr 16 18:37:30.651445 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:30.651423 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c713e6d4-7c7c-4393-acab-1fcb409e0bc4" containerName="util" Apr 16 18:37:30.651445 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:30.651431 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c713e6d4-7c7c-4393-acab-1fcb409e0bc4" containerName="pull" Apr 16 18:37:30.651445 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:30.651435 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c713e6d4-7c7c-4393-acab-1fcb409e0bc4" containerName="pull" Apr 16 18:37:30.651445 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:30.651441 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d4480696-6297-4c24-8cd4-f87074079f87" 
containerName="extract" Apr 16 18:37:30.651445 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:30.651445 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4480696-6297-4c24-8cd4-f87074079f87" containerName="extract" Apr 16 18:37:30.651445 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:30.651453 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1c32b6f5-868e-4ebb-b788-94950f8b8a85" containerName="pull" Apr 16 18:37:30.652044 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:30.651457 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c32b6f5-868e-4ebb-b788-94950f8b8a85" containerName="pull" Apr 16 18:37:30.652044 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:30.651463 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7619f875-db70-4dd2-97cf-bafbc56c1292" containerName="pull" Apr 16 18:37:30.652044 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:30.651467 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7619f875-db70-4dd2-97cf-bafbc56c1292" containerName="pull" Apr 16 18:37:30.652044 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:30.651472 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d4480696-6297-4c24-8cd4-f87074079f87" containerName="pull" Apr 16 18:37:30.652044 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:30.651477 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4480696-6297-4c24-8cd4-f87074079f87" containerName="pull" Apr 16 18:37:30.652044 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:30.651529 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="1c32b6f5-868e-4ebb-b788-94950f8b8a85" containerName="extract" Apr 16 18:37:30.652044 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:30.651538 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c713e6d4-7c7c-4393-acab-1fcb409e0bc4" containerName="extract" Apr 16 18:37:30.652044 ip-10-0-140-154 kubenswrapper[2576]: 
I0416 18:37:30.651547 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="7619f875-db70-4dd2-97cf-bafbc56c1292" containerName="extract" Apr 16 18:37:30.652044 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:30.651553 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d4480696-6297-4c24-8cd4-f87074079f87" containerName="extract" Apr 16 18:37:30.660899 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:30.660868 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-fqp2f" Apr 16 18:37:30.666570 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:30.666545 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 18:37:30.666720 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:30.666602 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-nxsc5\"" Apr 16 18:37:30.668052 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:30.668033 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 18:37:30.678276 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:30.678245 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-fqp2f"] Apr 16 18:37:30.745824 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:30.745790 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhwkt\" (UniqueName: \"kubernetes.io/projected/757f884b-57ab-45d5-8be4-13b354df0096-kube-api-access-lhwkt\") pod \"authorino-operator-7587b89b76-fqp2f\" (UID: \"757f884b-57ab-45d5-8be4-13b354df0096\") " pod="kuadrant-system/authorino-operator-7587b89b76-fqp2f" Apr 16 18:37:30.847015 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:30.846983 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lhwkt\" (UniqueName: \"kubernetes.io/projected/757f884b-57ab-45d5-8be4-13b354df0096-kube-api-access-lhwkt\") pod \"authorino-operator-7587b89b76-fqp2f\" (UID: \"757f884b-57ab-45d5-8be4-13b354df0096\") " pod="kuadrant-system/authorino-operator-7587b89b76-fqp2f" Apr 16 18:37:30.865282 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:30.865246 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhwkt\" (UniqueName: \"kubernetes.io/projected/757f884b-57ab-45d5-8be4-13b354df0096-kube-api-access-lhwkt\") pod \"authorino-operator-7587b89b76-fqp2f\" (UID: \"757f884b-57ab-45d5-8be4-13b354df0096\") " pod="kuadrant-system/authorino-operator-7587b89b76-fqp2f" Apr 16 18:37:30.970981 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:30.970945 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-fqp2f" Apr 16 18:37:31.130326 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:31.130269 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-fqp2f"] Apr 16 18:37:31.132369 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:37:31.132341 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod757f884b_57ab_45d5_8be4_13b354df0096.slice/crio-0efe589db70c01e51fb713343c82db3c9e2d48567809e6976e721f697aaa7ded WatchSource:0}: Error finding container 0efe589db70c01e51fb713343c82db3c9e2d48567809e6976e721f697aaa7ded: Status 404 returned error can't find the container with id 0efe589db70c01e51fb713343c82db3c9e2d48567809e6976e721f697aaa7ded Apr 16 18:37:31.807781 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:31.807725 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-fqp2f" 
event={"ID":"757f884b-57ab-45d5-8be4-13b354df0096","Type":"ContainerStarted","Data":"0efe589db70c01e51fb713343c82db3c9e2d48567809e6976e721f697aaa7ded"} Apr 16 18:37:33.816399 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:33.816366 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-fqp2f" event={"ID":"757f884b-57ab-45d5-8be4-13b354df0096","Type":"ContainerStarted","Data":"298677ad3788f151c9122a55df7c83db4cba204869439cfa8225b2ba57e48c00"} Apr 16 18:37:33.816903 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:33.816418 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-7587b89b76-fqp2f" Apr 16 18:37:33.836705 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:33.836654 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-7587b89b76-fqp2f" podStartSLOduration=1.7917055309999999 podStartE2EDuration="3.836636744s" podCreationTimestamp="2026-04-16 18:37:30 +0000 UTC" firstStartedPulling="2026-04-16 18:37:31.134557678 +0000 UTC m=+377.259085319" lastFinishedPulling="2026-04-16 18:37:33.179488875 +0000 UTC m=+379.304016532" observedRunningTime="2026-04-16 18:37:33.836575658 +0000 UTC m=+379.961103317" watchObservedRunningTime="2026-04-16 18:37:33.836636744 +0000 UTC m=+379.961164411" Apr 16 18:37:36.422320 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:36.422284 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-pvggb"] Apr 16 18:37:36.425591 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:36.425575 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-pvggb" Apr 16 18:37:36.429069 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:36.429048 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-t9zpt\"" Apr 16 18:37:36.429157 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:36.429049 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 16 18:37:36.439312 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:36.439278 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-pvggb"] Apr 16 18:37:36.598607 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:36.598556 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdwvc\" (UniqueName: \"kubernetes.io/projected/065cdea5-5e54-4008-b182-d1a8e64b8540-kube-api-access-wdwvc\") pod \"dns-operator-controller-manager-844548ff4c-pvggb\" (UID: \"065cdea5-5e54-4008-b182-d1a8e64b8540\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-pvggb" Apr 16 18:37:36.699525 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:36.699491 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wdwvc\" (UniqueName: \"kubernetes.io/projected/065cdea5-5e54-4008-b182-d1a8e64b8540-kube-api-access-wdwvc\") pod \"dns-operator-controller-manager-844548ff4c-pvggb\" (UID: \"065cdea5-5e54-4008-b182-d1a8e64b8540\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-pvggb" Apr 16 18:37:36.710665 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:36.710642 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdwvc\" (UniqueName: \"kubernetes.io/projected/065cdea5-5e54-4008-b182-d1a8e64b8540-kube-api-access-wdwvc\") pod 
\"dns-operator-controller-manager-844548ff4c-pvggb\" (UID: \"065cdea5-5e54-4008-b182-d1a8e64b8540\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-pvggb" Apr 16 18:37:36.735635 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:36.735602 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-pvggb" Apr 16 18:37:36.869140 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:36.869116 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-pvggb"] Apr 16 18:37:36.871020 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:37:36.870980 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod065cdea5_5e54_4008_b182_d1a8e64b8540.slice/crio-ea0dffb8667d30e9727ec6e51fb4e550a1d20654d27d76e6456ca62f872bc9ee WatchSource:0}: Error finding container ea0dffb8667d30e9727ec6e51fb4e550a1d20654d27d76e6456ca62f872bc9ee: Status 404 returned error can't find the container with id ea0dffb8667d30e9727ec6e51fb4e550a1d20654d27d76e6456ca62f872bc9ee Apr 16 18:37:37.834103 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:37.834066 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-pvggb" event={"ID":"065cdea5-5e54-4008-b182-d1a8e64b8540","Type":"ContainerStarted","Data":"ea0dffb8667d30e9727ec6e51fb4e550a1d20654d27d76e6456ca62f872bc9ee"} Apr 16 18:37:38.729289 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:38.729257 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-q9cqp"] Apr 16 18:37:38.732582 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:38.732559 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-q9cqp" Apr 16 18:37:38.736259 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:38.736236 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-6j5q6\"" Apr 16 18:37:38.744849 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:38.744825 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-q9cqp"] Apr 16 18:37:38.838586 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:38.838552 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-pvggb" event={"ID":"065cdea5-5e54-4008-b182-d1a8e64b8540","Type":"ContainerStarted","Data":"e749a4bf85847bc9f0ab2912d27dc4a338f6ca909a2fd9040385a41a2fc93424"} Apr 16 18:37:38.838987 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:38.838600 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-pvggb" Apr 16 18:37:38.864940 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:38.864878 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-pvggb" podStartSLOduration=1.5321054840000001 podStartE2EDuration="2.86486128s" podCreationTimestamp="2026-04-16 18:37:36 +0000 UTC" firstStartedPulling="2026-04-16 18:37:36.873081887 +0000 UTC m=+382.997609528" lastFinishedPulling="2026-04-16 18:37:38.205837683 +0000 UTC m=+384.330365324" observedRunningTime="2026-04-16 18:37:38.861527667 +0000 UTC m=+384.986055345" watchObservedRunningTime="2026-04-16 18:37:38.86486128 +0000 UTC m=+384.989388943" Apr 16 18:37:38.917863 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:38.917829 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-fvscx\" (UniqueName: \"kubernetes.io/projected/0a59c445-4607-4fc4-ab27-7a8903901fd4-kube-api-access-fvscx\") pod \"limitador-operator-controller-manager-c7fb4c8d5-q9cqp\" (UID: \"0a59c445-4607-4fc4-ab27-7a8903901fd4\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-q9cqp" Apr 16 18:37:39.019248 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:39.019142 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fvscx\" (UniqueName: \"kubernetes.io/projected/0a59c445-4607-4fc4-ab27-7a8903901fd4-kube-api-access-fvscx\") pod \"limitador-operator-controller-manager-c7fb4c8d5-q9cqp\" (UID: \"0a59c445-4607-4fc4-ab27-7a8903901fd4\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-q9cqp" Apr 16 18:37:39.036686 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:39.036649 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvscx\" (UniqueName: \"kubernetes.io/projected/0a59c445-4607-4fc4-ab27-7a8903901fd4-kube-api-access-fvscx\") pod \"limitador-operator-controller-manager-c7fb4c8d5-q9cqp\" (UID: \"0a59c445-4607-4fc4-ab27-7a8903901fd4\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-q9cqp" Apr 16 18:37:39.042428 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:39.042391 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-q9cqp" Apr 16 18:37:39.173600 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:39.173572 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-q9cqp"] Apr 16 18:37:39.175327 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:37:39.175290 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a59c445_4607_4fc4_ab27_7a8903901fd4.slice/crio-93786bdba5040b3ffa4ace859ef6ef5a07b9b2b30606b1f78bb5465202c98e9e WatchSource:0}: Error finding container 93786bdba5040b3ffa4ace859ef6ef5a07b9b2b30606b1f78bb5465202c98e9e: Status 404 returned error can't find the container with id 93786bdba5040b3ffa4ace859ef6ef5a07b9b2b30606b1f78bb5465202c98e9e Apr 16 18:37:39.844504 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:39.844465 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-q9cqp" event={"ID":"0a59c445-4607-4fc4-ab27-7a8903901fd4","Type":"ContainerStarted","Data":"93786bdba5040b3ffa4ace859ef6ef5a07b9b2b30606b1f78bb5465202c98e9e"} Apr 16 18:37:40.849102 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:40.849016 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-q9cqp" event={"ID":"0a59c445-4607-4fc4-ab27-7a8903901fd4","Type":"ContainerStarted","Data":"dc64f2508770b40a6f918c8fb13665b2b2627bfeee19b7038b80f3f5e3b3b462"} Apr 16 18:37:40.849450 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:40.849129 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-q9cqp" Apr 16 18:37:40.869143 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:40.869086 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-q9cqp" podStartSLOduration=1.609391153 podStartE2EDuration="2.869069692s" podCreationTimestamp="2026-04-16 18:37:38 +0000 UTC" firstStartedPulling="2026-04-16 18:37:39.177380886 +0000 UTC m=+385.301908527" lastFinishedPulling="2026-04-16 18:37:40.43705941 +0000 UTC m=+386.561587066" observedRunningTime="2026-04-16 18:37:40.868036521 +0000 UTC m=+386.992564186" watchObservedRunningTime="2026-04-16 18:37:40.869069692 +0000 UTC m=+386.993597354"
Apr 16 18:37:44.822644 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:44.822612 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-7587b89b76-fqp2f"
Apr 16 18:37:49.846520 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:49.846491 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-pvggb"
Apr 16 18:37:51.854778 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:37:51.854732 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-q9cqp"
Apr 16 18:38:24.242476 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:38:24.242438 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-84cp9"]
Apr 16 18:38:24.247442 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:38:24.247424 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-84cp9"
Apr 16 18:38:24.250019 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:38:24.249993 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 16 18:38:24.250162 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:38:24.250057 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-6sw6v\""
Apr 16 18:38:24.254252 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:38:24.254225 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-84cp9"]
Apr 16 18:38:24.283148 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:38:24.283112 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-84cp9"]
Apr 16 18:38:24.389389 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:38:24.389356 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/af2ed6e9-5dfb-40b4-a482-f542057abd67-config-file\") pod \"limitador-limitador-67566c68b4-84cp9\" (UID: \"af2ed6e9-5dfb-40b4-a482-f542057abd67\") " pod="kuadrant-system/limitador-limitador-67566c68b4-84cp9"
Apr 16 18:38:24.389589 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:38:24.389414 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krf5v\" (UniqueName: \"kubernetes.io/projected/af2ed6e9-5dfb-40b4-a482-f542057abd67-kube-api-access-krf5v\") pod \"limitador-limitador-67566c68b4-84cp9\" (UID: \"af2ed6e9-5dfb-40b4-a482-f542057abd67\") " pod="kuadrant-system/limitador-limitador-67566c68b4-84cp9"
Apr 16 18:38:24.490270 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:38:24.490232 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/af2ed6e9-5dfb-40b4-a482-f542057abd67-config-file\") pod \"limitador-limitador-67566c68b4-84cp9\" (UID: \"af2ed6e9-5dfb-40b4-a482-f542057abd67\") " pod="kuadrant-system/limitador-limitador-67566c68b4-84cp9"
Apr 16 18:38:24.490460 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:38:24.490284 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-krf5v\" (UniqueName: \"kubernetes.io/projected/af2ed6e9-5dfb-40b4-a482-f542057abd67-kube-api-access-krf5v\") pod \"limitador-limitador-67566c68b4-84cp9\" (UID: \"af2ed6e9-5dfb-40b4-a482-f542057abd67\") " pod="kuadrant-system/limitador-limitador-67566c68b4-84cp9"
Apr 16 18:38:24.490879 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:38:24.490859 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/af2ed6e9-5dfb-40b4-a482-f542057abd67-config-file\") pod \"limitador-limitador-67566c68b4-84cp9\" (UID: \"af2ed6e9-5dfb-40b4-a482-f542057abd67\") " pod="kuadrant-system/limitador-limitador-67566c68b4-84cp9"
Apr 16 18:38:24.500550 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:38:24.500467 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-krf5v\" (UniqueName: \"kubernetes.io/projected/af2ed6e9-5dfb-40b4-a482-f542057abd67-kube-api-access-krf5v\") pod \"limitador-limitador-67566c68b4-84cp9\" (UID: \"af2ed6e9-5dfb-40b4-a482-f542057abd67\") " pod="kuadrant-system/limitador-limitador-67566c68b4-84cp9"
Apr 16 18:38:24.558733 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:38:24.558697 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-84cp9"
Apr 16 18:38:24.709034 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:38:24.709006 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-84cp9"]
Apr 16 18:38:24.711129 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:38:24.711084 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf2ed6e9_5dfb_40b4_a482_f542057abd67.slice/crio-a4ec2a12b839de64fa18653faeac9d358fef45c389566f9396662223ee48d0bf WatchSource:0}: Error finding container a4ec2a12b839de64fa18653faeac9d358fef45c389566f9396662223ee48d0bf: Status 404 returned error can't find the container with id a4ec2a12b839de64fa18653faeac9d358fef45c389566f9396662223ee48d0bf
Apr 16 18:38:25.019217 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:38:25.019185 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-84cp9" event={"ID":"af2ed6e9-5dfb-40b4-a482-f542057abd67","Type":"ContainerStarted","Data":"a4ec2a12b839de64fa18653faeac9d358fef45c389566f9396662223ee48d0bf"}
Apr 16 18:38:29.037514 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:38:29.037477 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-84cp9" event={"ID":"af2ed6e9-5dfb-40b4-a482-f542057abd67","Type":"ContainerStarted","Data":"475b78cc4978332caf72d8cf921ec2866dde2d729de60234235eb93ea3245af4"}
Apr 16 18:38:29.037909 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:38:29.037589 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-67566c68b4-84cp9"
Apr 16 18:38:29.057084 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:38:29.057032 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-67566c68b4-84cp9" podStartSLOduration=1.469500684 podStartE2EDuration="5.057014221s" podCreationTimestamp="2026-04-16 18:38:24 +0000 UTC" firstStartedPulling="2026-04-16 18:38:24.713366486 +0000 UTC m=+430.837894128" lastFinishedPulling="2026-04-16 18:38:28.300880021 +0000 UTC m=+434.425407665" observedRunningTime="2026-04-16 18:38:29.055320255 +0000 UTC m=+435.179847919" watchObservedRunningTime="2026-04-16 18:38:29.057014221 +0000 UTC m=+435.181541884"
Apr 16 18:38:40.042149 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:38:40.042058 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-67566c68b4-84cp9"
Apr 16 18:39:04.276900 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:04.276860 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-87vdn"]
Apr 16 18:39:04.278950 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:04.278716 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-87vdn" podUID="5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6" containerName="discovery" containerID="cri-o://dd4dece8118e846c81dbe9c98e047117c8adfc7184f3091ebbb8e4a83ba54d1a" gracePeriod=30
Apr 16 18:39:04.528318 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:04.528242 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-87vdn"
Apr 16 18:39:04.619548 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:04.619515 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6-istio-token\") pod \"5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6\" (UID: \"5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6\") "
Apr 16 18:39:04.619548 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:04.619564 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6-istio-kubeconfig\") pod \"5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6\" (UID: \"5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6\") "
Apr 16 18:39:04.619825 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:04.619586 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6-istio-csr-dns-cert\") pod \"5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6\" (UID: \"5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6\") "
Apr 16 18:39:04.619825 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:04.619619 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zv6pl\" (UniqueName: \"kubernetes.io/projected/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6-kube-api-access-zv6pl\") pod \"5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6\" (UID: \"5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6\") "
Apr 16 18:39:04.619825 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:04.619684 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6-istio-csr-ca-configmap\") pod \"5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6\" (UID: \"5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6\") "
Apr 16 18:39:04.619825 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:04.619711 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6-cacerts\") pod \"5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6\" (UID: \"5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6\") "
Apr 16 18:39:04.619825 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:04.619768 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6-local-certs\") pod \"5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6\" (UID: \"5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6\") "
Apr 16 18:39:04.620201 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:04.620134 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6-istio-csr-ca-configmap" (OuterVolumeSpecName: "istio-csr-ca-configmap") pod "5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6" (UID: "5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6"). InnerVolumeSpecName "istio-csr-ca-configmap". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:39:04.622521 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:04.622439 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6-cacerts" (OuterVolumeSpecName: "cacerts") pod "5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6" (UID: "5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6"). InnerVolumeSpecName "cacerts". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:39:04.622521 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:04.622461 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6-kube-api-access-zv6pl" (OuterVolumeSpecName: "kube-api-access-zv6pl") pod "5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6" (UID: "5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6"). InnerVolumeSpecName "kube-api-access-zv6pl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:39:04.622521 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:04.622478 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6-istio-token" (OuterVolumeSpecName: "istio-token") pod "5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6" (UID: "5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6"). InnerVolumeSpecName "istio-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:39:04.622779 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:04.622536 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6-local-certs" (OuterVolumeSpecName: "local-certs") pod "5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6" (UID: "5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6"). InnerVolumeSpecName "local-certs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:39:04.622779 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:04.622564 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6-istio-csr-dns-cert" (OuterVolumeSpecName: "istio-csr-dns-cert") pod "5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6" (UID: "5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6"). InnerVolumeSpecName "istio-csr-dns-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:39:04.623094 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:04.623074 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6-istio-kubeconfig" (OuterVolumeSpecName: "istio-kubeconfig") pod "5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6" (UID: "5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6"). InnerVolumeSpecName "istio-kubeconfig". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:39:04.721226 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:04.721189 2576 reconciler_common.go:299] "Volume detached for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6-istio-csr-ca-configmap\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:39:04.721226 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:04.721220 2576 reconciler_common.go:299] "Volume detached for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6-cacerts\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:39:04.721226 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:04.721231 2576 reconciler_common.go:299] "Volume detached for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6-local-certs\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:39:04.721465 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:04.721240 2576 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6-istio-token\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:39:04.721465 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:04.721249 2576 reconciler_common.go:299] "Volume detached for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6-istio-kubeconfig\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:39:04.721465 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:04.721256 2576 reconciler_common.go:299] "Volume detached for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6-istio-csr-dns-cert\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:39:04.721465 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:04.721265 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zv6pl\" (UniqueName: \"kubernetes.io/projected/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6-kube-api-access-zv6pl\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:39:05.176528 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:05.176492 2576 generic.go:358] "Generic (PLEG): container finished" podID="5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6" containerID="dd4dece8118e846c81dbe9c98e047117c8adfc7184f3091ebbb8e4a83ba54d1a" exitCode=0
Apr 16 18:39:05.176714 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:05.176561 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-87vdn" event={"ID":"5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6","Type":"ContainerDied","Data":"dd4dece8118e846c81dbe9c98e047117c8adfc7184f3091ebbb8e4a83ba54d1a"}
Apr 16 18:39:05.176714 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:05.176562 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-87vdn"
Apr 16 18:39:05.176714 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:05.176598 2576 scope.go:117] "RemoveContainer" containerID="dd4dece8118e846c81dbe9c98e047117c8adfc7184f3091ebbb8e4a83ba54d1a"
Apr 16 18:39:05.176714 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:05.176588 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-87vdn" event={"ID":"5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6","Type":"ContainerDied","Data":"e8b0c33f4c0af4632509304b49538151e66a91e408d55466ed480a27c7096880"}
Apr 16 18:39:05.185539 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:05.185520 2576 scope.go:117] "RemoveContainer" containerID="dd4dece8118e846c81dbe9c98e047117c8adfc7184f3091ebbb8e4a83ba54d1a"
Apr 16 18:39:05.185900 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:39:05.185875 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd4dece8118e846c81dbe9c98e047117c8adfc7184f3091ebbb8e4a83ba54d1a\": container with ID starting with dd4dece8118e846c81dbe9c98e047117c8adfc7184f3091ebbb8e4a83ba54d1a not found: ID does not exist" containerID="dd4dece8118e846c81dbe9c98e047117c8adfc7184f3091ebbb8e4a83ba54d1a"
Apr 16 18:39:05.185981 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:05.185913 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd4dece8118e846c81dbe9c98e047117c8adfc7184f3091ebbb8e4a83ba54d1a"} err="failed to get container status \"dd4dece8118e846c81dbe9c98e047117c8adfc7184f3091ebbb8e4a83ba54d1a\": rpc error: code = NotFound desc = could not find container \"dd4dece8118e846c81dbe9c98e047117c8adfc7184f3091ebbb8e4a83ba54d1a\": container with ID starting with dd4dece8118e846c81dbe9c98e047117c8adfc7184f3091ebbb8e4a83ba54d1a not found: ID does not exist"
Apr 16 18:39:05.212455 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:05.212419 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-87vdn"]
Apr 16 18:39:05.218821 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:05.218793 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-87vdn"]
Apr 16 18:39:06.505578 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:06.505536 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6" path="/var/lib/kubelet/pods/5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6/volumes"
Apr 16 18:39:10.919623 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:10.919588 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-7c68cb4fc8-2rzmf"]
Apr 16 18:39:10.920105 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:10.919943 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6" containerName="discovery"
Apr 16 18:39:10.920105 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:10.919955 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6" containerName="discovery"
Apr 16 18:39:10.920105 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:10.920007 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="5ec70e0b-ff33-4453-b5eb-a3a49a9c6aa6" containerName="discovery"
Apr 16 18:39:10.923030 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:10.922995 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7c68cb4fc8-2rzmf"
Apr 16 18:39:10.926680 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:10.926657 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\""
Apr 16 18:39:10.926680 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:10.926665 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 16 18:39:10.927570 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:10.927554 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-5gclw\""
Apr 16 18:39:10.927620 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:10.927571 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 16 18:39:10.932352 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:10.932327 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7c68cb4fc8-2rzmf"]
Apr 16 18:39:10.942186 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:10.942160 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-fc44f49f-xtln4"]
Apr 16 18:39:10.944845 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:10.944820 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-fc44f49f-xtln4"
Apr 16 18:39:10.947832 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:10.947807 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\""
Apr 16 18:39:10.947980 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:10.947961 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-zlbts\""
Apr 16 18:39:10.957653 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:10.957621 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-fc44f49f-xtln4"]
Apr 16 18:39:10.966670 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:10.966639 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-mh7bj"]
Apr 16 18:39:10.968960 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:10.968943 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-mh7bj"
Apr 16 18:39:10.971945 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:10.971913 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtpz2\" (UniqueName: \"kubernetes.io/projected/4504c8c4-c68b-49e0-a8e0-67d7445674f9-kube-api-access-gtpz2\") pod \"llmisvc-controller-manager-fc44f49f-xtln4\" (UID: \"4504c8c4-c68b-49e0-a8e0-67d7445674f9\") " pod="kserve/llmisvc-controller-manager-fc44f49f-xtln4"
Apr 16 18:39:10.972067 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:10.971980 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aec32691-f3c1-4576-9dce-c2546809a15a-cert\") pod \"kserve-controller-manager-7c68cb4fc8-2rzmf\" (UID: \"aec32691-f3c1-4576-9dce-c2546809a15a\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-2rzmf"
Apr 16 18:39:10.972145 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:10.972099 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt4d7\" (UniqueName: \"kubernetes.io/projected/aec32691-f3c1-4576-9dce-c2546809a15a-kube-api-access-lt4d7\") pod \"kserve-controller-manager-7c68cb4fc8-2rzmf\" (UID: \"aec32691-f3c1-4576-9dce-c2546809a15a\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-2rzmf"
Apr 16 18:39:10.972204 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:10.972168 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4504c8c4-c68b-49e0-a8e0-67d7445674f9-cert\") pod \"llmisvc-controller-manager-fc44f49f-xtln4\" (UID: \"4504c8c4-c68b-49e0-a8e0-67d7445674f9\") " pod="kserve/llmisvc-controller-manager-fc44f49f-xtln4"
Apr 16 18:39:10.972436 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:10.972419 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 16 18:39:10.972501 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:10.972467 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-qb6xk\""
Apr 16 18:39:10.983701 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:10.983667 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-mh7bj"]
Apr 16 18:39:11.073352 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:11.073291 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aec32691-f3c1-4576-9dce-c2546809a15a-cert\") pod \"kserve-controller-manager-7c68cb4fc8-2rzmf\" (UID: \"aec32691-f3c1-4576-9dce-c2546809a15a\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-2rzmf"
Apr 16 18:39:11.073352 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:11.073355 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/45d14e57-1e63-45f6-aadc-bb15ad4ff226-data\") pod \"seaweedfs-86cc847c5c-mh7bj\" (UID: \"45d14e57-1e63-45f6-aadc-bb15ad4ff226\") " pod="kserve/seaweedfs-86cc847c5c-mh7bj"
Apr 16 18:39:11.073591 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:11.073449 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lt4d7\" (UniqueName: \"kubernetes.io/projected/aec32691-f3c1-4576-9dce-c2546809a15a-kube-api-access-lt4d7\") pod \"kserve-controller-manager-7c68cb4fc8-2rzmf\" (UID: \"aec32691-f3c1-4576-9dce-c2546809a15a\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-2rzmf"
Apr 16 18:39:11.073591 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:11.073506 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69db4\" (UniqueName: \"kubernetes.io/projected/45d14e57-1e63-45f6-aadc-bb15ad4ff226-kube-api-access-69db4\") pod \"seaweedfs-86cc847c5c-mh7bj\" (UID: \"45d14e57-1e63-45f6-aadc-bb15ad4ff226\") " pod="kserve/seaweedfs-86cc847c5c-mh7bj"
Apr 16 18:39:11.073591 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:11.073528 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4504c8c4-c68b-49e0-a8e0-67d7445674f9-cert\") pod \"llmisvc-controller-manager-fc44f49f-xtln4\" (UID: \"4504c8c4-c68b-49e0-a8e0-67d7445674f9\") " pod="kserve/llmisvc-controller-manager-fc44f49f-xtln4"
Apr 16 18:39:11.073591 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:11.073555 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gtpz2\" (UniqueName: \"kubernetes.io/projected/4504c8c4-c68b-49e0-a8e0-67d7445674f9-kube-api-access-gtpz2\") pod \"llmisvc-controller-manager-fc44f49f-xtln4\" (UID: \"4504c8c4-c68b-49e0-a8e0-67d7445674f9\") " pod="kserve/llmisvc-controller-manager-fc44f49f-xtln4"
Apr 16 18:39:11.073736 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:39:11.073646 2576 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found
Apr 16 18:39:11.073736 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:39:11.073711 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4504c8c4-c68b-49e0-a8e0-67d7445674f9-cert podName:4504c8c4-c68b-49e0-a8e0-67d7445674f9 nodeName:}" failed. No retries permitted until 2026-04-16 18:39:11.573693768 +0000 UTC m=+477.698221409 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4504c8c4-c68b-49e0-a8e0-67d7445674f9-cert") pod "llmisvc-controller-manager-fc44f49f-xtln4" (UID: "4504c8c4-c68b-49e0-a8e0-67d7445674f9") : secret "llmisvc-webhook-server-cert" not found
Apr 16 18:39:11.075816 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:11.075794 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aec32691-f3c1-4576-9dce-c2546809a15a-cert\") pod \"kserve-controller-manager-7c68cb4fc8-2rzmf\" (UID: \"aec32691-f3c1-4576-9dce-c2546809a15a\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-2rzmf"
Apr 16 18:39:11.085477 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:11.085442 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt4d7\" (UniqueName: \"kubernetes.io/projected/aec32691-f3c1-4576-9dce-c2546809a15a-kube-api-access-lt4d7\") pod \"kserve-controller-manager-7c68cb4fc8-2rzmf\" (UID: \"aec32691-f3c1-4576-9dce-c2546809a15a\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-2rzmf"
Apr 16 18:39:11.085612 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:11.085563 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtpz2\" (UniqueName: \"kubernetes.io/projected/4504c8c4-c68b-49e0-a8e0-67d7445674f9-kube-api-access-gtpz2\") pod \"llmisvc-controller-manager-fc44f49f-xtln4\" (UID: \"4504c8c4-c68b-49e0-a8e0-67d7445674f9\") " pod="kserve/llmisvc-controller-manager-fc44f49f-xtln4"
Apr 16 18:39:11.174349 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:11.174248 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/45d14e57-1e63-45f6-aadc-bb15ad4ff226-data\") pod \"seaweedfs-86cc847c5c-mh7bj\" (UID: \"45d14e57-1e63-45f6-aadc-bb15ad4ff226\") " pod="kserve/seaweedfs-86cc847c5c-mh7bj"
Apr 16 18:39:11.174349 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:11.174346 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-69db4\" (UniqueName: \"kubernetes.io/projected/45d14e57-1e63-45f6-aadc-bb15ad4ff226-kube-api-access-69db4\") pod \"seaweedfs-86cc847c5c-mh7bj\" (UID: \"45d14e57-1e63-45f6-aadc-bb15ad4ff226\") " pod="kserve/seaweedfs-86cc847c5c-mh7bj"
Apr 16 18:39:11.174664 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:11.174640 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/45d14e57-1e63-45f6-aadc-bb15ad4ff226-data\") pod \"seaweedfs-86cc847c5c-mh7bj\" (UID: \"45d14e57-1e63-45f6-aadc-bb15ad4ff226\") " pod="kserve/seaweedfs-86cc847c5c-mh7bj"
Apr 16 18:39:11.183411 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:11.183385 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-69db4\" (UniqueName: \"kubernetes.io/projected/45d14e57-1e63-45f6-aadc-bb15ad4ff226-kube-api-access-69db4\") pod \"seaweedfs-86cc847c5c-mh7bj\" (UID: \"45d14e57-1e63-45f6-aadc-bb15ad4ff226\") " pod="kserve/seaweedfs-86cc847c5c-mh7bj"
Apr 16 18:39:11.236038 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:11.235993 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7c68cb4fc8-2rzmf"
Apr 16 18:39:11.281357 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:11.281326 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-mh7bj"
Apr 16 18:39:11.370919 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:11.370881 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7c68cb4fc8-2rzmf"]
Apr 16 18:39:11.373522 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:39:11.373485 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaec32691_f3c1_4576_9dce_c2546809a15a.slice/crio-6d32718788ec5ac7ecf8af192efe028e2b598421f89742f23edd5bc00e30f754 WatchSource:0}: Error finding container 6d32718788ec5ac7ecf8af192efe028e2b598421f89742f23edd5bc00e30f754: Status 404 returned error can't find the container with id 6d32718788ec5ac7ecf8af192efe028e2b598421f89742f23edd5bc00e30f754
Apr 16 18:39:11.417598 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:11.417574 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-mh7bj"]
Apr 16 18:39:11.419114 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:39:11.419090 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45d14e57_1e63_45f6_aadc_bb15ad4ff226.slice/crio-aa18b52338dec7c68e7f73fafb7ad446c9df4705b5110f6382a311c842cd7655 WatchSource:0}: Error finding container aa18b52338dec7c68e7f73fafb7ad446c9df4705b5110f6382a311c842cd7655: Status 404 returned error can't find the container with id aa18b52338dec7c68e7f73fafb7ad446c9df4705b5110f6382a311c842cd7655
Apr 16 18:39:11.579348 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:11.579305 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4504c8c4-c68b-49e0-a8e0-67d7445674f9-cert\") pod \"llmisvc-controller-manager-fc44f49f-xtln4\" (UID: \"4504c8c4-c68b-49e0-a8e0-67d7445674f9\") " pod="kserve/llmisvc-controller-manager-fc44f49f-xtln4"
Apr 16 18:39:11.581808 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:11.581788 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4504c8c4-c68b-49e0-a8e0-67d7445674f9-cert\") pod \"llmisvc-controller-manager-fc44f49f-xtln4\" (UID: \"4504c8c4-c68b-49e0-a8e0-67d7445674f9\") " pod="kserve/llmisvc-controller-manager-fc44f49f-xtln4"
Apr 16 18:39:11.854274 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:11.854191 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-fc44f49f-xtln4"
Apr 16 18:39:12.031731 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:12.031701 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-fc44f49f-xtln4"]
Apr 16 18:39:12.034406 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:39:12.034365 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4504c8c4_c68b_49e0_a8e0_67d7445674f9.slice/crio-73ade4c7872643f0be573814de78eb0f832c107339dac707d2878d48ea3ac98c WatchSource:0}: Error finding container 73ade4c7872643f0be573814de78eb0f832c107339dac707d2878d48ea3ac98c: Status 404 returned error can't find the container with id 73ade4c7872643f0be573814de78eb0f832c107339dac707d2878d48ea3ac98c
Apr 16 18:39:12.210677 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:12.210632 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-mh7bj" event={"ID":"45d14e57-1e63-45f6-aadc-bb15ad4ff226","Type":"ContainerStarted","Data":"aa18b52338dec7c68e7f73fafb7ad446c9df4705b5110f6382a311c842cd7655"}
Apr 16 18:39:12.213213 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:12.213180 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-fc44f49f-xtln4" event={"ID":"4504c8c4-c68b-49e0-a8e0-67d7445674f9","Type":"ContainerStarted","Data":"73ade4c7872643f0be573814de78eb0f832c107339dac707d2878d48ea3ac98c"}
Apr 16 18:39:12.214638 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:12.214604 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7c68cb4fc8-2rzmf" event={"ID":"aec32691-f3c1-4576-9dce-c2546809a15a","Type":"ContainerStarted","Data":"6d32718788ec5ac7ecf8af192efe028e2b598421f89742f23edd5bc00e30f754"}
Apr 16 18:39:15.570132 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:15.570103 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 16 18:39:16.232533 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:16.232490 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-mh7bj" event={"ID":"45d14e57-1e63-45f6-aadc-bb15ad4ff226","Type":"ContainerStarted","Data":"28e3407761402300e380cf57ae4c096068f65740e203fca159c8e57922329d59"}
Apr 16 18:39:16.232704 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:16.232569 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-mh7bj"
Apr 16 18:39:16.234011 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:16.233989 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7c68cb4fc8-2rzmf" event={"ID":"aec32691-f3c1-4576-9dce-c2546809a15a","Type":"ContainerStarted","Data":"fb2890663a147cb15db2775683b45adf11f9fc0c19385a88be535fd8081153c4"}
Apr 16 18:39:16.234126 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:16.234102 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-7c68cb4fc8-2rzmf"
Apr 16 18:39:16.251583 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:16.251520 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-mh7bj" podStartSLOduration=2.104505099 podStartE2EDuration="6.251504082s" podCreationTimestamp="2026-04-16 18:39:10 +0000 UTC" firstStartedPulling="2026-04-16 18:39:11.420366789 +0000 UTC m=+477.544894429" lastFinishedPulling="2026-04-16 18:39:15.567365769 +0000 UTC m=+481.691893412" observedRunningTime="2026-04-16 18:39:16.250050554 +0000 UTC m=+482.374578219" watchObservedRunningTime="2026-04-16 18:39:16.251504082 +0000 UTC m=+482.376031744"
Apr 16 18:39:16.268587 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:16.268535 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-7c68cb4fc8-2rzmf" podStartSLOduration=2.622790758 podStartE2EDuration="6.268506518s" podCreationTimestamp="2026-04-16 18:39:10 +0000 UTC" firstStartedPulling="2026-04-16 18:39:11.37481371 +0000 UTC m=+477.499341353" lastFinishedPulling="2026-04-16 18:39:15.020529472 +0000 UTC m=+481.145057113" observedRunningTime="2026-04-16 18:39:16.267716846 +0000 UTC m=+482.392244509" watchObservedRunningTime="2026-04-16 18:39:16.268506518 +0000 UTC m=+482.393034180"
Apr 16 18:39:18.245766 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:18.245711 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-fc44f49f-xtln4" event={"ID":"4504c8c4-c68b-49e0-a8e0-67d7445674f9","Type":"ContainerStarted","Data":"bb464e48ad053c98756474189b99f3f835a517d79b0fdfd02f7d30ffb79e3bf9"}
Apr 16 18:39:18.246123 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:18.245880 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-fc44f49f-xtln4"
Apr 16 18:39:18.265933 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:18.265882 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-fc44f49f-xtln4" podStartSLOduration=2.908530188 podStartE2EDuration="8.265865817s" podCreationTimestamp="2026-04-16 18:39:10 +0000 UTC" firstStartedPulling="2026-04-16 18:39:12.036163767 +0000 UTC m=+478.160691414" lastFinishedPulling="2026-04-16 18:39:17.393499402 +0000 UTC m=+483.518027043" observedRunningTime="2026-04-16
18:39:18.263617629 +0000 UTC m=+484.388145292" watchObservedRunningTime="2026-04-16 18:39:18.265865817 +0000 UTC m=+484.390393480" Apr 16 18:39:22.240924 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:22.240894 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-mh7bj" Apr 16 18:39:47.243965 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:47.243935 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-7c68cb4fc8-2rzmf" Apr 16 18:39:49.251773 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:49.251713 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-fc44f49f-xtln4" Apr 16 18:39:50.682147 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:50.682104 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-7c68cb4fc8-2rzmf"] Apr 16 18:39:50.682559 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:50.682335 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-7c68cb4fc8-2rzmf" podUID="aec32691-f3c1-4576-9dce-c2546809a15a" containerName="manager" containerID="cri-o://fb2890663a147cb15db2775683b45adf11f9fc0c19385a88be535fd8081153c4" gracePeriod=10 Apr 16 18:39:50.718002 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:50.717968 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-7c68cb4fc8-9b8lj"] Apr 16 18:39:50.720926 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:50.720909 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7c68cb4fc8-9b8lj" Apr 16 18:39:50.733510 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:50.733477 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7c68cb4fc8-9b8lj"] Apr 16 18:39:50.815334 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:50.815293 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p8k8\" (UniqueName: \"kubernetes.io/projected/d0c3a4db-0a29-4ca7-bb64-1a36ca5b4b04-kube-api-access-4p8k8\") pod \"kserve-controller-manager-7c68cb4fc8-9b8lj\" (UID: \"d0c3a4db-0a29-4ca7-bb64-1a36ca5b4b04\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-9b8lj" Apr 16 18:39:50.815511 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:50.815419 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d0c3a4db-0a29-4ca7-bb64-1a36ca5b4b04-cert\") pod \"kserve-controller-manager-7c68cb4fc8-9b8lj\" (UID: \"d0c3a4db-0a29-4ca7-bb64-1a36ca5b4b04\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-9b8lj" Apr 16 18:39:50.915826 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:50.915735 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d0c3a4db-0a29-4ca7-bb64-1a36ca5b4b04-cert\") pod \"kserve-controller-manager-7c68cb4fc8-9b8lj\" (UID: \"d0c3a4db-0a29-4ca7-bb64-1a36ca5b4b04\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-9b8lj" Apr 16 18:39:50.915997 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:50.915899 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4p8k8\" (UniqueName: \"kubernetes.io/projected/d0c3a4db-0a29-4ca7-bb64-1a36ca5b4b04-kube-api-access-4p8k8\") pod \"kserve-controller-manager-7c68cb4fc8-9b8lj\" (UID: \"d0c3a4db-0a29-4ca7-bb64-1a36ca5b4b04\") " 
pod="kserve/kserve-controller-manager-7c68cb4fc8-9b8lj" Apr 16 18:39:50.917077 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:50.917059 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7c68cb4fc8-2rzmf" Apr 16 18:39:50.918241 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:50.918222 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d0c3a4db-0a29-4ca7-bb64-1a36ca5b4b04-cert\") pod \"kserve-controller-manager-7c68cb4fc8-9b8lj\" (UID: \"d0c3a4db-0a29-4ca7-bb64-1a36ca5b4b04\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-9b8lj" Apr 16 18:39:50.933149 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:50.933082 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p8k8\" (UniqueName: \"kubernetes.io/projected/d0c3a4db-0a29-4ca7-bb64-1a36ca5b4b04-kube-api-access-4p8k8\") pod \"kserve-controller-manager-7c68cb4fc8-9b8lj\" (UID: \"d0c3a4db-0a29-4ca7-bb64-1a36ca5b4b04\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-9b8lj" Apr 16 18:39:51.016958 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:51.016922 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lt4d7\" (UniqueName: \"kubernetes.io/projected/aec32691-f3c1-4576-9dce-c2546809a15a-kube-api-access-lt4d7\") pod \"aec32691-f3c1-4576-9dce-c2546809a15a\" (UID: \"aec32691-f3c1-4576-9dce-c2546809a15a\") " Apr 16 18:39:51.017122 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:51.017009 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aec32691-f3c1-4576-9dce-c2546809a15a-cert\") pod \"aec32691-f3c1-4576-9dce-c2546809a15a\" (UID: \"aec32691-f3c1-4576-9dce-c2546809a15a\") " Apr 16 18:39:51.019077 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:51.019044 2576 operation_generator.go:781] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/aec32691-f3c1-4576-9dce-c2546809a15a-kube-api-access-lt4d7" (OuterVolumeSpecName: "kube-api-access-lt4d7") pod "aec32691-f3c1-4576-9dce-c2546809a15a" (UID: "aec32691-f3c1-4576-9dce-c2546809a15a"). InnerVolumeSpecName "kube-api-access-lt4d7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:39:51.019077 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:51.019048 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aec32691-f3c1-4576-9dce-c2546809a15a-cert" (OuterVolumeSpecName: "cert") pod "aec32691-f3c1-4576-9dce-c2546809a15a" (UID: "aec32691-f3c1-4576-9dce-c2546809a15a"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:39:51.075322 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:51.075285 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7c68cb4fc8-9b8lj" Apr 16 18:39:51.118250 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:51.118210 2576 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aec32691-f3c1-4576-9dce-c2546809a15a-cert\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:39:51.118250 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:51.118243 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lt4d7\" (UniqueName: \"kubernetes.io/projected/aec32691-f3c1-4576-9dce-c2546809a15a-kube-api-access-lt4d7\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:39:51.204080 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:51.204050 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7c68cb4fc8-9b8lj"] Apr 16 18:39:51.205725 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:39:51.205698 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0c3a4db_0a29_4ca7_bb64_1a36ca5b4b04.slice/crio-291de221652931a7d84d6a317422cf48b135f692ba2db323a6b169dffe27be9b WatchSource:0}: Error finding container 291de221652931a7d84d6a317422cf48b135f692ba2db323a6b169dffe27be9b: Status 404 returned error can't find the container with id 291de221652931a7d84d6a317422cf48b135f692ba2db323a6b169dffe27be9b Apr 16 18:39:51.365776 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:51.365713 2576 generic.go:358] "Generic (PLEG): container finished" podID="aec32691-f3c1-4576-9dce-c2546809a15a" containerID="fb2890663a147cb15db2775683b45adf11f9fc0c19385a88be535fd8081153c4" exitCode=0 Apr 16 18:39:51.365957 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:51.365780 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7c68cb4fc8-2rzmf" event={"ID":"aec32691-f3c1-4576-9dce-c2546809a15a","Type":"ContainerDied","Data":"fb2890663a147cb15db2775683b45adf11f9fc0c19385a88be535fd8081153c4"} Apr 16 18:39:51.365957 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:51.365825 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7c68cb4fc8-2rzmf" event={"ID":"aec32691-f3c1-4576-9dce-c2546809a15a","Type":"ContainerDied","Data":"6d32718788ec5ac7ecf8af192efe028e2b598421f89742f23edd5bc00e30f754"} Apr 16 18:39:51.365957 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:51.365833 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7c68cb4fc8-2rzmf" Apr 16 18:39:51.365957 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:51.365840 2576 scope.go:117] "RemoveContainer" containerID="fb2890663a147cb15db2775683b45adf11f9fc0c19385a88be535fd8081153c4" Apr 16 18:39:51.367229 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:51.367202 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7c68cb4fc8-9b8lj" event={"ID":"d0c3a4db-0a29-4ca7-bb64-1a36ca5b4b04","Type":"ContainerStarted","Data":"291de221652931a7d84d6a317422cf48b135f692ba2db323a6b169dffe27be9b"} Apr 16 18:39:51.374685 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:51.374663 2576 scope.go:117] "RemoveContainer" containerID="fb2890663a147cb15db2775683b45adf11f9fc0c19385a88be535fd8081153c4" Apr 16 18:39:51.374993 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:39:51.374971 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb2890663a147cb15db2775683b45adf11f9fc0c19385a88be535fd8081153c4\": container with ID starting with fb2890663a147cb15db2775683b45adf11f9fc0c19385a88be535fd8081153c4 not found: ID does not exist" containerID="fb2890663a147cb15db2775683b45adf11f9fc0c19385a88be535fd8081153c4" Apr 16 18:39:51.375052 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:51.375002 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb2890663a147cb15db2775683b45adf11f9fc0c19385a88be535fd8081153c4"} err="failed to get container status \"fb2890663a147cb15db2775683b45adf11f9fc0c19385a88be535fd8081153c4\": rpc error: code = NotFound desc = could not find container \"fb2890663a147cb15db2775683b45adf11f9fc0c19385a88be535fd8081153c4\": container with ID starting with fb2890663a147cb15db2775683b45adf11f9fc0c19385a88be535fd8081153c4 not found: ID does not exist" Apr 16 18:39:51.390493 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:51.390461 
2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-7c68cb4fc8-2rzmf"] Apr 16 18:39:51.392623 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:51.392598 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-7c68cb4fc8-2rzmf"] Apr 16 18:39:52.377519 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:52.377482 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7c68cb4fc8-9b8lj" event={"ID":"d0c3a4db-0a29-4ca7-bb64-1a36ca5b4b04","Type":"ContainerStarted","Data":"7fff5aa8faf70ce6758c5f1ca9fea23d14fae63c1759a6267da80294efbaf1fb"} Apr 16 18:39:52.377990 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:52.377701 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-7c68cb4fc8-9b8lj" Apr 16 18:39:52.394851 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:52.394797 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-7c68cb4fc8-9b8lj" podStartSLOduration=2.021051391 podStartE2EDuration="2.394780531s" podCreationTimestamp="2026-04-16 18:39:50 +0000 UTC" firstStartedPulling="2026-04-16 18:39:51.206928212 +0000 UTC m=+517.331455854" lastFinishedPulling="2026-04-16 18:39:51.580657339 +0000 UTC m=+517.705184994" observedRunningTime="2026-04-16 18:39:52.393886499 +0000 UTC m=+518.518414163" watchObservedRunningTime="2026-04-16 18:39:52.394780531 +0000 UTC m=+518.519308194" Apr 16 18:39:52.506364 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:39:52.506324 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aec32691-f3c1-4576-9dce-c2546809a15a" path="/var/lib/kubelet/pods/aec32691-f3c1-4576-9dce-c2546809a15a/volumes" Apr 16 18:40:23.386627 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:23.386592 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-7c68cb4fc8-9b8lj" 
Apr 16 18:40:24.308644 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:24.308607 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-wbz6b"] Apr 16 18:40:24.308986 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:24.308972 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aec32691-f3c1-4576-9dce-c2546809a15a" containerName="manager" Apr 16 18:40:24.309039 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:24.308987 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="aec32691-f3c1-4576-9dce-c2546809a15a" containerName="manager" Apr 16 18:40:24.309074 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:24.309058 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="aec32691-f3c1-4576-9dce-c2546809a15a" containerName="manager" Apr 16 18:40:24.312130 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:24.312113 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-wbz6b" Apr 16 18:40:24.315360 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:24.315334 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 16 18:40:24.315497 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:24.315403 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-2l7zc\"" Apr 16 18:40:24.322634 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:24.322605 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-wbz6b"] Apr 16 18:40:24.390944 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:24.390908 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f4de7741-e917-4daa-8942-198dcf0f92ea-cert\") pod \"odh-model-controller-696fc77849-wbz6b\" (UID: 
\"f4de7741-e917-4daa-8942-198dcf0f92ea\") " pod="kserve/odh-model-controller-696fc77849-wbz6b" Apr 16 18:40:24.391317 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:24.390988 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lkht\" (UniqueName: \"kubernetes.io/projected/f4de7741-e917-4daa-8942-198dcf0f92ea-kube-api-access-2lkht\") pod \"odh-model-controller-696fc77849-wbz6b\" (UID: \"f4de7741-e917-4daa-8942-198dcf0f92ea\") " pod="kserve/odh-model-controller-696fc77849-wbz6b" Apr 16 18:40:24.491661 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:24.491625 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2lkht\" (UniqueName: \"kubernetes.io/projected/f4de7741-e917-4daa-8942-198dcf0f92ea-kube-api-access-2lkht\") pod \"odh-model-controller-696fc77849-wbz6b\" (UID: \"f4de7741-e917-4daa-8942-198dcf0f92ea\") " pod="kserve/odh-model-controller-696fc77849-wbz6b" Apr 16 18:40:24.491854 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:24.491678 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f4de7741-e917-4daa-8942-198dcf0f92ea-cert\") pod \"odh-model-controller-696fc77849-wbz6b\" (UID: \"f4de7741-e917-4daa-8942-198dcf0f92ea\") " pod="kserve/odh-model-controller-696fc77849-wbz6b" Apr 16 18:40:24.491854 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:40:24.491825 2576 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 16 18:40:24.491942 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:40:24.491888 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4de7741-e917-4daa-8942-198dcf0f92ea-cert podName:f4de7741-e917-4daa-8942-198dcf0f92ea nodeName:}" failed. No retries permitted until 2026-04-16 18:40:24.991872838 +0000 UTC m=+551.116400480 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f4de7741-e917-4daa-8942-198dcf0f92ea-cert") pod "odh-model-controller-696fc77849-wbz6b" (UID: "f4de7741-e917-4daa-8942-198dcf0f92ea") : secret "odh-model-controller-webhook-cert" not found Apr 16 18:40:24.503708 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:24.503676 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lkht\" (UniqueName: \"kubernetes.io/projected/f4de7741-e917-4daa-8942-198dcf0f92ea-kube-api-access-2lkht\") pod \"odh-model-controller-696fc77849-wbz6b\" (UID: \"f4de7741-e917-4daa-8942-198dcf0f92ea\") " pod="kserve/odh-model-controller-696fc77849-wbz6b" Apr 16 18:40:24.995312 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:24.995254 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f4de7741-e917-4daa-8942-198dcf0f92ea-cert\") pod \"odh-model-controller-696fc77849-wbz6b\" (UID: \"f4de7741-e917-4daa-8942-198dcf0f92ea\") " pod="kserve/odh-model-controller-696fc77849-wbz6b" Apr 16 18:40:24.997671 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:24.997649 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f4de7741-e917-4daa-8942-198dcf0f92ea-cert\") pod \"odh-model-controller-696fc77849-wbz6b\" (UID: \"f4de7741-e917-4daa-8942-198dcf0f92ea\") " pod="kserve/odh-model-controller-696fc77849-wbz6b" Apr 16 18:40:25.224057 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:25.224021 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-wbz6b" Apr 16 18:40:25.354577 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:25.354548 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-wbz6b"] Apr 16 18:40:25.356288 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:40:25.356262 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4de7741_e917_4daa_8942_198dcf0f92ea.slice/crio-fc0017301cc373ac4e2b3ff333b5a0c3ad0122f6b2544307eb393c4d16f28ee5 WatchSource:0}: Error finding container fc0017301cc373ac4e2b3ff333b5a0c3ad0122f6b2544307eb393c4d16f28ee5: Status 404 returned error can't find the container with id fc0017301cc373ac4e2b3ff333b5a0c3ad0122f6b2544307eb393c4d16f28ee5 Apr 16 18:40:25.500330 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:25.500294 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-wbz6b" event={"ID":"f4de7741-e917-4daa-8942-198dcf0f92ea","Type":"ContainerStarted","Data":"fc0017301cc373ac4e2b3ff333b5a0c3ad0122f6b2544307eb393c4d16f28ee5"} Apr 16 18:40:28.514897 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:28.514862 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-wbz6b" event={"ID":"f4de7741-e917-4daa-8942-198dcf0f92ea","Type":"ContainerStarted","Data":"d25613aca75dba6c091879b1462b8341cbccb423cf8beba26212d282ea4e32b8"} Apr 16 18:40:28.515310 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:28.515009 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-wbz6b" Apr 16 18:40:28.537284 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:28.537228 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-wbz6b" podStartSLOduration=1.992584527 podStartE2EDuration="4.537209985s" 
podCreationTimestamp="2026-04-16 18:40:24 +0000 UTC" firstStartedPulling="2026-04-16 18:40:25.357606905 +0000 UTC m=+551.482134547" lastFinishedPulling="2026-04-16 18:40:27.902232163 +0000 UTC m=+554.026760005" observedRunningTime="2026-04-16 18:40:28.535839144 +0000 UTC m=+554.660366810" watchObservedRunningTime="2026-04-16 18:40:28.537209985 +0000 UTC m=+554.661737649" Apr 16 18:40:39.520865 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:39.520832 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-wbz6b" Apr 16 18:40:40.354463 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:40.354429 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-p545z"] Apr 16 18:40:40.357565 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:40.357548 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-p545z" Apr 16 18:40:40.365762 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:40.365718 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-p545z"] Apr 16 18:40:40.535551 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:40.535516 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckr7s\" (UniqueName: \"kubernetes.io/projected/eeb7972a-7f87-4db5-a779-0dcd6a9804a3-kube-api-access-ckr7s\") pod \"s3-init-p545z\" (UID: \"eeb7972a-7f87-4db5-a779-0dcd6a9804a3\") " pod="kserve/s3-init-p545z" Apr 16 18:40:40.636595 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:40.636486 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ckr7s\" (UniqueName: \"kubernetes.io/projected/eeb7972a-7f87-4db5-a779-0dcd6a9804a3-kube-api-access-ckr7s\") pod \"s3-init-p545z\" (UID: \"eeb7972a-7f87-4db5-a779-0dcd6a9804a3\") " pod="kserve/s3-init-p545z" Apr 16 18:40:40.645861 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:40.645827 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckr7s\" (UniqueName: \"kubernetes.io/projected/eeb7972a-7f87-4db5-a779-0dcd6a9804a3-kube-api-access-ckr7s\") pod \"s3-init-p545z\" (UID: \"eeb7972a-7f87-4db5-a779-0dcd6a9804a3\") " pod="kserve/s3-init-p545z" Apr 16 18:40:40.667678 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:40.667642 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-p545z" Apr 16 18:40:40.796236 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:40.796211 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-p545z"] Apr 16 18:40:40.798470 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:40:40.798444 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeeb7972a_7f87_4db5_a779_0dcd6a9804a3.slice/crio-fca5512cfa5ed4cdc47e760904130b1b389496910aae831ce1b1bf4f1340eb41 WatchSource:0}: Error finding container fca5512cfa5ed4cdc47e760904130b1b389496910aae831ce1b1bf4f1340eb41: Status 404 returned error can't find the container with id fca5512cfa5ed4cdc47e760904130b1b389496910aae831ce1b1bf4f1340eb41 Apr 16 18:40:41.565230 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:41.565194 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-p545z" event={"ID":"eeb7972a-7f87-4db5-a779-0dcd6a9804a3","Type":"ContainerStarted","Data":"fca5512cfa5ed4cdc47e760904130b1b389496910aae831ce1b1bf4f1340eb41"} Apr 16 18:40:45.584033 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:45.583941 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-p545z" event={"ID":"eeb7972a-7f87-4db5-a779-0dcd6a9804a3","Type":"ContainerStarted","Data":"05b3d7d5de81617fadaf6fe1876181e930b8bd6e66d443aac7033602dc61cb06"} Apr 16 18:40:45.616897 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:45.616849 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve/s3-init-p545z" podStartSLOduration=1.1378786 podStartE2EDuration="5.616829245s" podCreationTimestamp="2026-04-16 18:40:40 +0000 UTC" firstStartedPulling="2026-04-16 18:40:40.800120288 +0000 UTC m=+566.924647928" lastFinishedPulling="2026-04-16 18:40:45.279070933 +0000 UTC m=+571.403598573" observedRunningTime="2026-04-16 18:40:45.61679389 +0000 UTC m=+571.741321554" watchObservedRunningTime="2026-04-16 18:40:45.616829245 +0000 UTC m=+571.741356909" Apr 16 18:40:48.596478 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:48.596440 2576 generic.go:358] "Generic (PLEG): container finished" podID="eeb7972a-7f87-4db5-a779-0dcd6a9804a3" containerID="05b3d7d5de81617fadaf6fe1876181e930b8bd6e66d443aac7033602dc61cb06" exitCode=0 Apr 16 18:40:48.596889 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:48.596513 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-p545z" event={"ID":"eeb7972a-7f87-4db5-a779-0dcd6a9804a3","Type":"ContainerDied","Data":"05b3d7d5de81617fadaf6fe1876181e930b8bd6e66d443aac7033602dc61cb06"} Apr 16 18:40:49.733949 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:49.733923 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-p545z" Apr 16 18:40:49.818789 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:49.818731 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckr7s\" (UniqueName: \"kubernetes.io/projected/eeb7972a-7f87-4db5-a779-0dcd6a9804a3-kube-api-access-ckr7s\") pod \"eeb7972a-7f87-4db5-a779-0dcd6a9804a3\" (UID: \"eeb7972a-7f87-4db5-a779-0dcd6a9804a3\") " Apr 16 18:40:49.820869 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:49.820843 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eeb7972a-7f87-4db5-a779-0dcd6a9804a3-kube-api-access-ckr7s" (OuterVolumeSpecName: "kube-api-access-ckr7s") pod "eeb7972a-7f87-4db5-a779-0dcd6a9804a3" (UID: "eeb7972a-7f87-4db5-a779-0dcd6a9804a3"). InnerVolumeSpecName "kube-api-access-ckr7s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:40:49.920301 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:49.920204 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ckr7s\" (UniqueName: \"kubernetes.io/projected/eeb7972a-7f87-4db5-a779-0dcd6a9804a3-kube-api-access-ckr7s\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:40:50.605105 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:50.605080 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-p545z" Apr 16 18:40:50.605105 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:50.605107 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-p545z" event={"ID":"eeb7972a-7f87-4db5-a779-0dcd6a9804a3","Type":"ContainerDied","Data":"fca5512cfa5ed4cdc47e760904130b1b389496910aae831ce1b1bf4f1340eb41"} Apr 16 18:40:50.605322 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:40:50.605136 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fca5512cfa5ed4cdc47e760904130b1b389496910aae831ce1b1bf4f1340eb41" Apr 16 18:41:01.005190 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.005150 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-7zf2t"] Apr 16 18:41:01.005679 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.005658 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eeb7972a-7f87-4db5-a779-0dcd6a9804a3" containerName="s3-init" Apr 16 18:41:01.005835 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.005682 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeb7972a-7f87-4db5-a779-0dcd6a9804a3" containerName="s3-init" Apr 16 18:41:01.005835 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.005816 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="eeb7972a-7f87-4db5-a779-0dcd6a9804a3" containerName="s3-init" Apr 16 18:41:01.013682 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.013652 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-7zf2t"
Apr 16 18:41:01.016896 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.016851 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 18:41:01.016896 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.016885 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-1-openshift-default-dockercfg-qnz4k\""
Apr 16 18:41:01.017377 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.017096 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 16 18:41:01.017377 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.017201 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\""
Apr 16 18:41:01.023796 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.023715 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-7zf2t"]
Apr 16 18:41:01.121052 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.121012 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/432c6f05-8a90-4233-b480-3acc0d695596-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-7zf2t\" (UID: \"432c6f05-8a90-4233-b480-3acc0d695596\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-7zf2t"
Apr 16 18:41:01.121254 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.121064 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/432c6f05-8a90-4233-b480-3acc0d695596-istio-data\") pod
\"router-gateway-1-openshift-default-6c59fbf55c-7zf2t\" (UID: \"432c6f05-8a90-4233-b480-3acc0d695596\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-7zf2t"
Apr 16 18:41:01.121254 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.121132 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/432c6f05-8a90-4233-b480-3acc0d695596-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-7zf2t\" (UID: \"432c6f05-8a90-4233-b480-3acc0d695596\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-7zf2t"
Apr 16 18:41:01.121254 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.121179 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/432c6f05-8a90-4233-b480-3acc0d695596-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-7zf2t\" (UID: \"432c6f05-8a90-4233-b480-3acc0d695596\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-7zf2t"
Apr 16 18:41:01.121254 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.121210 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/432c6f05-8a90-4233-b480-3acc0d695596-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-7zf2t\" (UID: \"432c6f05-8a90-4233-b480-3acc0d695596\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-7zf2t"
Apr 16 18:41:01.121459 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.121257 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/432c6f05-8a90-4233-b480-3acc0d695596-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-7zf2t\" (UID:
\"432c6f05-8a90-4233-b480-3acc0d695596\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-7zf2t"
Apr 16 18:41:01.121459 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.121285 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/432c6f05-8a90-4233-b480-3acc0d695596-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-7zf2t\" (UID: \"432c6f05-8a90-4233-b480-3acc0d695596\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-7zf2t"
Apr 16 18:41:01.121459 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.121316 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/432c6f05-8a90-4233-b480-3acc0d695596-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-7zf2t\" (UID: \"432c6f05-8a90-4233-b480-3acc0d695596\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-7zf2t"
Apr 16 18:41:01.121459 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.121357 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdk8w\" (UniqueName: \"kubernetes.io/projected/432c6f05-8a90-4233-b480-3acc0d695596-kube-api-access-zdk8w\") pod \"router-gateway-1-openshift-default-6c59fbf55c-7zf2t\" (UID: \"432c6f05-8a90-4233-b480-3acc0d695596\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-7zf2t"
Apr 16 18:41:01.221972 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.221927 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/432c6f05-8a90-4233-b480-3acc0d695596-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-7zf2t\" (UID: \"432c6f05-8a90-4233-b480-3acc0d695596\") "
pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-7zf2t"
Apr 16 18:41:01.222170 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.221979 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/432c6f05-8a90-4233-b480-3acc0d695596-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-7zf2t\" (UID: \"432c6f05-8a90-4233-b480-3acc0d695596\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-7zf2t"
Apr 16 18:41:01.222170 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.222010 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/432c6f05-8a90-4233-b480-3acc0d695596-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-7zf2t\" (UID: \"432c6f05-8a90-4233-b480-3acc0d695596\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-7zf2t"
Apr 16 18:41:01.222170 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.222053 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/432c6f05-8a90-4233-b480-3acc0d695596-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-7zf2t\" (UID: \"432c6f05-8a90-4233-b480-3acc0d695596\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-7zf2t"
Apr 16 18:41:01.222170 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.222084 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/432c6f05-8a90-4233-b480-3acc0d695596-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-7zf2t\" (UID: \"432c6f05-8a90-4233-b480-3acc0d695596\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-7zf2t"
Apr 16 18:41:01.222170 ip-10-0-140-154 kubenswrapper[2576]: I0416
18:41:01.222119 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/432c6f05-8a90-4233-b480-3acc0d695596-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-7zf2t\" (UID: \"432c6f05-8a90-4233-b480-3acc0d695596\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-7zf2t"
Apr 16 18:41:01.222170 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.222149 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/432c6f05-8a90-4233-b480-3acc0d695596-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-7zf2t\" (UID: \"432c6f05-8a90-4233-b480-3acc0d695596\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-7zf2t"
Apr 16 18:41:01.222483 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.222178 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/432c6f05-8a90-4233-b480-3acc0d695596-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-7zf2t\" (UID: \"432c6f05-8a90-4233-b480-3acc0d695596\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-7zf2t"
Apr 16 18:41:01.222483 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.222210 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zdk8w\" (UniqueName: \"kubernetes.io/projected/432c6f05-8a90-4233-b480-3acc0d695596-kube-api-access-zdk8w\") pod \"router-gateway-1-openshift-default-6c59fbf55c-7zf2t\" (UID: \"432c6f05-8a90-4233-b480-3acc0d695596\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-7zf2t"
Apr 16 18:41:01.222483 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.222414 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName:
\"kubernetes.io/empty-dir/432c6f05-8a90-4233-b480-3acc0d695596-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-7zf2t\" (UID: \"432c6f05-8a90-4233-b480-3acc0d695596\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-7zf2t"
Apr 16 18:41:01.222717 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.222693 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/432c6f05-8a90-4233-b480-3acc0d695596-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-7zf2t\" (UID: \"432c6f05-8a90-4233-b480-3acc0d695596\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-7zf2t"
Apr 16 18:41:01.222812 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.222718 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/432c6f05-8a90-4233-b480-3acc0d695596-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-7zf2t\" (UID: \"432c6f05-8a90-4233-b480-3acc0d695596\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-7zf2t"
Apr 16 18:41:01.222975 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.222953 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/432c6f05-8a90-4233-b480-3acc0d695596-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-7zf2t\" (UID: \"432c6f05-8a90-4233-b480-3acc0d695596\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-7zf2t"
Apr 16 18:41:01.223061 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.223038 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/432c6f05-8a90-4233-b480-3acc0d695596-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-7zf2t\" (UID:
\"432c6f05-8a90-4233-b480-3acc0d695596\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-7zf2t"
Apr 16 18:41:01.224806 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.224772 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/432c6f05-8a90-4233-b480-3acc0d695596-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-7zf2t\" (UID: \"432c6f05-8a90-4233-b480-3acc0d695596\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-7zf2t"
Apr 16 18:41:01.224972 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.224948 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/432c6f05-8a90-4233-b480-3acc0d695596-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-7zf2t\" (UID: \"432c6f05-8a90-4233-b480-3acc0d695596\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-7zf2t"
Apr 16 18:41:01.231339 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.231304 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/432c6f05-8a90-4233-b480-3acc0d695596-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-7zf2t\" (UID: \"432c6f05-8a90-4233-b480-3acc0d695596\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-7zf2t"
Apr 16 18:41:01.231849 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.231804 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdk8w\" (UniqueName: \"kubernetes.io/projected/432c6f05-8a90-4233-b480-3acc0d695596-kube-api-access-zdk8w\") pod \"router-gateway-1-openshift-default-6c59fbf55c-7zf2t\" (UID: \"432c6f05-8a90-4233-b480-3acc0d695596\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-7zf2t"
Apr 16 18:41:01.329366 ip-10-0-140-154
kubenswrapper[2576]: I0416 18:41:01.329274 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-7zf2t"
Apr 16 18:41:01.483642 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.483614 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-7zf2t"]
Apr 16 18:41:01.485609 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:41:01.485584 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod432c6f05_8a90_4233_b480_3acc0d695596.slice/crio-bb9fafb982eed606d3f2a2c0c504a1a0a5ce68478ae1f8c8e05784ccb8d4ccd5 WatchSource:0}: Error finding container bb9fafb982eed606d3f2a2c0c504a1a0a5ce68478ae1f8c8e05784ccb8d4ccd5: Status 404 returned error can't find the container with id bb9fafb982eed606d3f2a2c0c504a1a0a5ce68478ae1f8c8e05784ccb8d4ccd5
Apr 16 18:41:01.489585 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.488537 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 16 18:41:01.489585 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.488619 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 16 18:41:01.489585 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.488660 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 16 18:41:01.649260 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.649155 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-7zf2t"
event={"ID":"432c6f05-8a90-4233-b480-3acc0d695596","Type":"ContainerStarted","Data":"fc663f28ab90699ed0c3252ff5d72bd6b12bb50d40f9425d6a14258298d12672"}
Apr 16 18:41:01.649260 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.649204 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-7zf2t" event={"ID":"432c6f05-8a90-4233-b480-3acc0d695596","Type":"ContainerStarted","Data":"bb9fafb982eed606d3f2a2c0c504a1a0a5ce68478ae1f8c8e05784ccb8d4ccd5"}
Apr 16 18:41:01.675607 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:01.675545 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-7zf2t" podStartSLOduration=1.6755239629999998 podStartE2EDuration="1.675523963s" podCreationTimestamp="2026-04-16 18:41:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:41:01.673340995 +0000 UTC m=+587.797868671" watchObservedRunningTime="2026-04-16 18:41:01.675523963 +0000 UTC m=+587.800051628"
Apr 16 18:41:02.330488 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:02.330443 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-7zf2t"
Apr 16 18:41:02.332235 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:02.332180 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-7zf2t" podUID="432c6f05-8a90-4233-b480-3acc0d695596" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.134.0.40:15021/healthz/ready\": dial tcp 10.134.0.40:15021: connect: connection refused"
Apr 16 18:41:03.333663 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:03.333632 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started"
pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-7zf2t"
Apr 16 18:41:03.657863 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:03.657769 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-7zf2t"
Apr 16 18:41:03.658890 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:03.658872 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-7zf2t"
Apr 16 18:41:07.991857 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:07.991820 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m"]
Apr 16 18:41:07.995477 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:07.995457 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m"
Apr 16 18:41:08.001588 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:08.001563 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-xbj66\""
Apr 16 18:41:08.001725 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:08.001627 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\""
Apr 16 18:41:08.008006 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:08.007978 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m"]
Apr 16 18:41:08.083236 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:08.083195 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/36fa3c3c-b531-470b-9e94-f42c92def1e5-dshm\") pod
\"scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m\" (UID: \"36fa3c3c-b531-470b-9e94-f42c92def1e5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m"
Apr 16 18:41:08.083416 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:08.083257 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhccz\" (UniqueName: \"kubernetes.io/projected/36fa3c3c-b531-470b-9e94-f42c92def1e5-kube-api-access-zhccz\") pod \"scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m\" (UID: \"36fa3c3c-b531-470b-9e94-f42c92def1e5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m"
Apr 16 18:41:08.083416 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:08.083337 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/36fa3c3c-b531-470b-9e94-f42c92def1e5-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m\" (UID: \"36fa3c3c-b531-470b-9e94-f42c92def1e5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m"
Apr 16 18:41:08.083416 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:08.083369 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/36fa3c3c-b531-470b-9e94-f42c92def1e5-home\") pod \"scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m\" (UID: \"36fa3c3c-b531-470b-9e94-f42c92def1e5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m"
Apr 16 18:41:08.083523 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:08.083444 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/36fa3c3c-b531-470b-9e94-f42c92def1e5-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m\"
(UID: \"36fa3c3c-b531-470b-9e94-f42c92def1e5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m"
Apr 16 18:41:08.083523 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:08.083473 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/36fa3c3c-b531-470b-9e94-f42c92def1e5-model-cache\") pod \"scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m\" (UID: \"36fa3c3c-b531-470b-9e94-f42c92def1e5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m"
Apr 16 18:41:08.184462 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:08.184428 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhccz\" (UniqueName: \"kubernetes.io/projected/36fa3c3c-b531-470b-9e94-f42c92def1e5-kube-api-access-zhccz\") pod \"scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m\" (UID: \"36fa3c3c-b531-470b-9e94-f42c92def1e5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m"
Apr 16 18:41:08.184692 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:08.184476 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/36fa3c3c-b531-470b-9e94-f42c92def1e5-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m\" (UID: \"36fa3c3c-b531-470b-9e94-f42c92def1e5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m"
Apr 16 18:41:08.184692 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:08.184498 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/36fa3c3c-b531-470b-9e94-f42c92def1e5-home\") pod \"scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m\" (UID: \"36fa3c3c-b531-470b-9e94-f42c92def1e5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m"
Apr 16
18:41:08.184692 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:08.184544 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/36fa3c3c-b531-470b-9e94-f42c92def1e5-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m\" (UID: \"36fa3c3c-b531-470b-9e94-f42c92def1e5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m"
Apr 16 18:41:08.184692 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:08.184569 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/36fa3c3c-b531-470b-9e94-f42c92def1e5-model-cache\") pod \"scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m\" (UID: \"36fa3c3c-b531-470b-9e94-f42c92def1e5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m"
Apr 16 18:41:08.184692 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:08.184594 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/36fa3c3c-b531-470b-9e94-f42c92def1e5-dshm\") pod \"scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m\" (UID: \"36fa3c3c-b531-470b-9e94-f42c92def1e5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m"
Apr 16 18:41:08.185056 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:08.185031 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/36fa3c3c-b531-470b-9e94-f42c92def1e5-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m\" (UID: \"36fa3c3c-b531-470b-9e94-f42c92def1e5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m"
Apr 16 18:41:08.185180 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:08.185056 2576 operation_generator.go:615] "MountVolume.SetUp
succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/36fa3c3c-b531-470b-9e94-f42c92def1e5-home\") pod \"scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m\" (UID: \"36fa3c3c-b531-470b-9e94-f42c92def1e5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m"
Apr 16 18:41:08.185180 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:08.185129 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/36fa3c3c-b531-470b-9e94-f42c92def1e5-model-cache\") pod \"scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m\" (UID: \"36fa3c3c-b531-470b-9e94-f42c92def1e5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m"
Apr 16 18:41:08.186804 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:08.186783 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/36fa3c3c-b531-470b-9e94-f42c92def1e5-dshm\") pod \"scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m\" (UID: \"36fa3c3c-b531-470b-9e94-f42c92def1e5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m"
Apr 16 18:41:08.187048 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:08.187032 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/36fa3c3c-b531-470b-9e94-f42c92def1e5-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m\" (UID: \"36fa3c3c-b531-470b-9e94-f42c92def1e5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m"
Apr 16 18:41:08.194299 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:08.194277 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhccz\" (UniqueName: \"kubernetes.io/projected/36fa3c3c-b531-470b-9e94-f42c92def1e5-kube-api-access-zhccz\") pod \"scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m\" (UID:
\"36fa3c3c-b531-470b-9e94-f42c92def1e5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m"
Apr 16 18:41:08.308627 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:08.308540 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m"
Apr 16 18:41:08.455920 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:08.455891 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m"]
Apr 16 18:41:08.457733 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:41:08.457709 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36fa3c3c_b531_470b_9e94_f42c92def1e5.slice/crio-0668c8a40f1302350d7e3a233f77cb2f9a6594675e2dc6d4a695ff3ae326d30b WatchSource:0}: Error finding container 0668c8a40f1302350d7e3a233f77cb2f9a6594675e2dc6d4a695ff3ae326d30b: Status 404 returned error can't find the container with id 0668c8a40f1302350d7e3a233f77cb2f9a6594675e2dc6d4a695ff3ae326d30b
Apr 16 18:41:08.678179 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:08.678091 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m" event={"ID":"36fa3c3c-b531-470b-9e94-f42c92def1e5","Type":"ContainerStarted","Data":"0668c8a40f1302350d7e3a233f77cb2f9a6594675e2dc6d4a695ff3ae326d30b"}
Apr 16 18:41:12.697599 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:12.697558 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m" event={"ID":"36fa3c3c-b531-470b-9e94-f42c92def1e5","Type":"ContainerStarted","Data":"66b001b5ee99e1216e8b323d38a1e282082775b780f0587b58db5d64e1ecc967"}
Apr 16 18:41:14.438155 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:14.438110 2576 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal_28c07a342a30c0e354482d7284dcbb2c/kube-rbac-proxy-crio/2.log"
Apr 16 18:41:14.439048 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:14.439025 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal_28c07a342a30c0e354482d7284dcbb2c/kube-rbac-proxy-crio/2.log"
Apr 16 18:41:16.716687 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:16.716653 2576 generic.go:358] "Generic (PLEG): container finished" podID="36fa3c3c-b531-470b-9e94-f42c92def1e5" containerID="66b001b5ee99e1216e8b323d38a1e282082775b780f0587b58db5d64e1ecc967" exitCode=0
Apr 16 18:41:16.717111 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:16.716711 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m" event={"ID":"36fa3c3c-b531-470b-9e94-f42c92def1e5","Type":"ContainerDied","Data":"66b001b5ee99e1216e8b323d38a1e282082775b780f0587b58db5d64e1ecc967"}
Apr 16 18:41:18.727903 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:18.727866 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m" event={"ID":"36fa3c3c-b531-470b-9e94-f42c92def1e5","Type":"ContainerStarted","Data":"9e67bd9207a2fd0a7883170cc948a763b5598c9e1e75adbac30c5364627d5687"}
Apr 16 18:41:18.750415 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:18.750356 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m" podStartSLOduration=2.386901947 podStartE2EDuration="11.750339677s" podCreationTimestamp="2026-04-16 18:41:07 +0000 UTC" firstStartedPulling="2026-04-16 18:41:08.459658536 +0000 UTC m=+594.584186176" lastFinishedPulling="2026-04-16 18:41:17.823096265 +0000 UTC m=+603.947623906"
observedRunningTime="2026-04-16 18:41:18.748063894 +0000 UTC m=+604.872591572" watchObservedRunningTime="2026-04-16 18:41:18.750339677 +0000 UTC m=+604.874867340"
Apr 16 18:41:27.064325 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:27.064292 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld"]
Apr 16 18:41:27.088021 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:27.087974 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld"]
Apr 16 18:41:27.088213 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:27.088117 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld"
Apr 16 18:41:27.091675 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:27.091650 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs\""
Apr 16 18:41:27.284274 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:27.284241 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kplb\" (UniqueName: \"kubernetes.io/projected/10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e-kube-api-access-8kplb\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld\" (UID: \"10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld"
Apr 16 18:41:27.284489 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:27.284298 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld\" (UID:
\"10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld" Apr 16 18:41:27.284489 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:27.284353 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld\" (UID: \"10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld" Apr 16 18:41:27.284489 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:27.284373 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld\" (UID: \"10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld" Apr 16 18:41:27.284489 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:27.284396 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld\" (UID: \"10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld" Apr 16 18:41:27.284489 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:27.284423 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld\" (UID: 
\"10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld" Apr 16 18:41:27.385405 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:27.385312 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld\" (UID: \"10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld" Apr 16 18:41:27.385405 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:27.385349 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld\" (UID: \"10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld" Apr 16 18:41:27.385405 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:27.385375 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld\" (UID: \"10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld" Apr 16 18:41:27.385405 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:27.385404 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld\" (UID: \"10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld" Apr 16 18:41:27.385725 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:27.385428 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8kplb\" (UniqueName: \"kubernetes.io/projected/10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e-kube-api-access-8kplb\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld\" (UID: \"10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld" Apr 16 18:41:27.385725 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:27.385468 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld\" (UID: \"10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld" Apr 16 18:41:27.385893 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:27.385851 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld\" (UID: \"10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld" Apr 16 18:41:27.385959 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:27.385893 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld\" (UID: \"10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld" Apr 16 18:41:27.386018 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:27.385978 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld\" (UID: \"10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld" Apr 16 18:41:27.387916 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:27.387884 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld\" (UID: \"10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld" Apr 16 18:41:27.388034 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:27.388016 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld\" (UID: \"10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld" Apr 16 18:41:27.394191 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:27.394165 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kplb\" (UniqueName: \"kubernetes.io/projected/10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e-kube-api-access-8kplb\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld\" (UID: \"10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld" Apr 16 
18:41:27.399094 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:27.399066 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld" Apr 16 18:41:27.531848 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:27.531823 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld"] Apr 16 18:41:27.533657 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:41:27.533626 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10fb91fe_d5a5_4e5c_ade9_1c5b8db87a1e.slice/crio-57b97b0936dc6babede1f7d99c50b2046bb926a943ef615360d0323e0c075a8b WatchSource:0}: Error finding container 57b97b0936dc6babede1f7d99c50b2046bb926a943ef615360d0323e0c075a8b: Status 404 returned error can't find the container with id 57b97b0936dc6babede1f7d99c50b2046bb926a943ef615360d0323e0c075a8b Apr 16 18:41:27.535436 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:27.535419 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:41:27.764380 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:27.764341 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld" event={"ID":"10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e","Type":"ContainerStarted","Data":"4ddb66ca0696c640f668c8aa2700587839ad37bf4dde8f81998b829b5ae93e9f"} Apr 16 18:41:27.764380 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:27.764384 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld" event={"ID":"10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e","Type":"ContainerStarted","Data":"57b97b0936dc6babede1f7d99c50b2046bb926a943ef615360d0323e0c075a8b"} Apr 16 18:41:28.309482 
ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:28.309440 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m" Apr 16 18:41:28.312374 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:28.312351 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m" Apr 16 18:41:28.333330 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:28.333299 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m" Apr 16 18:41:28.781160 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:28.781123 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m" Apr 16 18:41:32.785601 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:32.785564 2576 generic.go:358] "Generic (PLEG): container finished" podID="10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e" containerID="4ddb66ca0696c640f668c8aa2700587839ad37bf4dde8f81998b829b5ae93e9f" exitCode=0 Apr 16 18:41:32.786170 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:32.785641 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld" event={"ID":"10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e","Type":"ContainerDied","Data":"4ddb66ca0696c640f668c8aa2700587839ad37bf4dde8f81998b829b5ae93e9f"} Apr 16 18:41:33.793936 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:33.793887 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld" event={"ID":"10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e","Type":"ContainerStarted","Data":"1d727dd9b68f410ce10fc011a1d111608ee54aa8d2c8b6d593d3cdba03a476fc"} Apr 16 18:41:33.821956 ip-10-0-140-154 
kubenswrapper[2576]: I0416 18:41:33.821890 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld" podStartSLOduration=6.8218676 podStartE2EDuration="6.8218676s" podCreationTimestamp="2026-04-16 18:41:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:41:33.815485314 +0000 UTC m=+619.940012979" watchObservedRunningTime="2026-04-16 18:41:33.8218676 +0000 UTC m=+619.946395265" Apr 16 18:41:37.399372 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:37.399319 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld" Apr 16 18:41:37.399372 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:37.399379 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld" Apr 16 18:41:37.412149 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:37.412112 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld" Apr 16 18:41:37.818964 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:41:37.818933 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld" Apr 16 18:42:00.378191 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:00.378151 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m"] Apr 16 18:42:00.378606 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:00.378421 2576 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m" podUID="36fa3c3c-b531-470b-9e94-f42c92def1e5" containerName="main" containerID="cri-o://9e67bd9207a2fd0a7883170cc948a763b5598c9e1e75adbac30c5364627d5687" gracePeriod=30 Apr 16 18:42:00.644563 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:00.644536 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m" Apr 16 18:42:00.774737 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:00.774680 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/36fa3c3c-b531-470b-9e94-f42c92def1e5-model-cache\") pod \"36fa3c3c-b531-470b-9e94-f42c92def1e5\" (UID: \"36fa3c3c-b531-470b-9e94-f42c92def1e5\") " Apr 16 18:42:00.774975 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:00.774787 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/36fa3c3c-b531-470b-9e94-f42c92def1e5-tls-certs\") pod \"36fa3c3c-b531-470b-9e94-f42c92def1e5\" (UID: \"36fa3c3c-b531-470b-9e94-f42c92def1e5\") " Apr 16 18:42:00.774975 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:00.774861 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhccz\" (UniqueName: \"kubernetes.io/projected/36fa3c3c-b531-470b-9e94-f42c92def1e5-kube-api-access-zhccz\") pod \"36fa3c3c-b531-470b-9e94-f42c92def1e5\" (UID: \"36fa3c3c-b531-470b-9e94-f42c92def1e5\") " Apr 16 18:42:00.775105 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:00.774999 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36fa3c3c-b531-470b-9e94-f42c92def1e5-model-cache" (OuterVolumeSpecName: "model-cache") pod "36fa3c3c-b531-470b-9e94-f42c92def1e5" (UID: "36fa3c3c-b531-470b-9e94-f42c92def1e5"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:42:00.775105 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:00.775039 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/36fa3c3c-b531-470b-9e94-f42c92def1e5-home\") pod \"36fa3c3c-b531-470b-9e94-f42c92def1e5\" (UID: \"36fa3c3c-b531-470b-9e94-f42c92def1e5\") " Apr 16 18:42:00.775105 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:00.775082 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/36fa3c3c-b531-470b-9e94-f42c92def1e5-kserve-provision-location\") pod \"36fa3c3c-b531-470b-9e94-f42c92def1e5\" (UID: \"36fa3c3c-b531-470b-9e94-f42c92def1e5\") " Apr 16 18:42:00.775251 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:00.775186 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/36fa3c3c-b531-470b-9e94-f42c92def1e5-dshm\") pod \"36fa3c3c-b531-470b-9e94-f42c92def1e5\" (UID: \"36fa3c3c-b531-470b-9e94-f42c92def1e5\") " Apr 16 18:42:00.775325 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:00.775300 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36fa3c3c-b531-470b-9e94-f42c92def1e5-home" (OuterVolumeSpecName: "home") pod "36fa3c3c-b531-470b-9e94-f42c92def1e5" (UID: "36fa3c3c-b531-470b-9e94-f42c92def1e5"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:42:00.775486 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:00.775468 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/36fa3c3c-b531-470b-9e94-f42c92def1e5-model-cache\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:42:00.775549 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:00.775493 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/36fa3c3c-b531-470b-9e94-f42c92def1e5-home\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:42:00.777056 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:00.777030 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36fa3c3c-b531-470b-9e94-f42c92def1e5-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "36fa3c3c-b531-470b-9e94-f42c92def1e5" (UID: "36fa3c3c-b531-470b-9e94-f42c92def1e5"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:42:00.777206 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:00.777187 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36fa3c3c-b531-470b-9e94-f42c92def1e5-dshm" (OuterVolumeSpecName: "dshm") pod "36fa3c3c-b531-470b-9e94-f42c92def1e5" (UID: "36fa3c3c-b531-470b-9e94-f42c92def1e5"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:42:00.777288 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:00.777271 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36fa3c3c-b531-470b-9e94-f42c92def1e5-kube-api-access-zhccz" (OuterVolumeSpecName: "kube-api-access-zhccz") pod "36fa3c3c-b531-470b-9e94-f42c92def1e5" (UID: "36fa3c3c-b531-470b-9e94-f42c92def1e5"). InnerVolumeSpecName "kube-api-access-zhccz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:42:00.830682 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:00.830632 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36fa3c3c-b531-470b-9e94-f42c92def1e5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "36fa3c3c-b531-470b-9e94-f42c92def1e5" (UID: "36fa3c3c-b531-470b-9e94-f42c92def1e5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:42:00.876630 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:00.876591 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/36fa3c3c-b531-470b-9e94-f42c92def1e5-kserve-provision-location\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:42:00.876630 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:00.876625 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/36fa3c3c-b531-470b-9e94-f42c92def1e5-dshm\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:42:00.876630 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:00.876635 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/36fa3c3c-b531-470b-9e94-f42c92def1e5-tls-certs\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:42:00.876940 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:00.876651 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zhccz\" (UniqueName: \"kubernetes.io/projected/36fa3c3c-b531-470b-9e94-f42c92def1e5-kube-api-access-zhccz\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:42:00.909610 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:00.909511 2576 generic.go:358] "Generic (PLEG): container finished" podID="36fa3c3c-b531-470b-9e94-f42c92def1e5" 
containerID="9e67bd9207a2fd0a7883170cc948a763b5598c9e1e75adbac30c5364627d5687" exitCode=0 Apr 16 18:42:00.909610 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:00.909586 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m" Apr 16 18:42:00.909610 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:00.909591 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m" event={"ID":"36fa3c3c-b531-470b-9e94-f42c92def1e5","Type":"ContainerDied","Data":"9e67bd9207a2fd0a7883170cc948a763b5598c9e1e75adbac30c5364627d5687"} Apr 16 18:42:00.909921 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:00.909638 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m" event={"ID":"36fa3c3c-b531-470b-9e94-f42c92def1e5","Type":"ContainerDied","Data":"0668c8a40f1302350d7e3a233f77cb2f9a6594675e2dc6d4a695ff3ae326d30b"} Apr 16 18:42:00.909921 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:00.909659 2576 scope.go:117] "RemoveContainer" containerID="9e67bd9207a2fd0a7883170cc948a763b5598c9e1e75adbac30c5364627d5687" Apr 16 18:42:00.919030 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:00.919009 2576 scope.go:117] "RemoveContainer" containerID="66b001b5ee99e1216e8b323d38a1e282082775b780f0587b58db5d64e1ecc967" Apr 16 18:42:00.929731 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:00.929710 2576 scope.go:117] "RemoveContainer" containerID="9e67bd9207a2fd0a7883170cc948a763b5598c9e1e75adbac30c5364627d5687" Apr 16 18:42:00.930062 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:42:00.930043 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e67bd9207a2fd0a7883170cc948a763b5598c9e1e75adbac30c5364627d5687\": container with ID starting with 
9e67bd9207a2fd0a7883170cc948a763b5598c9e1e75adbac30c5364627d5687 not found: ID does not exist" containerID="9e67bd9207a2fd0a7883170cc948a763b5598c9e1e75adbac30c5364627d5687" Apr 16 18:42:00.930114 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:00.930072 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e67bd9207a2fd0a7883170cc948a763b5598c9e1e75adbac30c5364627d5687"} err="failed to get container status \"9e67bd9207a2fd0a7883170cc948a763b5598c9e1e75adbac30c5364627d5687\": rpc error: code = NotFound desc = could not find container \"9e67bd9207a2fd0a7883170cc948a763b5598c9e1e75adbac30c5364627d5687\": container with ID starting with 9e67bd9207a2fd0a7883170cc948a763b5598c9e1e75adbac30c5364627d5687 not found: ID does not exist" Apr 16 18:42:00.930114 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:00.930090 2576 scope.go:117] "RemoveContainer" containerID="66b001b5ee99e1216e8b323d38a1e282082775b780f0587b58db5d64e1ecc967" Apr 16 18:42:00.930344 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:42:00.930327 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66b001b5ee99e1216e8b323d38a1e282082775b780f0587b58db5d64e1ecc967\": container with ID starting with 66b001b5ee99e1216e8b323d38a1e282082775b780f0587b58db5d64e1ecc967 not found: ID does not exist" containerID="66b001b5ee99e1216e8b323d38a1e282082775b780f0587b58db5d64e1ecc967" Apr 16 18:42:00.930394 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:00.930353 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66b001b5ee99e1216e8b323d38a1e282082775b780f0587b58db5d64e1ecc967"} err="failed to get container status \"66b001b5ee99e1216e8b323d38a1e282082775b780f0587b58db5d64e1ecc967\": rpc error: code = NotFound desc = could not find container \"66b001b5ee99e1216e8b323d38a1e282082775b780f0587b58db5d64e1ecc967\": container with ID starting with 
66b001b5ee99e1216e8b323d38a1e282082775b780f0587b58db5d64e1ecc967 not found: ID does not exist" Apr 16 18:42:00.936281 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:00.936249 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m"] Apr 16 18:42:00.943190 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:00.943156 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-bpw8m"] Apr 16 18:42:02.506840 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:02.506808 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36fa3c3c-b531-470b-9e94-f42c92def1e5" path="/var/lib/kubelet/pods/36fa3c3c-b531-470b-9e94-f42c92def1e5/volumes" Apr 16 18:42:10.568860 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:10.568826 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx"] Apr 16 18:42:10.569371 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:10.569335 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="36fa3c3c-b531-470b-9e94-f42c92def1e5" containerName="storage-initializer" Apr 16 18:42:10.569371 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:10.569354 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="36fa3c3c-b531-470b-9e94-f42c92def1e5" containerName="storage-initializer" Apr 16 18:42:10.569500 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:10.569382 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="36fa3c3c-b531-470b-9e94-f42c92def1e5" containerName="main" Apr 16 18:42:10.569500 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:10.569391 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="36fa3c3c-b531-470b-9e94-f42c92def1e5" containerName="main" Apr 16 18:42:10.569500 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:10.569488 2576 
memory_manager.go:356] "RemoveStaleState removing state" podUID="36fa3c3c-b531-470b-9e94-f42c92def1e5" containerName="main" Apr 16 18:42:10.572816 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:10.572794 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" Apr 16 18:42:10.575719 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:10.575696 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-5n2pf\"" Apr 16 18:42:10.576001 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:10.575734 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 16 18:42:10.602785 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:10.602723 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx"] Apr 16 18:42:10.663653 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:10.663617 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99dmg\" (UniqueName: \"kubernetes.io/projected/f6fd8233-115b-443f-a1cd-259cc2b21058-kube-api-access-99dmg\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx\" (UID: \"f6fd8233-115b-443f-a1cd-259cc2b21058\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" Apr 16 18:42:10.663861 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:10.663681 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f6fd8233-115b-443f-a1cd-259cc2b21058-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx\" (UID: \"f6fd8233-115b-443f-a1cd-259cc2b21058\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" Apr 16 18:42:10.663861 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:10.663716 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6fd8233-115b-443f-a1cd-259cc2b21058-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx\" (UID: \"f6fd8233-115b-443f-a1cd-259cc2b21058\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" Apr 16 18:42:10.663861 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:10.663752 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f6fd8233-115b-443f-a1cd-259cc2b21058-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx\" (UID: \"f6fd8233-115b-443f-a1cd-259cc2b21058\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" Apr 16 18:42:10.663861 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:10.663791 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f6fd8233-115b-443f-a1cd-259cc2b21058-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx\" (UID: \"f6fd8233-115b-443f-a1cd-259cc2b21058\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" Apr 16 18:42:10.663861 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:10.663843 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f6fd8233-115b-443f-a1cd-259cc2b21058-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx\" (UID: 
\"f6fd8233-115b-443f-a1cd-259cc2b21058\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" Apr 16 18:42:10.764537 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:10.764477 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f6fd8233-115b-443f-a1cd-259cc2b21058-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx\" (UID: \"f6fd8233-115b-443f-a1cd-259cc2b21058\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" Apr 16 18:42:10.764729 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:10.764593 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f6fd8233-115b-443f-a1cd-259cc2b21058-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx\" (UID: \"f6fd8233-115b-443f-a1cd-259cc2b21058\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" Apr 16 18:42:10.764729 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:10.764640 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-99dmg\" (UniqueName: \"kubernetes.io/projected/f6fd8233-115b-443f-a1cd-259cc2b21058-kube-api-access-99dmg\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx\" (UID: \"f6fd8233-115b-443f-a1cd-259cc2b21058\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" Apr 16 18:42:10.764729 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:10.764691 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f6fd8233-115b-443f-a1cd-259cc2b21058-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx\" (UID: \"f6fd8233-115b-443f-a1cd-259cc2b21058\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" Apr 16 18:42:10.764909 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:10.764729 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6fd8233-115b-443f-a1cd-259cc2b21058-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx\" (UID: \"f6fd8233-115b-443f-a1cd-259cc2b21058\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" Apr 16 18:42:10.764909 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:10.764794 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f6fd8233-115b-443f-a1cd-259cc2b21058-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx\" (UID: \"f6fd8233-115b-443f-a1cd-259cc2b21058\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" Apr 16 18:42:10.765034 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:10.765015 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f6fd8233-115b-443f-a1cd-259cc2b21058-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx\" (UID: \"f6fd8233-115b-443f-a1cd-259cc2b21058\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" Apr 16 18:42:10.765071 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:10.765049 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f6fd8233-115b-443f-a1cd-259cc2b21058-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx\" (UID: \"f6fd8233-115b-443f-a1cd-259cc2b21058\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" Apr 16 18:42:10.765106 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:10.765070 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f6fd8233-115b-443f-a1cd-259cc2b21058-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx\" (UID: \"f6fd8233-115b-443f-a1cd-259cc2b21058\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" Apr 16 18:42:10.765142 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:10.765122 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6fd8233-115b-443f-a1cd-259cc2b21058-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx\" (UID: \"f6fd8233-115b-443f-a1cd-259cc2b21058\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" Apr 16 18:42:10.767323 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:10.767300 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f6fd8233-115b-443f-a1cd-259cc2b21058-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx\" (UID: \"f6fd8233-115b-443f-a1cd-259cc2b21058\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" Apr 16 18:42:10.773482 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:10.773457 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-99dmg\" (UniqueName: \"kubernetes.io/projected/f6fd8233-115b-443f-a1cd-259cc2b21058-kube-api-access-99dmg\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx\" (UID: \"f6fd8233-115b-443f-a1cd-259cc2b21058\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" Apr 16 18:42:10.882466 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:10.882370 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" Apr 16 18:42:11.229874 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:11.229846 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx"] Apr 16 18:42:11.231575 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:42:11.231551 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6fd8233_115b_443f_a1cd_259cc2b21058.slice/crio-cb2f60b2359b8451a069f5221507c104697258142db5c912453a415656c70e63 WatchSource:0}: Error finding container cb2f60b2359b8451a069f5221507c104697258142db5c912453a415656c70e63: Status 404 returned error can't find the container with id cb2f60b2359b8451a069f5221507c104697258142db5c912453a415656c70e63 Apr 16 18:42:11.954439 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:11.954403 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" event={"ID":"f6fd8233-115b-443f-a1cd-259cc2b21058","Type":"ContainerStarted","Data":"7ce1fd23e0720264215e1eb2c009b41cab3d6b001b5900a5f9816a1607e36ca0"} Apr 16 18:42:11.954439 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:11.954441 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" event={"ID":"f6fd8233-115b-443f-a1cd-259cc2b21058","Type":"ContainerStarted","Data":"cb2f60b2359b8451a069f5221507c104697258142db5c912453a415656c70e63"} Apr 16 18:42:12.360479 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:12.360388 2576 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld"] Apr 16 18:42:12.360710 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:12.360682 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld" podUID="10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e" containerName="main" containerID="cri-o://1d727dd9b68f410ce10fc011a1d111608ee54aa8d2c8b6d593d3cdba03a476fc" gracePeriod=30 Apr 16 18:42:12.618541 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:12.618468 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld" Apr 16 18:42:12.681703 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:12.681667 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kplb\" (UniqueName: \"kubernetes.io/projected/10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e-kube-api-access-8kplb\") pod \"10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e\" (UID: \"10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e\") " Apr 16 18:42:12.681929 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:12.681722 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e-home\") pod \"10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e\" (UID: \"10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e\") " Apr 16 18:42:12.681929 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:12.681772 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e-tls-certs\") pod \"10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e\" (UID: \"10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e\") " Apr 16 18:42:12.681929 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:12.681818 2576 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e-dshm\") pod \"10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e\" (UID: \"10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e\") " Apr 16 18:42:12.681929 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:12.681850 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e-model-cache\") pod \"10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e\" (UID: \"10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e\") " Apr 16 18:42:12.681929 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:12.681906 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e-kserve-provision-location\") pod \"10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e\" (UID: \"10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e\") " Apr 16 18:42:12.682195 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:12.682011 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e-home" (OuterVolumeSpecName: "home") pod "10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e" (UID: "10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:42:12.682195 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:12.682181 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e-model-cache" (OuterVolumeSpecName: "model-cache") pod "10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e" (UID: "10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:42:12.682284 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:12.682204 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e-home\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:42:12.683951 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:12.683918 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e-dshm" (OuterVolumeSpecName: "dshm") pod "10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e" (UID: "10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:42:12.684065 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:12.683985 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e-kube-api-access-8kplb" (OuterVolumeSpecName: "kube-api-access-8kplb") pod "10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e" (UID: "10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e"). InnerVolumeSpecName "kube-api-access-8kplb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:42:12.684160 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:12.684143 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e" (UID: "10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:42:12.738605 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:12.738559 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e" (UID: "10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:42:12.783593 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:12.783558 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e-tls-certs\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:42:12.783593 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:12.783587 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e-dshm\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:42:12.783593 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:12.783596 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e-model-cache\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:42:12.783872 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:12.783606 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e-kserve-provision-location\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:42:12.783872 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:12.783617 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8kplb\" (UniqueName: 
\"kubernetes.io/projected/10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e-kube-api-access-8kplb\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:42:12.959898 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:12.959863 2576 generic.go:358] "Generic (PLEG): container finished" podID="10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e" containerID="1d727dd9b68f410ce10fc011a1d111608ee54aa8d2c8b6d593d3cdba03a476fc" exitCode=0 Apr 16 18:42:12.960352 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:12.959939 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld" Apr 16 18:42:12.960352 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:12.959946 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld" event={"ID":"10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e","Type":"ContainerDied","Data":"1d727dd9b68f410ce10fc011a1d111608ee54aa8d2c8b6d593d3cdba03a476fc"} Apr 16 18:42:12.960352 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:12.959983 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld" event={"ID":"10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e","Type":"ContainerDied","Data":"57b97b0936dc6babede1f7d99c50b2046bb926a943ef615360d0323e0c075a8b"} Apr 16 18:42:12.960352 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:12.959998 2576 scope.go:117] "RemoveContainer" containerID="1d727dd9b68f410ce10fc011a1d111608ee54aa8d2c8b6d593d3cdba03a476fc" Apr 16 18:42:12.961398 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:12.961371 2576 generic.go:358] "Generic (PLEG): container finished" podID="f6fd8233-115b-443f-a1cd-259cc2b21058" containerID="7ce1fd23e0720264215e1eb2c009b41cab3d6b001b5900a5f9816a1607e36ca0" exitCode=0 Apr 16 18:42:12.961530 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:12.961450 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" event={"ID":"f6fd8233-115b-443f-a1cd-259cc2b21058","Type":"ContainerDied","Data":"7ce1fd23e0720264215e1eb2c009b41cab3d6b001b5900a5f9816a1607e36ca0"} Apr 16 18:42:12.969611 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:12.969592 2576 scope.go:117] "RemoveContainer" containerID="4ddb66ca0696c640f668c8aa2700587839ad37bf4dde8f81998b829b5ae93e9f" Apr 16 18:42:12.985505 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:12.985456 2576 scope.go:117] "RemoveContainer" containerID="1d727dd9b68f410ce10fc011a1d111608ee54aa8d2c8b6d593d3cdba03a476fc" Apr 16 18:42:12.985825 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:42:12.985804 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d727dd9b68f410ce10fc011a1d111608ee54aa8d2c8b6d593d3cdba03a476fc\": container with ID starting with 1d727dd9b68f410ce10fc011a1d111608ee54aa8d2c8b6d593d3cdba03a476fc not found: ID does not exist" containerID="1d727dd9b68f410ce10fc011a1d111608ee54aa8d2c8b6d593d3cdba03a476fc" Apr 16 18:42:12.985896 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:12.985839 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d727dd9b68f410ce10fc011a1d111608ee54aa8d2c8b6d593d3cdba03a476fc"} err="failed to get container status \"1d727dd9b68f410ce10fc011a1d111608ee54aa8d2c8b6d593d3cdba03a476fc\": rpc error: code = NotFound desc = could not find container \"1d727dd9b68f410ce10fc011a1d111608ee54aa8d2c8b6d593d3cdba03a476fc\": container with ID starting with 1d727dd9b68f410ce10fc011a1d111608ee54aa8d2c8b6d593d3cdba03a476fc not found: ID does not exist" Apr 16 18:42:12.985896 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:12.985859 2576 scope.go:117] "RemoveContainer" containerID="4ddb66ca0696c640f668c8aa2700587839ad37bf4dde8f81998b829b5ae93e9f" Apr 16 
18:42:12.986172 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:42:12.986145 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ddb66ca0696c640f668c8aa2700587839ad37bf4dde8f81998b829b5ae93e9f\": container with ID starting with 4ddb66ca0696c640f668c8aa2700587839ad37bf4dde8f81998b829b5ae93e9f not found: ID does not exist" containerID="4ddb66ca0696c640f668c8aa2700587839ad37bf4dde8f81998b829b5ae93e9f" Apr 16 18:42:12.986255 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:12.986179 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ddb66ca0696c640f668c8aa2700587839ad37bf4dde8f81998b829b5ae93e9f"} err="failed to get container status \"4ddb66ca0696c640f668c8aa2700587839ad37bf4dde8f81998b829b5ae93e9f\": rpc error: code = NotFound desc = could not find container \"4ddb66ca0696c640f668c8aa2700587839ad37bf4dde8f81998b829b5ae93e9f\": container with ID starting with 4ddb66ca0696c640f668c8aa2700587839ad37bf4dde8f81998b829b5ae93e9f not found: ID does not exist" Apr 16 18:42:12.999388 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:12.999355 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld"] Apr 16 18:42:13.003846 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:13.003819 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-dc9cd8f8b-cw5ld"] Apr 16 18:42:13.968065 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:13.968033 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" event={"ID":"f6fd8233-115b-443f-a1cd-259cc2b21058","Type":"ContainerStarted","Data":"33272c22db652ae1e67dc699f5e5b6ae2190b47c846b193dd52f6e90e61e89dd"} Apr 16 18:42:14.510556 ip-10-0-140-154 kubenswrapper[2576]: I0416 
18:42:14.510119 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e" path="/var/lib/kubelet/pods/10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e/volumes" Apr 16 18:42:20.679157 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:20.679118 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx"] Apr 16 18:42:20.679608 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:20.679588 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e" containerName="storage-initializer" Apr 16 18:42:20.679608 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:20.679609 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e" containerName="storage-initializer" Apr 16 18:42:20.679790 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:20.679620 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e" containerName="main" Apr 16 18:42:20.679790 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:20.679627 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e" containerName="main" Apr 16 18:42:20.679790 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:20.679716 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="10fb91fe-d5a5-4e5c-ade9-1c5b8db87a1e" containerName="main" Apr 16 18:42:20.683555 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:20.683533 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" Apr 16 18:42:20.686512 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:20.686484 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\"" Apr 16 18:42:20.694601 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:20.694458 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx"] Apr 16 18:42:20.759021 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:20.758983 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/133cc38b-b9bb-45f9-bec5-1e40151f4310-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx\" (UID: \"133cc38b-b9bb-45f9-bec5-1e40151f4310\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" Apr 16 18:42:20.759204 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:20.759055 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/133cc38b-b9bb-45f9-bec5-1e40151f4310-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx\" (UID: \"133cc38b-b9bb-45f9-bec5-1e40151f4310\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" Apr 16 18:42:20.759204 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:20.759095 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/133cc38b-b9bb-45f9-bec5-1e40151f4310-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx\" (UID: 
\"133cc38b-b9bb-45f9-bec5-1e40151f4310\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" Apr 16 18:42:20.759204 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:20.759116 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/133cc38b-b9bb-45f9-bec5-1e40151f4310-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx\" (UID: \"133cc38b-b9bb-45f9-bec5-1e40151f4310\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" Apr 16 18:42:20.759204 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:20.759140 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/133cc38b-b9bb-45f9-bec5-1e40151f4310-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx\" (UID: \"133cc38b-b9bb-45f9-bec5-1e40151f4310\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" Apr 16 18:42:20.759204 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:20.759170 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrrwr\" (UniqueName: \"kubernetes.io/projected/133cc38b-b9bb-45f9-bec5-1e40151f4310-kube-api-access-qrrwr\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx\" (UID: \"133cc38b-b9bb-45f9-bec5-1e40151f4310\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" Apr 16 18:42:20.860091 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:20.860055 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/133cc38b-b9bb-45f9-bec5-1e40151f4310-kserve-provision-location\") pod 
\"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx\" (UID: \"133cc38b-b9bb-45f9-bec5-1e40151f4310\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" Apr 16 18:42:20.860305 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:20.860109 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/133cc38b-b9bb-45f9-bec5-1e40151f4310-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx\" (UID: \"133cc38b-b9bb-45f9-bec5-1e40151f4310\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" Apr 16 18:42:20.860305 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:20.860139 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/133cc38b-b9bb-45f9-bec5-1e40151f4310-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx\" (UID: \"133cc38b-b9bb-45f9-bec5-1e40151f4310\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" Apr 16 18:42:20.860305 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:20.860165 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/133cc38b-b9bb-45f9-bec5-1e40151f4310-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx\" (UID: \"133cc38b-b9bb-45f9-bec5-1e40151f4310\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" Apr 16 18:42:20.860305 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:20.860193 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qrrwr\" (UniqueName: \"kubernetes.io/projected/133cc38b-b9bb-45f9-bec5-1e40151f4310-kube-api-access-qrrwr\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx\" (UID: 
\"133cc38b-b9bb-45f9-bec5-1e40151f4310\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" Apr 16 18:42:20.860305 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:20.860266 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/133cc38b-b9bb-45f9-bec5-1e40151f4310-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx\" (UID: \"133cc38b-b9bb-45f9-bec5-1e40151f4310\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" Apr 16 18:42:20.860669 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:20.860559 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/133cc38b-b9bb-45f9-bec5-1e40151f4310-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx\" (UID: \"133cc38b-b9bb-45f9-bec5-1e40151f4310\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" Apr 16 18:42:20.860772 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:20.860682 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/133cc38b-b9bb-45f9-bec5-1e40151f4310-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx\" (UID: \"133cc38b-b9bb-45f9-bec5-1e40151f4310\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" Apr 16 18:42:20.860881 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:20.860859 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/133cc38b-b9bb-45f9-bec5-1e40151f4310-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx\" (UID: \"133cc38b-b9bb-45f9-bec5-1e40151f4310\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" Apr 16 18:42:20.862793 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:20.862762 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/133cc38b-b9bb-45f9-bec5-1e40151f4310-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx\" (UID: \"133cc38b-b9bb-45f9-bec5-1e40151f4310\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" Apr 16 18:42:20.863210 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:20.863187 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/133cc38b-b9bb-45f9-bec5-1e40151f4310-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx\" (UID: \"133cc38b-b9bb-45f9-bec5-1e40151f4310\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" Apr 16 18:42:20.875361 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:20.875327 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrrwr\" (UniqueName: \"kubernetes.io/projected/133cc38b-b9bb-45f9-bec5-1e40151f4310-kube-api-access-qrrwr\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx\" (UID: \"133cc38b-b9bb-45f9-bec5-1e40151f4310\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" Apr 16 18:42:20.997605 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:20.997567 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" Apr 16 18:42:21.162507 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:21.162471 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx"] Apr 16 18:42:21.163431 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:42:21.163398 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod133cc38b_b9bb_45f9_bec5_1e40151f4310.slice/crio-b642cc6af030edb2b5caef53d4a27b647e68335b0a115b6fc82c488f9d472464 WatchSource:0}: Error finding container b642cc6af030edb2b5caef53d4a27b647e68335b0a115b6fc82c488f9d472464: Status 404 returned error can't find the container with id b642cc6af030edb2b5caef53d4a27b647e68335b0a115b6fc82c488f9d472464 Apr 16 18:42:22.009721 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:22.009679 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" event={"ID":"133cc38b-b9bb-45f9-bec5-1e40151f4310","Type":"ContainerStarted","Data":"7d7a524a14d28ed17513cb93c3b845f377b65c4eba84654ded98ab8ecfe191db"} Apr 16 18:42:22.010532 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:22.009730 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" event={"ID":"133cc38b-b9bb-45f9-bec5-1e40151f4310","Type":"ContainerStarted","Data":"b642cc6af030edb2b5caef53d4a27b647e68335b0a115b6fc82c488f9d472464"} Apr 16 18:42:26.125205 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:42:26.125162 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod133cc38b_b9bb_45f9_bec5_1e40151f4310.slice/crio-conmon-7d7a524a14d28ed17513cb93c3b845f377b65c4eba84654ded98ab8ecfe191db.scope\": RecentStats: unable to find data in memory cache]" Apr 16 18:42:27.038472 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:27.038432 2576 generic.go:358] "Generic (PLEG): container finished" podID="133cc38b-b9bb-45f9-bec5-1e40151f4310" containerID="7d7a524a14d28ed17513cb93c3b845f377b65c4eba84654ded98ab8ecfe191db" exitCode=0 Apr 16 18:42:27.038636 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:27.038496 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" event={"ID":"133cc38b-b9bb-45f9-bec5-1e40151f4310","Type":"ContainerDied","Data":"7d7a524a14d28ed17513cb93c3b845f377b65c4eba84654ded98ab8ecfe191db"} Apr 16 18:42:55.172454 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:55.172410 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" event={"ID":"f6fd8233-115b-443f-a1cd-259cc2b21058","Type":"ContainerStarted","Data":"c6713f86411eb593356ce69814ea1e44120108d2585957d2b8522cefc71feca6"} Apr 16 18:42:55.172993 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:55.172867 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" Apr 16 18:42:55.175452 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:55.175414 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" podUID="f6fd8233-115b-443f-a1cd-259cc2b21058" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 16 18:42:55.197724 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:55.197666 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" podStartSLOduration=4.043332399 podStartE2EDuration="45.19765177s" podCreationTimestamp="2026-04-16 18:42:10 +0000 UTC" firstStartedPulling="2026-04-16 18:42:12.96280057 +0000 UTC m=+659.087328228" lastFinishedPulling="2026-04-16 18:42:54.117119944 +0000 UTC m=+700.241647599" observedRunningTime="2026-04-16 18:42:55.195732947 +0000 UTC m=+701.320260612" watchObservedRunningTime="2026-04-16 18:42:55.19765177 +0000 UTC m=+701.322179451" Apr 16 18:42:56.178438 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:42:56.178396 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" podUID="f6fd8233-115b-443f-a1cd-259cc2b21058" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 16 18:43:00.883058 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:00.883020 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" Apr 16 18:43:00.883058 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:00.883064 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" Apr 16 18:43:00.883811 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:00.883410 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" podUID="f6fd8233-115b-443f-a1cd-259cc2b21058" containerName="tokenizer" probeResult="failure" output="Get \"http://10.134.0.43:8082/healthz\": dial tcp 10.134.0.43:8082: connect: connection refused" Apr 16 18:43:00.884669 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:00.884626 2576 prober.go:120] 
"Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" podUID="f6fd8233-115b-443f-a1cd-259cc2b21058" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 16 18:43:10.884787 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:10.884727 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" podUID="f6fd8233-115b-443f-a1cd-259cc2b21058" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 16 18:43:10.885384 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:10.885362 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" Apr 16 18:43:10.886572 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:10.886547 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" podUID="f6fd8233-115b-443f-a1cd-259cc2b21058" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 16 18:43:10.886695 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:10.886669 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" Apr 16 18:43:11.245815 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:11.245776 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" podUID="f6fd8233-115b-443f-a1cd-259cc2b21058" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 16 18:43:13.253710 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:13.253674 2576 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" event={"ID":"133cc38b-b9bb-45f9-bec5-1e40151f4310","Type":"ContainerStarted","Data":"8772b667dec6ebbd059008bb1083dfa2c289361329fe83c55ccda7507f926298"} Apr 16 18:43:13.279619 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:13.279564 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" podStartSLOduration=7.825886281 podStartE2EDuration="53.279548216s" podCreationTimestamp="2026-04-16 18:42:20 +0000 UTC" firstStartedPulling="2026-04-16 18:42:27.039777647 +0000 UTC m=+673.164305294" lastFinishedPulling="2026-04-16 18:43:12.493439588 +0000 UTC m=+718.617967229" observedRunningTime="2026-04-16 18:43:13.276947441 +0000 UTC m=+719.401475094" watchObservedRunningTime="2026-04-16 18:43:13.279548216 +0000 UTC m=+719.404075889" Apr 16 18:43:18.614534 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:18.614494 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx"] Apr 16 18:43:18.615091 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:18.614863 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" podUID="f6fd8233-115b-443f-a1cd-259cc2b21058" containerName="main" containerID="cri-o://33272c22db652ae1e67dc699f5e5b6ae2190b47c846b193dd52f6e90e61e89dd" gracePeriod=30 Apr 16 18:43:18.615091 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:18.614869 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" podUID="f6fd8233-115b-443f-a1cd-259cc2b21058" containerName="tokenizer" 
containerID="cri-o://c6713f86411eb593356ce69814ea1e44120108d2585957d2b8522cefc71feca6" gracePeriod=30 Apr 16 18:43:18.616813 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:18.616782 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" podUID="f6fd8233-115b-443f-a1cd-259cc2b21058" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 16 18:43:19.279186 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:19.279154 2576 generic.go:358] "Generic (PLEG): container finished" podID="f6fd8233-115b-443f-a1cd-259cc2b21058" containerID="33272c22db652ae1e67dc699f5e5b6ae2190b47c846b193dd52f6e90e61e89dd" exitCode=0 Apr 16 18:43:19.279353 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:19.279223 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" event={"ID":"f6fd8233-115b-443f-a1cd-259cc2b21058","Type":"ContainerDied","Data":"33272c22db652ae1e67dc699f5e5b6ae2190b47c846b193dd52f6e90e61e89dd"} Apr 16 18:43:20.101852 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:20.101830 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" Apr 16 18:43:20.137455 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:20.137421 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6fd8233-115b-443f-a1cd-259cc2b21058-kserve-provision-location\") pod \"f6fd8233-115b-443f-a1cd-259cc2b21058\" (UID: \"f6fd8233-115b-443f-a1cd-259cc2b21058\") " Apr 16 18:43:20.137642 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:20.137493 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f6fd8233-115b-443f-a1cd-259cc2b21058-tokenizer-tmp\") pod \"f6fd8233-115b-443f-a1cd-259cc2b21058\" (UID: \"f6fd8233-115b-443f-a1cd-259cc2b21058\") " Apr 16 18:43:20.137642 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:20.137524 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f6fd8233-115b-443f-a1cd-259cc2b21058-tokenizer-uds\") pod \"f6fd8233-115b-443f-a1cd-259cc2b21058\" (UID: \"f6fd8233-115b-443f-a1cd-259cc2b21058\") " Apr 16 18:43:20.137642 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:20.137551 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f6fd8233-115b-443f-a1cd-259cc2b21058-tokenizer-cache\") pod \"f6fd8233-115b-443f-a1cd-259cc2b21058\" (UID: \"f6fd8233-115b-443f-a1cd-259cc2b21058\") " Apr 16 18:43:20.137642 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:20.137593 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99dmg\" (UniqueName: \"kubernetes.io/projected/f6fd8233-115b-443f-a1cd-259cc2b21058-kube-api-access-99dmg\") pod \"f6fd8233-115b-443f-a1cd-259cc2b21058\" (UID: 
\"f6fd8233-115b-443f-a1cd-259cc2b21058\") " Apr 16 18:43:20.137887 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:20.137699 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f6fd8233-115b-443f-a1cd-259cc2b21058-tls-certs\") pod \"f6fd8233-115b-443f-a1cd-259cc2b21058\" (UID: \"f6fd8233-115b-443f-a1cd-259cc2b21058\") " Apr 16 18:43:20.137887 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:20.137797 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6fd8233-115b-443f-a1cd-259cc2b21058-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "f6fd8233-115b-443f-a1cd-259cc2b21058" (UID: "f6fd8233-115b-443f-a1cd-259cc2b21058"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:43:20.137999 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:20.137895 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6fd8233-115b-443f-a1cd-259cc2b21058-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "f6fd8233-115b-443f-a1cd-259cc2b21058" (UID: "f6fd8233-115b-443f-a1cd-259cc2b21058"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:43:20.138054 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:20.138007 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f6fd8233-115b-443f-a1cd-259cc2b21058-tokenizer-tmp\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:43:20.138054 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:20.138027 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f6fd8233-115b-443f-a1cd-259cc2b21058-tokenizer-uds\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:43:20.138244 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:20.138211 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6fd8233-115b-443f-a1cd-259cc2b21058-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "f6fd8233-115b-443f-a1cd-259cc2b21058" (UID: "f6fd8233-115b-443f-a1cd-259cc2b21058"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:43:20.138678 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:20.138648 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6fd8233-115b-443f-a1cd-259cc2b21058-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f6fd8233-115b-443f-a1cd-259cc2b21058" (UID: "f6fd8233-115b-443f-a1cd-259cc2b21058"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:43:20.140082 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:20.140046 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6fd8233-115b-443f-a1cd-259cc2b21058-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f6fd8233-115b-443f-a1cd-259cc2b21058" (UID: "f6fd8233-115b-443f-a1cd-259cc2b21058"). 
InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:43:20.140185 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:20.140108 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6fd8233-115b-443f-a1cd-259cc2b21058-kube-api-access-99dmg" (OuterVolumeSpecName: "kube-api-access-99dmg") pod "f6fd8233-115b-443f-a1cd-259cc2b21058" (UID: "f6fd8233-115b-443f-a1cd-259cc2b21058"). InnerVolumeSpecName "kube-api-access-99dmg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:43:20.238725 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:20.238692 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f6fd8233-115b-443f-a1cd-259cc2b21058-tls-certs\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:43:20.238725 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:20.238722 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6fd8233-115b-443f-a1cd-259cc2b21058-kserve-provision-location\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:43:20.238946 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:20.238735 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f6fd8233-115b-443f-a1cd-259cc2b21058-tokenizer-cache\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:43:20.238946 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:20.238771 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-99dmg\" (UniqueName: \"kubernetes.io/projected/f6fd8233-115b-443f-a1cd-259cc2b21058-kube-api-access-99dmg\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:43:20.284446 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:20.284414 2576 generic.go:358] "Generic (PLEG): container finished" 
podID="f6fd8233-115b-443f-a1cd-259cc2b21058" containerID="c6713f86411eb593356ce69814ea1e44120108d2585957d2b8522cefc71feca6" exitCode=0 Apr 16 18:43:20.284637 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:20.284488 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" Apr 16 18:43:20.284637 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:20.284498 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" event={"ID":"f6fd8233-115b-443f-a1cd-259cc2b21058","Type":"ContainerDied","Data":"c6713f86411eb593356ce69814ea1e44120108d2585957d2b8522cefc71feca6"} Apr 16 18:43:20.284637 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:20.284548 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx" event={"ID":"f6fd8233-115b-443f-a1cd-259cc2b21058","Type":"ContainerDied","Data":"cb2f60b2359b8451a069f5221507c104697258142db5c912453a415656c70e63"} Apr 16 18:43:20.284637 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:20.284570 2576 scope.go:117] "RemoveContainer" containerID="c6713f86411eb593356ce69814ea1e44120108d2585957d2b8522cefc71feca6" Apr 16 18:43:20.295154 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:20.295129 2576 scope.go:117] "RemoveContainer" containerID="33272c22db652ae1e67dc699f5e5b6ae2190b47c846b193dd52f6e90e61e89dd" Apr 16 18:43:20.304044 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:20.304022 2576 scope.go:117] "RemoveContainer" containerID="7ce1fd23e0720264215e1eb2c009b41cab3d6b001b5900a5f9816a1607e36ca0" Apr 16 18:43:20.309029 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:20.308998 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx"] Apr 16 18:43:20.313445 ip-10-0-140-154 
kubenswrapper[2576]: I0416 18:43:20.313403 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9x65xx"] Apr 16 18:43:20.316691 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:20.316666 2576 scope.go:117] "RemoveContainer" containerID="c6713f86411eb593356ce69814ea1e44120108d2585957d2b8522cefc71feca6" Apr 16 18:43:20.317202 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:43:20.317182 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6713f86411eb593356ce69814ea1e44120108d2585957d2b8522cefc71feca6\": container with ID starting with c6713f86411eb593356ce69814ea1e44120108d2585957d2b8522cefc71feca6 not found: ID does not exist" containerID="c6713f86411eb593356ce69814ea1e44120108d2585957d2b8522cefc71feca6" Apr 16 18:43:20.317298 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:20.317214 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6713f86411eb593356ce69814ea1e44120108d2585957d2b8522cefc71feca6"} err="failed to get container status \"c6713f86411eb593356ce69814ea1e44120108d2585957d2b8522cefc71feca6\": rpc error: code = NotFound desc = could not find container \"c6713f86411eb593356ce69814ea1e44120108d2585957d2b8522cefc71feca6\": container with ID starting with c6713f86411eb593356ce69814ea1e44120108d2585957d2b8522cefc71feca6 not found: ID does not exist" Apr 16 18:43:20.317298 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:20.317239 2576 scope.go:117] "RemoveContainer" containerID="33272c22db652ae1e67dc699f5e5b6ae2190b47c846b193dd52f6e90e61e89dd" Apr 16 18:43:20.317722 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:43:20.317702 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33272c22db652ae1e67dc699f5e5b6ae2190b47c846b193dd52f6e90e61e89dd\": container with ID starting with 
33272c22db652ae1e67dc699f5e5b6ae2190b47c846b193dd52f6e90e61e89dd not found: ID does not exist" containerID="33272c22db652ae1e67dc699f5e5b6ae2190b47c846b193dd52f6e90e61e89dd" Apr 16 18:43:20.317804 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:20.317727 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33272c22db652ae1e67dc699f5e5b6ae2190b47c846b193dd52f6e90e61e89dd"} err="failed to get container status \"33272c22db652ae1e67dc699f5e5b6ae2190b47c846b193dd52f6e90e61e89dd\": rpc error: code = NotFound desc = could not find container \"33272c22db652ae1e67dc699f5e5b6ae2190b47c846b193dd52f6e90e61e89dd\": container with ID starting with 33272c22db652ae1e67dc699f5e5b6ae2190b47c846b193dd52f6e90e61e89dd not found: ID does not exist" Apr 16 18:43:20.317804 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:20.317777 2576 scope.go:117] "RemoveContainer" containerID="7ce1fd23e0720264215e1eb2c009b41cab3d6b001b5900a5f9816a1607e36ca0" Apr 16 18:43:20.318091 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:43:20.318069 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ce1fd23e0720264215e1eb2c009b41cab3d6b001b5900a5f9816a1607e36ca0\": container with ID starting with 7ce1fd23e0720264215e1eb2c009b41cab3d6b001b5900a5f9816a1607e36ca0 not found: ID does not exist" containerID="7ce1fd23e0720264215e1eb2c009b41cab3d6b001b5900a5f9816a1607e36ca0" Apr 16 18:43:20.318139 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:20.318096 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ce1fd23e0720264215e1eb2c009b41cab3d6b001b5900a5f9816a1607e36ca0"} err="failed to get container status \"7ce1fd23e0720264215e1eb2c009b41cab3d6b001b5900a5f9816a1607e36ca0\": rpc error: code = NotFound desc = could not find container \"7ce1fd23e0720264215e1eb2c009b41cab3d6b001b5900a5f9816a1607e36ca0\": container with ID starting with 
7ce1fd23e0720264215e1eb2c009b41cab3d6b001b5900a5f9816a1607e36ca0 not found: ID does not exist" Apr 16 18:43:20.507666 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:20.507627 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6fd8233-115b-443f-a1cd-259cc2b21058" path="/var/lib/kubelet/pods/f6fd8233-115b-443f-a1cd-259cc2b21058/volumes" Apr 16 18:43:20.998459 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:20.998424 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" Apr 16 18:43:20.998641 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:20.998519 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" Apr 16 18:43:21.000197 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:21.000168 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" podUID="133cc38b-b9bb-45f9-bec5-1e40151f4310" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8000/health\": dial tcp 10.134.0.44:8000: connect: connection refused" Apr 16 18:43:26.466390 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:26.466359 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-xzfpr"] Apr 16 18:43:26.466824 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:26.466702 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6fd8233-115b-443f-a1cd-259cc2b21058" containerName="tokenizer" Apr 16 18:43:26.466824 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:26.466713 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6fd8233-115b-443f-a1cd-259cc2b21058" containerName="tokenizer" Apr 16 18:43:26.466824 ip-10-0-140-154 kubenswrapper[2576]: I0416 
18:43:26.466722 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6fd8233-115b-443f-a1cd-259cc2b21058" containerName="storage-initializer" Apr 16 18:43:26.466824 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:26.466727 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6fd8233-115b-443f-a1cd-259cc2b21058" containerName="storage-initializer" Apr 16 18:43:26.466824 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:26.466758 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6fd8233-115b-443f-a1cd-259cc2b21058" containerName="main" Apr 16 18:43:26.466824 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:26.466764 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6fd8233-115b-443f-a1cd-259cc2b21058" containerName="main" Apr 16 18:43:26.466824 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:26.466823 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6fd8233-115b-443f-a1cd-259cc2b21058" containerName="main" Apr 16 18:43:26.467101 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:26.466835 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6fd8233-115b-443f-a1cd-259cc2b21058" containerName="tokenizer" Apr 16 18:43:26.474103 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:26.474078 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-xzfpr" Apr 16 18:43:26.478364 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:26.478333 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 16 18:43:26.480246 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:26.480217 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-xzfpr"] Apr 16 18:43:26.598299 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:26.598260 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8fccba01-cd11-4268-8e35-cbfaa221800d-dshm\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-xzfpr\" (UID: \"8fccba01-cd11-4268-8e35-cbfaa221800d\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-xzfpr" Apr 16 18:43:26.598299 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:26.598305 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thxqm\" (UniqueName: \"kubernetes.io/projected/8fccba01-cd11-4268-8e35-cbfaa221800d-kube-api-access-thxqm\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-xzfpr\" (UID: \"8fccba01-cd11-4268-8e35-cbfaa221800d\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-xzfpr" Apr 16 18:43:26.598542 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:26.598333 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8fccba01-cd11-4268-8e35-cbfaa221800d-home\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-xzfpr\" (UID: \"8fccba01-cd11-4268-8e35-cbfaa221800d\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-xzfpr" Apr 16 18:43:26.598542 ip-10-0-140-154 
kubenswrapper[2576]: I0416 18:43:26.598411 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8fccba01-cd11-4268-8e35-cbfaa221800d-tls-certs\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-xzfpr\" (UID: \"8fccba01-cd11-4268-8e35-cbfaa221800d\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-xzfpr" Apr 16 18:43:26.598542 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:26.598437 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8fccba01-cd11-4268-8e35-cbfaa221800d-model-cache\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-xzfpr\" (UID: \"8fccba01-cd11-4268-8e35-cbfaa221800d\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-xzfpr" Apr 16 18:43:26.598542 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:26.598475 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8fccba01-cd11-4268-8e35-cbfaa221800d-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-xzfpr\" (UID: \"8fccba01-cd11-4268-8e35-cbfaa221800d\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-xzfpr" Apr 16 18:43:26.698962 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:26.698923 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8fccba01-cd11-4268-8e35-cbfaa221800d-tls-certs\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-xzfpr\" (UID: \"8fccba01-cd11-4268-8e35-cbfaa221800d\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-xzfpr" Apr 16 18:43:26.699140 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:26.698971 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8fccba01-cd11-4268-8e35-cbfaa221800d-model-cache\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-xzfpr\" (UID: \"8fccba01-cd11-4268-8e35-cbfaa221800d\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-xzfpr" Apr 16 18:43:26.699140 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:26.699017 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8fccba01-cd11-4268-8e35-cbfaa221800d-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-xzfpr\" (UID: \"8fccba01-cd11-4268-8e35-cbfaa221800d\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-xzfpr" Apr 16 18:43:26.699140 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:26.699054 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8fccba01-cd11-4268-8e35-cbfaa221800d-dshm\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-xzfpr\" (UID: \"8fccba01-cd11-4268-8e35-cbfaa221800d\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-xzfpr" Apr 16 18:43:26.699140 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:26.699084 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thxqm\" (UniqueName: \"kubernetes.io/projected/8fccba01-cd11-4268-8e35-cbfaa221800d-kube-api-access-thxqm\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-xzfpr\" (UID: \"8fccba01-cd11-4268-8e35-cbfaa221800d\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-xzfpr" Apr 16 18:43:26.699140 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:26.699109 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8fccba01-cd11-4268-8e35-cbfaa221800d-home\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-xzfpr\" (UID: 
\"8fccba01-cd11-4268-8e35-cbfaa221800d\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-xzfpr" Apr 16 18:43:26.699521 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:26.699491 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8fccba01-cd11-4268-8e35-cbfaa221800d-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-xzfpr\" (UID: \"8fccba01-cd11-4268-8e35-cbfaa221800d\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-xzfpr" Apr 16 18:43:26.699600 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:26.699497 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8fccba01-cd11-4268-8e35-cbfaa221800d-model-cache\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-xzfpr\" (UID: \"8fccba01-cd11-4268-8e35-cbfaa221800d\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-xzfpr" Apr 16 18:43:26.699600 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:26.699552 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8fccba01-cd11-4268-8e35-cbfaa221800d-home\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-xzfpr\" (UID: \"8fccba01-cd11-4268-8e35-cbfaa221800d\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-xzfpr" Apr 16 18:43:26.701837 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:26.701808 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8fccba01-cd11-4268-8e35-cbfaa221800d-dshm\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-xzfpr\" (UID: \"8fccba01-cd11-4268-8e35-cbfaa221800d\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-xzfpr" Apr 16 18:43:26.702167 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:26.702145 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8fccba01-cd11-4268-8e35-cbfaa221800d-tls-certs\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-xzfpr\" (UID: \"8fccba01-cd11-4268-8e35-cbfaa221800d\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-xzfpr" Apr 16 18:43:26.714695 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:26.714654 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-thxqm\" (UniqueName: \"kubernetes.io/projected/8fccba01-cd11-4268-8e35-cbfaa221800d-kube-api-access-thxqm\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-xzfpr\" (UID: \"8fccba01-cd11-4268-8e35-cbfaa221800d\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-xzfpr" Apr 16 18:43:26.785676 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:26.785574 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-xzfpr" Apr 16 18:43:26.926101 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:26.926075 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-xzfpr"] Apr 16 18:43:26.928451 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:43:26.928418 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fccba01_cd11_4268_8e35_cbfaa221800d.slice/crio-bd2b8a4114a119dae0b843b4bcec7a5662ebe1a3dbb64dd153a88b51f05b42af WatchSource:0}: Error finding container bd2b8a4114a119dae0b843b4bcec7a5662ebe1a3dbb64dd153a88b51f05b42af: Status 404 returned error can't find the container with id bd2b8a4114a119dae0b843b4bcec7a5662ebe1a3dbb64dd153a88b51f05b42af Apr 16 18:43:27.319009 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:27.318966 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-xzfpr" event={"ID":"8fccba01-cd11-4268-8e35-cbfaa221800d","Type":"ContainerStarted","Data":"3ec2034f1b83ea9d59bdc6f1b10b5854dc219b9791c9fb733ff92763ab68bf75"} Apr 16 18:43:27.319009 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:27.319014 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-xzfpr" event={"ID":"8fccba01-cd11-4268-8e35-cbfaa221800d","Type":"ContainerStarted","Data":"bd2b8a4114a119dae0b843b4bcec7a5662ebe1a3dbb64dd153a88b51f05b42af"} Apr 16 18:43:30.998153 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:30.998112 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" podUID="133cc38b-b9bb-45f9-bec5-1e40151f4310" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8000/health\": dial tcp 10.134.0.44:8000: connect: connection refused" Apr 16 18:43:32.339202 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:32.339166 2576 generic.go:358] "Generic (PLEG): container finished" podID="8fccba01-cd11-4268-8e35-cbfaa221800d" containerID="3ec2034f1b83ea9d59bdc6f1b10b5854dc219b9791c9fb733ff92763ab68bf75" exitCode=0 Apr 16 18:43:32.339618 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:32.339239 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-xzfpr" event={"ID":"8fccba01-cd11-4268-8e35-cbfaa221800d","Type":"ContainerDied","Data":"3ec2034f1b83ea9d59bdc6f1b10b5854dc219b9791c9fb733ff92763ab68bf75"} Apr 16 18:43:33.344758 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:33.344714 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-xzfpr" 
event={"ID":"8fccba01-cd11-4268-8e35-cbfaa221800d","Type":"ContainerStarted","Data":"cbb96a193de1fbbd90db8fc0872d96ddb1f852814d798af7968cfa5891c49f43"} Apr 16 18:43:33.367932 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:33.367847 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-xzfpr" podStartSLOduration=7.367828797 podStartE2EDuration="7.367828797s" podCreationTimestamp="2026-04-16 18:43:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:43:33.366784906 +0000 UTC m=+739.491312569" watchObservedRunningTime="2026-04-16 18:43:33.367828797 +0000 UTC m=+739.492356460" Apr 16 18:43:36.786278 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:36.786237 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-xzfpr" Apr 16 18:43:36.786278 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:36.786286 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-xzfpr" Apr 16 18:43:36.800061 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:36.800031 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-xzfpr" Apr 16 18:43:37.371445 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:37.371410 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-xzfpr" Apr 16 18:43:40.998644 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:40.998605 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" podUID="133cc38b-b9bb-45f9-bec5-1e40151f4310" containerName="main" 
probeResult="failure" output="Get \"https://10.134.0.44:8000/health\": dial tcp 10.134.0.44:8000: connect: connection refused" Apr 16 18:43:50.998845 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:50.998798 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" podUID="133cc38b-b9bb-45f9-bec5-1e40151f4310" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8000/health\": dial tcp 10.134.0.44:8000: connect: connection refused" Apr 16 18:43:59.935393 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:59.935357 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-xzfpr"] Apr 16 18:43:59.935984 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:43:59.935771 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-xzfpr" podUID="8fccba01-cd11-4268-8e35-cbfaa221800d" containerName="main" containerID="cri-o://cbb96a193de1fbbd90db8fc0872d96ddb1f852814d798af7968cfa5891c49f43" gracePeriod=30 Apr 16 18:44:00.190357 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:00.190282 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-xzfpr" Apr 16 18:44:00.216871 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:00.216839 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8fccba01-cd11-4268-8e35-cbfaa221800d-home\") pod \"8fccba01-cd11-4268-8e35-cbfaa221800d\" (UID: \"8fccba01-cd11-4268-8e35-cbfaa221800d\") " Apr 16 18:44:00.217071 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:00.216913 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8fccba01-cd11-4268-8e35-cbfaa221800d-kserve-provision-location\") pod \"8fccba01-cd11-4268-8e35-cbfaa221800d\" (UID: \"8fccba01-cd11-4268-8e35-cbfaa221800d\") " Apr 16 18:44:00.217071 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:00.216944 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8fccba01-cd11-4268-8e35-cbfaa221800d-dshm\") pod \"8fccba01-cd11-4268-8e35-cbfaa221800d\" (UID: \"8fccba01-cd11-4268-8e35-cbfaa221800d\") " Apr 16 18:44:00.217071 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:00.217041 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8fccba01-cd11-4268-8e35-cbfaa221800d-model-cache\") pod \"8fccba01-cd11-4268-8e35-cbfaa221800d\" (UID: \"8fccba01-cd11-4268-8e35-cbfaa221800d\") " Apr 16 18:44:00.217248 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:00.217087 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thxqm\" (UniqueName: \"kubernetes.io/projected/8fccba01-cd11-4268-8e35-cbfaa221800d-kube-api-access-thxqm\") pod \"8fccba01-cd11-4268-8e35-cbfaa221800d\" (UID: \"8fccba01-cd11-4268-8e35-cbfaa221800d\") " Apr 16 18:44:00.217248 ip-10-0-140-154 
kubenswrapper[2576]: I0416 18:44:00.217156 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8fccba01-cd11-4268-8e35-cbfaa221800d-tls-certs\") pod \"8fccba01-cd11-4268-8e35-cbfaa221800d\" (UID: \"8fccba01-cd11-4268-8e35-cbfaa221800d\") " Apr 16 18:44:00.217248 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:00.217159 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fccba01-cd11-4268-8e35-cbfaa221800d-home" (OuterVolumeSpecName: "home") pod "8fccba01-cd11-4268-8e35-cbfaa221800d" (UID: "8fccba01-cd11-4268-8e35-cbfaa221800d"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:44:00.217516 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:00.217436 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8fccba01-cd11-4268-8e35-cbfaa221800d-home\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:44:00.217645 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:00.217625 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fccba01-cd11-4268-8e35-cbfaa221800d-model-cache" (OuterVolumeSpecName: "model-cache") pod "8fccba01-cd11-4268-8e35-cbfaa221800d" (UID: "8fccba01-cd11-4268-8e35-cbfaa221800d"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:44:00.221129 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:00.221085 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fccba01-cd11-4268-8e35-cbfaa221800d-dshm" (OuterVolumeSpecName: "dshm") pod "8fccba01-cd11-4268-8e35-cbfaa221800d" (UID: "8fccba01-cd11-4268-8e35-cbfaa221800d"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:44:00.222270 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:00.222241 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fccba01-cd11-4268-8e35-cbfaa221800d-kube-api-access-thxqm" (OuterVolumeSpecName: "kube-api-access-thxqm") pod "8fccba01-cd11-4268-8e35-cbfaa221800d" (UID: "8fccba01-cd11-4268-8e35-cbfaa221800d"). InnerVolumeSpecName "kube-api-access-thxqm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:44:00.222886 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:00.222776 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fccba01-cd11-4268-8e35-cbfaa221800d-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "8fccba01-cd11-4268-8e35-cbfaa221800d" (UID: "8fccba01-cd11-4268-8e35-cbfaa221800d"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:44:00.286822 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:00.286775 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fccba01-cd11-4268-8e35-cbfaa221800d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8fccba01-cd11-4268-8e35-cbfaa221800d" (UID: "8fccba01-cd11-4268-8e35-cbfaa221800d"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:44:00.317990 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:00.317950 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8fccba01-cd11-4268-8e35-cbfaa221800d-kserve-provision-location\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:44:00.317990 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:00.317983 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8fccba01-cd11-4268-8e35-cbfaa221800d-dshm\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:44:00.317990 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:00.317998 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8fccba01-cd11-4268-8e35-cbfaa221800d-model-cache\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:44:00.318221 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:00.318011 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-thxqm\" (UniqueName: \"kubernetes.io/projected/8fccba01-cd11-4268-8e35-cbfaa221800d-kube-api-access-thxqm\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:44:00.318221 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:00.318023 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8fccba01-cd11-4268-8e35-cbfaa221800d-tls-certs\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:44:00.457088 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:00.456989 2576 generic.go:358] "Generic (PLEG): container finished" podID="8fccba01-cd11-4268-8e35-cbfaa221800d" containerID="cbb96a193de1fbbd90db8fc0872d96ddb1f852814d798af7968cfa5891c49f43" exitCode=0 Apr 16 18:44:00.457088 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:00.457059 2576 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-xzfpr" Apr 16 18:44:00.457088 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:00.457075 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-xzfpr" event={"ID":"8fccba01-cd11-4268-8e35-cbfaa221800d","Type":"ContainerDied","Data":"cbb96a193de1fbbd90db8fc0872d96ddb1f852814d798af7968cfa5891c49f43"} Apr 16 18:44:00.457365 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:00.457121 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-xzfpr" event={"ID":"8fccba01-cd11-4268-8e35-cbfaa221800d","Type":"ContainerDied","Data":"bd2b8a4114a119dae0b843b4bcec7a5662ebe1a3dbb64dd153a88b51f05b42af"} Apr 16 18:44:00.457365 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:00.457141 2576 scope.go:117] "RemoveContainer" containerID="cbb96a193de1fbbd90db8fc0872d96ddb1f852814d798af7968cfa5891c49f43" Apr 16 18:44:00.466735 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:00.466714 2576 scope.go:117] "RemoveContainer" containerID="3ec2034f1b83ea9d59bdc6f1b10b5854dc219b9791c9fb733ff92763ab68bf75" Apr 16 18:44:00.484496 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:00.484456 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-xzfpr"] Apr 16 18:44:00.490575 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:00.487247 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-xzfpr"] Apr 16 18:44:00.490852 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:00.490822 2576 scope.go:117] "RemoveContainer" containerID="cbb96a193de1fbbd90db8fc0872d96ddb1f852814d798af7968cfa5891c49f43" Apr 16 18:44:00.491296 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:44:00.491272 2576 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"cbb96a193de1fbbd90db8fc0872d96ddb1f852814d798af7968cfa5891c49f43\": container with ID starting with cbb96a193de1fbbd90db8fc0872d96ddb1f852814d798af7968cfa5891c49f43 not found: ID does not exist" containerID="cbb96a193de1fbbd90db8fc0872d96ddb1f852814d798af7968cfa5891c49f43" Apr 16 18:44:00.491398 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:00.491309 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbb96a193de1fbbd90db8fc0872d96ddb1f852814d798af7968cfa5891c49f43"} err="failed to get container status \"cbb96a193de1fbbd90db8fc0872d96ddb1f852814d798af7968cfa5891c49f43\": rpc error: code = NotFound desc = could not find container \"cbb96a193de1fbbd90db8fc0872d96ddb1f852814d798af7968cfa5891c49f43\": container with ID starting with cbb96a193de1fbbd90db8fc0872d96ddb1f852814d798af7968cfa5891c49f43 not found: ID does not exist" Apr 16 18:44:00.491398 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:00.491336 2576 scope.go:117] "RemoveContainer" containerID="3ec2034f1b83ea9d59bdc6f1b10b5854dc219b9791c9fb733ff92763ab68bf75" Apr 16 18:44:00.491699 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:44:00.491674 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ec2034f1b83ea9d59bdc6f1b10b5854dc219b9791c9fb733ff92763ab68bf75\": container with ID starting with 3ec2034f1b83ea9d59bdc6f1b10b5854dc219b9791c9fb733ff92763ab68bf75 not found: ID does not exist" containerID="3ec2034f1b83ea9d59bdc6f1b10b5854dc219b9791c9fb733ff92763ab68bf75" Apr 16 18:44:00.491809 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:00.491709 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ec2034f1b83ea9d59bdc6f1b10b5854dc219b9791c9fb733ff92763ab68bf75"} err="failed to get container status 
\"3ec2034f1b83ea9d59bdc6f1b10b5854dc219b9791c9fb733ff92763ab68bf75\": rpc error: code = NotFound desc = could not find container \"3ec2034f1b83ea9d59bdc6f1b10b5854dc219b9791c9fb733ff92763ab68bf75\": container with ID starting with 3ec2034f1b83ea9d59bdc6f1b10b5854dc219b9791c9fb733ff92763ab68bf75 not found: ID does not exist" Apr 16 18:44:00.507243 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:00.507200 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fccba01-cd11-4268-8e35-cbfaa221800d" path="/var/lib/kubelet/pods/8fccba01-cd11-4268-8e35-cbfaa221800d/volumes" Apr 16 18:44:00.998965 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:00.998925 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" podUID="133cc38b-b9bb-45f9-bec5-1e40151f4310" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8000/health\": dial tcp 10.134.0.44:8000: connect: connection refused" Apr 16 18:44:04.689202 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:04.689158 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs"] Apr 16 18:44:04.689699 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:04.689632 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8fccba01-cd11-4268-8e35-cbfaa221800d" containerName="main" Apr 16 18:44:04.689699 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:04.689650 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fccba01-cd11-4268-8e35-cbfaa221800d" containerName="main" Apr 16 18:44:04.689699 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:04.689659 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8fccba01-cd11-4268-8e35-cbfaa221800d" containerName="storage-initializer" Apr 16 18:44:04.689699 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:04.689665 2576 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="8fccba01-cd11-4268-8e35-cbfaa221800d" containerName="storage-initializer" Apr 16 18:44:04.689920 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:04.689737 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8fccba01-cd11-4268-8e35-cbfaa221800d" containerName="main" Apr 16 18:44:04.694888 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:04.694854 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs" Apr 16 18:44:04.697932 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:04.697908 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 16 18:44:04.706415 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:04.706334 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs"] Apr 16 18:44:04.762227 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:04.762178 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2l6r\" (UniqueName: \"kubernetes.io/projected/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-kube-api-access-m2l6r\") pod \"stop-feature-test-kserve-85568b7f4f-fbdfs\" (UID: \"1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs" Apr 16 18:44:04.762416 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:04.762238 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-model-cache\") pod \"stop-feature-test-kserve-85568b7f4f-fbdfs\" (UID: \"1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs" Apr 16 18:44:04.762416 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:04.762266 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-dshm\") pod \"stop-feature-test-kserve-85568b7f4f-fbdfs\" (UID: \"1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs" Apr 16 18:44:04.762416 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:04.762301 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-home\") pod \"stop-feature-test-kserve-85568b7f4f-fbdfs\" (UID: \"1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs" Apr 16 18:44:04.762416 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:04.762377 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-kserve-provision-location\") pod \"stop-feature-test-kserve-85568b7f4f-fbdfs\" (UID: \"1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs" Apr 16 18:44:04.762578 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:04.762437 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-tls-certs\") pod \"stop-feature-test-kserve-85568b7f4f-fbdfs\" (UID: \"1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs" Apr 16 18:44:04.863780 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:04.863730 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m2l6r\" (UniqueName: 
\"kubernetes.io/projected/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-kube-api-access-m2l6r\") pod \"stop-feature-test-kserve-85568b7f4f-fbdfs\" (UID: \"1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs" Apr 16 18:44:04.863978 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:04.863791 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-model-cache\") pod \"stop-feature-test-kserve-85568b7f4f-fbdfs\" (UID: \"1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs" Apr 16 18:44:04.863978 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:04.863821 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-dshm\") pod \"stop-feature-test-kserve-85568b7f4f-fbdfs\" (UID: \"1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs" Apr 16 18:44:04.863978 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:04.863850 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-home\") pod \"stop-feature-test-kserve-85568b7f4f-fbdfs\" (UID: \"1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs" Apr 16 18:44:04.863978 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:04.863892 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-kserve-provision-location\") pod \"stop-feature-test-kserve-85568b7f4f-fbdfs\" (UID: \"1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f\") " 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs" Apr 16 18:44:04.863978 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:04.863948 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-tls-certs\") pod \"stop-feature-test-kserve-85568b7f4f-fbdfs\" (UID: \"1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs" Apr 16 18:44:04.864297 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:04.864270 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-model-cache\") pod \"stop-feature-test-kserve-85568b7f4f-fbdfs\" (UID: \"1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs" Apr 16 18:44:04.864365 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:04.864272 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-home\") pod \"stop-feature-test-kserve-85568b7f4f-fbdfs\" (UID: \"1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs" Apr 16 18:44:04.864416 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:04.864359 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-kserve-provision-location\") pod \"stop-feature-test-kserve-85568b7f4f-fbdfs\" (UID: \"1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs" Apr 16 18:44:04.866393 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:04.866370 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-dshm\") pod \"stop-feature-test-kserve-85568b7f4f-fbdfs\" (UID: \"1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs" Apr 16 18:44:04.866666 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:04.866643 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-tls-certs\") pod \"stop-feature-test-kserve-85568b7f4f-fbdfs\" (UID: \"1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs" Apr 16 18:44:04.871711 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:04.871684 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2l6r\" (UniqueName: \"kubernetes.io/projected/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-kube-api-access-m2l6r\") pod \"stop-feature-test-kserve-85568b7f4f-fbdfs\" (UID: \"1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs" Apr 16 18:44:05.009466 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:05.009149 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs" Apr 16 18:44:05.009633 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:05.009596 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c"] Apr 16 18:44:05.014841 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:05.014816 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c" Apr 16 18:44:05.017824 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:05.017797 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-nbxcx\"" Apr 16 18:44:05.025060 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:05.025027 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c"] Apr 16 18:44:05.066924 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:05.066892 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/91c99df0-8354-46fd-b778-fc6b37fcd877-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c\" (UID: \"91c99df0-8354-46fd-b778-fc6b37fcd877\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c" Apr 16 18:44:05.067150 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:05.067129 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/91c99df0-8354-46fd-b778-fc6b37fcd877-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c\" (UID: \"91c99df0-8354-46fd-b778-fc6b37fcd877\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c" Apr 16 18:44:05.067330 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:05.067308 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/91c99df0-8354-46fd-b778-fc6b37fcd877-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c\" (UID: \"91c99df0-8354-46fd-b778-fc6b37fcd877\") " 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c" Apr 16 18:44:05.067477 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:05.067459 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsxsv\" (UniqueName: \"kubernetes.io/projected/91c99df0-8354-46fd-b778-fc6b37fcd877-kube-api-access-nsxsv\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c\" (UID: \"91c99df0-8354-46fd-b778-fc6b37fcd877\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c" Apr 16 18:44:05.067638 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:05.067619 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/91c99df0-8354-46fd-b778-fc6b37fcd877-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c\" (UID: \"91c99df0-8354-46fd-b778-fc6b37fcd877\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c" Apr 16 18:44:05.067813 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:05.067792 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/91c99df0-8354-46fd-b778-fc6b37fcd877-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c\" (UID: \"91c99df0-8354-46fd-b778-fc6b37fcd877\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c" Apr 16 18:44:05.160039 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:05.160006 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs"] Apr 16 18:44:05.162093 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:44:05.162062 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1407a00a_ee0d_4dbd_a4ef_266fa7c7fc8f.slice/crio-be1dc19d3f27df60fae2d995114421cd36c5ea3301db8d7cade6fdc59bf46a8c WatchSource:0}: Error finding container be1dc19d3f27df60fae2d995114421cd36c5ea3301db8d7cade6fdc59bf46a8c: Status 404 returned error can't find the container with id be1dc19d3f27df60fae2d995114421cd36c5ea3301db8d7cade6fdc59bf46a8c Apr 16 18:44:05.168776 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:05.168727 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/91c99df0-8354-46fd-b778-fc6b37fcd877-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c\" (UID: \"91c99df0-8354-46fd-b778-fc6b37fcd877\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c" Apr 16 18:44:05.168874 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:05.168818 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/91c99df0-8354-46fd-b778-fc6b37fcd877-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c\" (UID: \"91c99df0-8354-46fd-b778-fc6b37fcd877\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c" Apr 16 18:44:05.168874 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:05.168845 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/91c99df0-8354-46fd-b778-fc6b37fcd877-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c\" (UID: \"91c99df0-8354-46fd-b778-fc6b37fcd877\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c" Apr 16 18:44:05.168989 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:05.168882 2576 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/91c99df0-8354-46fd-b778-fc6b37fcd877-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c\" (UID: \"91c99df0-8354-46fd-b778-fc6b37fcd877\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c" Apr 16 18:44:05.168989 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:05.168914 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nsxsv\" (UniqueName: \"kubernetes.io/projected/91c99df0-8354-46fd-b778-fc6b37fcd877-kube-api-access-nsxsv\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c\" (UID: \"91c99df0-8354-46fd-b778-fc6b37fcd877\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c" Apr 16 18:44:05.168989 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:05.168954 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/91c99df0-8354-46fd-b778-fc6b37fcd877-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c\" (UID: \"91c99df0-8354-46fd-b778-fc6b37fcd877\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c" Apr 16 18:44:05.169173 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:05.169152 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/91c99df0-8354-46fd-b778-fc6b37fcd877-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c\" (UID: \"91c99df0-8354-46fd-b778-fc6b37fcd877\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c" Apr 16 18:44:05.169236 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:05.169195 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/91c99df0-8354-46fd-b778-fc6b37fcd877-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c\" (UID: \"91c99df0-8354-46fd-b778-fc6b37fcd877\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c" Apr 16 18:44:05.169236 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:05.169191 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/91c99df0-8354-46fd-b778-fc6b37fcd877-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c\" (UID: \"91c99df0-8354-46fd-b778-fc6b37fcd877\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c" Apr 16 18:44:05.169523 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:05.169503 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/91c99df0-8354-46fd-b778-fc6b37fcd877-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c\" (UID: \"91c99df0-8354-46fd-b778-fc6b37fcd877\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c" Apr 16 18:44:05.171490 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:05.171466 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/91c99df0-8354-46fd-b778-fc6b37fcd877-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c\" (UID: \"91c99df0-8354-46fd-b778-fc6b37fcd877\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c" Apr 16 18:44:05.181187 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:05.181166 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsxsv\" (UniqueName: \"kubernetes.io/projected/91c99df0-8354-46fd-b778-fc6b37fcd877-kube-api-access-nsxsv\") pod 
\"stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c\" (UID: \"91c99df0-8354-46fd-b778-fc6b37fcd877\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c" Apr 16 18:44:05.348603 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:05.348497 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c" Apr 16 18:44:05.485915 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:05.485878 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs" event={"ID":"1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f","Type":"ContainerStarted","Data":"fea347aad9629974d7918f06a6d6c39510382f806552804a4b1187dcec6363e4"} Apr 16 18:44:05.486111 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:05.485925 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs" event={"ID":"1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f","Type":"ContainerStarted","Data":"be1dc19d3f27df60fae2d995114421cd36c5ea3301db8d7cade6fdc59bf46a8c"} Apr 16 18:44:05.495526 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:05.495499 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c"] Apr 16 18:44:05.497478 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:44:05.497447 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91c99df0_8354_46fd_b778_fc6b37fcd877.slice/crio-e5e32d3899022f4bb77593f22f8ee9b503e4bd24749aa7cc9c316dcf9d9f6899 WatchSource:0}: Error finding container e5e32d3899022f4bb77593f22f8ee9b503e4bd24749aa7cc9c316dcf9d9f6899: Status 404 returned error can't find the container with id e5e32d3899022f4bb77593f22f8ee9b503e4bd24749aa7cc9c316dcf9d9f6899 Apr 16 18:44:06.492529 ip-10-0-140-154 kubenswrapper[2576]: I0416 
18:44:06.492485 2576 generic.go:358] "Generic (PLEG): container finished" podID="91c99df0-8354-46fd-b778-fc6b37fcd877" containerID="b792daeaf35af591e403ad5f6f96db685eb1c4cdff6f69a4d45dc4bcc1d9bbd6" exitCode=0 Apr 16 18:44:06.493232 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:06.492616 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c" event={"ID":"91c99df0-8354-46fd-b778-fc6b37fcd877","Type":"ContainerDied","Data":"b792daeaf35af591e403ad5f6f96db685eb1c4cdff6f69a4d45dc4bcc1d9bbd6"} Apr 16 18:44:06.493232 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:06.492663 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c" event={"ID":"91c99df0-8354-46fd-b778-fc6b37fcd877","Type":"ContainerStarted","Data":"e5e32d3899022f4bb77593f22f8ee9b503e4bd24749aa7cc9c316dcf9d9f6899"} Apr 16 18:44:07.498423 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:07.498378 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c" event={"ID":"91c99df0-8354-46fd-b778-fc6b37fcd877","Type":"ContainerStarted","Data":"06203ae3f453099da4bff9a441de3784bea8fa07623992b76808d2b286cb4075"} Apr 16 18:44:07.498423 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:07.498430 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c" event={"ID":"91c99df0-8354-46fd-b778-fc6b37fcd877","Type":"ContainerStarted","Data":"bf77f7bd98a582ac6d255476c65928ac26e0fb170dafa48d5c13137a4f38e9dc"} Apr 16 18:44:07.499029 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:07.498497 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c" Apr 16 18:44:07.522670 ip-10-0-140-154 
kubenswrapper[2576]: I0416 18:44:07.522624 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c" podStartSLOduration=3.522605032 podStartE2EDuration="3.522605032s" podCreationTimestamp="2026-04-16 18:44:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:44:07.521021372 +0000 UTC m=+773.645549037" watchObservedRunningTime="2026-04-16 18:44:07.522605032 +0000 UTC m=+773.647132697" Apr 16 18:44:10.515159 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:10.515124 2576 generic.go:358] "Generic (PLEG): container finished" podID="1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f" containerID="fea347aad9629974d7918f06a6d6c39510382f806552804a4b1187dcec6363e4" exitCode=0 Apr 16 18:44:10.515545 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:10.515199 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs" event={"ID":"1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f","Type":"ContainerDied","Data":"fea347aad9629974d7918f06a6d6c39510382f806552804a4b1187dcec6363e4"} Apr 16 18:44:10.998735 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:10.998680 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" podUID="133cc38b-b9bb-45f9-bec5-1e40151f4310" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8000/health\": dial tcp 10.134.0.44:8000: connect: connection refused" Apr 16 18:44:11.522851 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:11.522814 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs" event={"ID":"1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f","Type":"ContainerStarted","Data":"af02c6daa4c715ee59410fe05c023d1ad8bb4942e541231bf3068d6797eed757"} Apr 16 
18:44:11.550888 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:11.550815 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs" podStartSLOduration=7.550792507 podStartE2EDuration="7.550792507s" podCreationTimestamp="2026-04-16 18:44:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:44:11.545266899 +0000 UTC m=+777.669794562" watchObservedRunningTime="2026-04-16 18:44:11.550792507 +0000 UTC m=+777.675320171" Apr 16 18:44:15.010377 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:15.010332 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs" Apr 16 18:44:15.010817 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:15.010392 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs" Apr 16 18:44:15.011849 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:15.011805 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs" podUID="1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 16 18:44:15.349684 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:15.349578 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c" Apr 16 18:44:15.349684 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:15.349633 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c" Apr 16 18:44:15.352465 ip-10-0-140-154 kubenswrapper[2576]: 
I0416 18:44:15.352440 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c" Apr 16 18:44:15.544729 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:15.544694 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c" Apr 16 18:44:20.998520 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:20.998472 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" podUID="133cc38b-b9bb-45f9-bec5-1e40151f4310" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8000/health\": dial tcp 10.134.0.44:8000: connect: connection refused" Apr 16 18:44:25.010358 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:25.010313 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs" podUID="1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 16 18:44:30.998925 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:30.998869 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" podUID="133cc38b-b9bb-45f9-bec5-1e40151f4310" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8000/health\": dial tcp 10.134.0.44:8000: connect: connection refused" Apr 16 18:44:35.010487 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:35.010388 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs" podUID="1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": 
dial tcp 10.134.0.46:8000: connect: connection refused" Apr 16 18:44:36.549348 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:36.549312 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c" Apr 16 18:44:40.998048 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:40.998001 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" podUID="133cc38b-b9bb-45f9-bec5-1e40151f4310" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8000/health\": dial tcp 10.134.0.44:8000: connect: connection refused" Apr 16 18:44:45.010260 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:45.010212 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs" podUID="1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 16 18:44:51.007790 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:51.007728 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" Apr 16 18:44:51.015469 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:51.015441 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" Apr 16 18:44:55.010386 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:55.010338 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs" podUID="1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection 
refused" Apr 16 18:44:57.373600 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:57.373563 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx"] Apr 16 18:44:57.374054 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:44:57.373872 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" podUID="133cc38b-b9bb-45f9-bec5-1e40151f4310" containerName="main" containerID="cri-o://8772b667dec6ebbd059008bb1083dfa2c289361329fe83c55ccda7507f926298" gracePeriod=30 Apr 16 18:45:05.010495 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:05.010437 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs" podUID="1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 16 18:45:09.169720 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:09.169681 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm"] Apr 16 18:45:09.173682 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:09.173654 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" Apr 16 18:45:09.176490 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:09.176426 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\"" Apr 16 18:45:09.195328 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:09.195291 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm"] Apr 16 18:45:09.274305 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:09.274265 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/26ce4724-d491-43c5-8c98-df0fd25c7359-model-cache\") pod \"custom-route-timeout-test-kserve-55c77bfb77-t4ltm\" (UID: \"26ce4724-d491-43c5-8c98-df0fd25c7359\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" Apr 16 18:45:09.274499 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:09.274320 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/26ce4724-d491-43c5-8c98-df0fd25c7359-home\") pod \"custom-route-timeout-test-kserve-55c77bfb77-t4ltm\" (UID: \"26ce4724-d491-43c5-8c98-df0fd25c7359\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" Apr 16 18:45:09.274499 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:09.274354 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txkxv\" (UniqueName: \"kubernetes.io/projected/26ce4724-d491-43c5-8c98-df0fd25c7359-kube-api-access-txkxv\") pod \"custom-route-timeout-test-kserve-55c77bfb77-t4ltm\" (UID: \"26ce4724-d491-43c5-8c98-df0fd25c7359\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" Apr 16 18:45:09.274499 
ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:09.274398 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/26ce4724-d491-43c5-8c98-df0fd25c7359-tls-certs\") pod \"custom-route-timeout-test-kserve-55c77bfb77-t4ltm\" (UID: \"26ce4724-d491-43c5-8c98-df0fd25c7359\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" Apr 16 18:45:09.274499 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:09.274437 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/26ce4724-d491-43c5-8c98-df0fd25c7359-dshm\") pod \"custom-route-timeout-test-kserve-55c77bfb77-t4ltm\" (UID: \"26ce4724-d491-43c5-8c98-df0fd25c7359\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" Apr 16 18:45:09.274499 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:09.274488 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/26ce4724-d491-43c5-8c98-df0fd25c7359-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-55c77bfb77-t4ltm\" (UID: \"26ce4724-d491-43c5-8c98-df0fd25c7359\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" Apr 16 18:45:09.375669 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:09.375629 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/26ce4724-d491-43c5-8c98-df0fd25c7359-model-cache\") pod \"custom-route-timeout-test-kserve-55c77bfb77-t4ltm\" (UID: \"26ce4724-d491-43c5-8c98-df0fd25c7359\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" Apr 16 18:45:09.375883 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:09.375683 2576 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/26ce4724-d491-43c5-8c98-df0fd25c7359-home\") pod \"custom-route-timeout-test-kserve-55c77bfb77-t4ltm\" (UID: \"26ce4724-d491-43c5-8c98-df0fd25c7359\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" Apr 16 18:45:09.375883 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:09.375708 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-txkxv\" (UniqueName: \"kubernetes.io/projected/26ce4724-d491-43c5-8c98-df0fd25c7359-kube-api-access-txkxv\") pod \"custom-route-timeout-test-kserve-55c77bfb77-t4ltm\" (UID: \"26ce4724-d491-43c5-8c98-df0fd25c7359\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" Apr 16 18:45:09.375883 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:09.375770 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/26ce4724-d491-43c5-8c98-df0fd25c7359-tls-certs\") pod \"custom-route-timeout-test-kserve-55c77bfb77-t4ltm\" (UID: \"26ce4724-d491-43c5-8c98-df0fd25c7359\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" Apr 16 18:45:09.375883 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:09.375805 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/26ce4724-d491-43c5-8c98-df0fd25c7359-dshm\") pod \"custom-route-timeout-test-kserve-55c77bfb77-t4ltm\" (UID: \"26ce4724-d491-43c5-8c98-df0fd25c7359\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" Apr 16 18:45:09.375883 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:09.375866 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/26ce4724-d491-43c5-8c98-df0fd25c7359-kserve-provision-location\") pod 
\"custom-route-timeout-test-kserve-55c77bfb77-t4ltm\" (UID: \"26ce4724-d491-43c5-8c98-df0fd25c7359\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" Apr 16 18:45:09.376165 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:09.376126 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/26ce4724-d491-43c5-8c98-df0fd25c7359-model-cache\") pod \"custom-route-timeout-test-kserve-55c77bfb77-t4ltm\" (UID: \"26ce4724-d491-43c5-8c98-df0fd25c7359\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" Apr 16 18:45:09.376222 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:09.376172 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/26ce4724-d491-43c5-8c98-df0fd25c7359-home\") pod \"custom-route-timeout-test-kserve-55c77bfb77-t4ltm\" (UID: \"26ce4724-d491-43c5-8c98-df0fd25c7359\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" Apr 16 18:45:09.376269 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:09.376225 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/26ce4724-d491-43c5-8c98-df0fd25c7359-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-55c77bfb77-t4ltm\" (UID: \"26ce4724-d491-43c5-8c98-df0fd25c7359\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" Apr 16 18:45:09.380839 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:09.378848 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/26ce4724-d491-43c5-8c98-df0fd25c7359-dshm\") pod \"custom-route-timeout-test-kserve-55c77bfb77-t4ltm\" (UID: \"26ce4724-d491-43c5-8c98-df0fd25c7359\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" Apr 16 18:45:09.380839 ip-10-0-140-154 
kubenswrapper[2576]: I0416 18:45:09.378869 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/26ce4724-d491-43c5-8c98-df0fd25c7359-tls-certs\") pod \"custom-route-timeout-test-kserve-55c77bfb77-t4ltm\" (UID: \"26ce4724-d491-43c5-8c98-df0fd25c7359\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" Apr 16 18:45:09.385885 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:09.385848 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-txkxv\" (UniqueName: \"kubernetes.io/projected/26ce4724-d491-43c5-8c98-df0fd25c7359-kube-api-access-txkxv\") pod \"custom-route-timeout-test-kserve-55c77bfb77-t4ltm\" (UID: \"26ce4724-d491-43c5-8c98-df0fd25c7359\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" Apr 16 18:45:09.486464 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:09.486414 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" Apr 16 18:45:09.633729 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:09.633697 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm"] Apr 16 18:45:09.636266 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:45:09.636235 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26ce4724_d491_43c5_8c98_df0fd25c7359.slice/crio-d26bc1341c43c6fc49c3fcdf6cafdce9843a007cd0d2be60011c01ed59cf962f WatchSource:0}: Error finding container d26bc1341c43c6fc49c3fcdf6cafdce9843a007cd0d2be60011c01ed59cf962f: Status 404 returned error can't find the container with id d26bc1341c43c6fc49c3fcdf6cafdce9843a007cd0d2be60011c01ed59cf962f Apr 16 18:45:09.784510 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:09.784411 2576 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" event={"ID":"26ce4724-d491-43c5-8c98-df0fd25c7359","Type":"ContainerStarted","Data":"c62c8b9351df5bcf549f77031abfcc843ce60eff6a71c1d00c9a8b086547dc45"} Apr 16 18:45:09.784510 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:09.784452 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" event={"ID":"26ce4724-d491-43c5-8c98-df0fd25c7359","Type":"ContainerStarted","Data":"d26bc1341c43c6fc49c3fcdf6cafdce9843a007cd0d2be60011c01ed59cf962f"} Apr 16 18:45:14.808459 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:14.808421 2576 generic.go:358] "Generic (PLEG): container finished" podID="26ce4724-d491-43c5-8c98-df0fd25c7359" containerID="c62c8b9351df5bcf549f77031abfcc843ce60eff6a71c1d00c9a8b086547dc45" exitCode=0 Apr 16 18:45:14.808885 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:14.808492 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" event={"ID":"26ce4724-d491-43c5-8c98-df0fd25c7359","Type":"ContainerDied","Data":"c62c8b9351df5bcf549f77031abfcc843ce60eff6a71c1d00c9a8b086547dc45"} Apr 16 18:45:15.010571 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:15.010524 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs" podUID="1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 16 18:45:15.815943 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:15.815898 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" event={"ID":"26ce4724-d491-43c5-8c98-df0fd25c7359","Type":"ContainerStarted","Data":"802668de23ee0a8564f4e6e5fcd9a5c62d4becc4ea630487addafeef72bf5668"} Apr 
16 18:45:15.841422 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:15.841344 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" podStartSLOduration=6.841321979 podStartE2EDuration="6.841321979s" podCreationTimestamp="2026-04-16 18:45:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:45:15.838039303 +0000 UTC m=+841.962566968" watchObservedRunningTime="2026-04-16 18:45:15.841321979 +0000 UTC m=+841.965849644" Apr 16 18:45:19.487025 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:19.486963 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" Apr 16 18:45:19.487025 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:19.487019 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" Apr 16 18:45:19.488628 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:19.488594 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" podUID="26ce4724-d491-43c5-8c98-df0fd25c7359" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused" Apr 16 18:45:25.010428 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:25.010374 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs" podUID="1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 16 18:45:27.858810 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:27.858780 2576 log.go:25] "Finished 
parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx_133cc38b-b9bb-45f9-bec5-1e40151f4310/main/0.log" Apr 16 18:45:27.859209 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:27.859194 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" Apr 16 18:45:27.867793 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:27.867767 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx_133cc38b-b9bb-45f9-bec5-1e40151f4310/main/0.log" Apr 16 18:45:27.868164 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:27.868140 2576 generic.go:358] "Generic (PLEG): container finished" podID="133cc38b-b9bb-45f9-bec5-1e40151f4310" containerID="8772b667dec6ebbd059008bb1083dfa2c289361329fe83c55ccda7507f926298" exitCode=137 Apr 16 18:45:27.868280 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:27.868176 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" event={"ID":"133cc38b-b9bb-45f9-bec5-1e40151f4310","Type":"ContainerDied","Data":"8772b667dec6ebbd059008bb1083dfa2c289361329fe83c55ccda7507f926298"} Apr 16 18:45:27.868280 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:27.868205 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" event={"ID":"133cc38b-b9bb-45f9-bec5-1e40151f4310","Type":"ContainerDied","Data":"b642cc6af030edb2b5caef53d4a27b647e68335b0a115b6fc82c488f9d472464"} Apr 16 18:45:27.868280 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:27.868222 2576 scope.go:117] "RemoveContainer" containerID="8772b667dec6ebbd059008bb1083dfa2c289361329fe83c55ccda7507f926298" Apr 16 18:45:27.868454 ip-10-0-140-154 kubenswrapper[2576]: I0416 
18:45:27.868290 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx" Apr 16 18:45:27.897547 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:27.897521 2576 scope.go:117] "RemoveContainer" containerID="7d7a524a14d28ed17513cb93c3b845f377b65c4eba84654ded98ab8ecfe191db" Apr 16 18:45:27.944451 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:27.944405 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/133cc38b-b9bb-45f9-bec5-1e40151f4310-model-cache\") pod \"133cc38b-b9bb-45f9-bec5-1e40151f4310\" (UID: \"133cc38b-b9bb-45f9-bec5-1e40151f4310\") " Apr 16 18:45:27.944652 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:27.944496 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/133cc38b-b9bb-45f9-bec5-1e40151f4310-dshm\") pod \"133cc38b-b9bb-45f9-bec5-1e40151f4310\" (UID: \"133cc38b-b9bb-45f9-bec5-1e40151f4310\") " Apr 16 18:45:27.944652 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:27.944576 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/133cc38b-b9bb-45f9-bec5-1e40151f4310-tls-certs\") pod \"133cc38b-b9bb-45f9-bec5-1e40151f4310\" (UID: \"133cc38b-b9bb-45f9-bec5-1e40151f4310\") " Apr 16 18:45:27.944652 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:27.944634 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/133cc38b-b9bb-45f9-bec5-1e40151f4310-kserve-provision-location\") pod \"133cc38b-b9bb-45f9-bec5-1e40151f4310\" (UID: \"133cc38b-b9bb-45f9-bec5-1e40151f4310\") " Apr 16 18:45:27.944859 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:27.944678 2576 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/133cc38b-b9bb-45f9-bec5-1e40151f4310-home\") pod \"133cc38b-b9bb-45f9-bec5-1e40151f4310\" (UID: \"133cc38b-b9bb-45f9-bec5-1e40151f4310\") " Apr 16 18:45:27.944859 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:27.944704 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrrwr\" (UniqueName: \"kubernetes.io/projected/133cc38b-b9bb-45f9-bec5-1e40151f4310-kube-api-access-qrrwr\") pod \"133cc38b-b9bb-45f9-bec5-1e40151f4310\" (UID: \"133cc38b-b9bb-45f9-bec5-1e40151f4310\") " Apr 16 18:45:27.944859 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:27.944698 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/133cc38b-b9bb-45f9-bec5-1e40151f4310-model-cache" (OuterVolumeSpecName: "model-cache") pod "133cc38b-b9bb-45f9-bec5-1e40151f4310" (UID: "133cc38b-b9bb-45f9-bec5-1e40151f4310"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:45:27.945026 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:27.944963 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/133cc38b-b9bb-45f9-bec5-1e40151f4310-model-cache\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:45:27.945082 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:27.945024 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/133cc38b-b9bb-45f9-bec5-1e40151f4310-home" (OuterVolumeSpecName: "home") pod "133cc38b-b9bb-45f9-bec5-1e40151f4310" (UID: "133cc38b-b9bb-45f9-bec5-1e40151f4310"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:45:27.947814 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:27.947765 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/133cc38b-b9bb-45f9-bec5-1e40151f4310-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "133cc38b-b9bb-45f9-bec5-1e40151f4310" (UID: "133cc38b-b9bb-45f9-bec5-1e40151f4310"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:45:27.955114 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:27.955053 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/133cc38b-b9bb-45f9-bec5-1e40151f4310-kube-api-access-qrrwr" (OuterVolumeSpecName: "kube-api-access-qrrwr") pod "133cc38b-b9bb-45f9-bec5-1e40151f4310" (UID: "133cc38b-b9bb-45f9-bec5-1e40151f4310"). InnerVolumeSpecName "kube-api-access-qrrwr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:45:27.956472 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:27.955773 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/133cc38b-b9bb-45f9-bec5-1e40151f4310-dshm" (OuterVolumeSpecName: "dshm") pod "133cc38b-b9bb-45f9-bec5-1e40151f4310" (UID: "133cc38b-b9bb-45f9-bec5-1e40151f4310"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:45:27.969995 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:27.969967 2576 scope.go:117] "RemoveContainer" containerID="8772b667dec6ebbd059008bb1083dfa2c289361329fe83c55ccda7507f926298" Apr 16 18:45:27.970432 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:45:27.970407 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8772b667dec6ebbd059008bb1083dfa2c289361329fe83c55ccda7507f926298\": container with ID starting with 8772b667dec6ebbd059008bb1083dfa2c289361329fe83c55ccda7507f926298 not found: ID does not exist" containerID="8772b667dec6ebbd059008bb1083dfa2c289361329fe83c55ccda7507f926298" Apr 16 18:45:27.970509 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:27.970445 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8772b667dec6ebbd059008bb1083dfa2c289361329fe83c55ccda7507f926298"} err="failed to get container status \"8772b667dec6ebbd059008bb1083dfa2c289361329fe83c55ccda7507f926298\": rpc error: code = NotFound desc = could not find container \"8772b667dec6ebbd059008bb1083dfa2c289361329fe83c55ccda7507f926298\": container with ID starting with 8772b667dec6ebbd059008bb1083dfa2c289361329fe83c55ccda7507f926298 not found: ID does not exist" Apr 16 18:45:27.970509 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:27.970473 2576 scope.go:117] "RemoveContainer" containerID="7d7a524a14d28ed17513cb93c3b845f377b65c4eba84654ded98ab8ecfe191db" Apr 16 18:45:27.970831 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:45:27.970801 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d7a524a14d28ed17513cb93c3b845f377b65c4eba84654ded98ab8ecfe191db\": container with ID starting with 7d7a524a14d28ed17513cb93c3b845f377b65c4eba84654ded98ab8ecfe191db not found: ID does not exist" 
containerID="7d7a524a14d28ed17513cb93c3b845f377b65c4eba84654ded98ab8ecfe191db" Apr 16 18:45:27.970953 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:27.970838 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d7a524a14d28ed17513cb93c3b845f377b65c4eba84654ded98ab8ecfe191db"} err="failed to get container status \"7d7a524a14d28ed17513cb93c3b845f377b65c4eba84654ded98ab8ecfe191db\": rpc error: code = NotFound desc = could not find container \"7d7a524a14d28ed17513cb93c3b845f377b65c4eba84654ded98ab8ecfe191db\": container with ID starting with 7d7a524a14d28ed17513cb93c3b845f377b65c4eba84654ded98ab8ecfe191db not found: ID does not exist" Apr 16 18:45:28.005731 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:28.005686 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/133cc38b-b9bb-45f9-bec5-1e40151f4310-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "133cc38b-b9bb-45f9-bec5-1e40151f4310" (UID: "133cc38b-b9bb-45f9-bec5-1e40151f4310"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:45:28.045446 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:28.045404 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/133cc38b-b9bb-45f9-bec5-1e40151f4310-dshm\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:45:28.045446 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:28.045435 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/133cc38b-b9bb-45f9-bec5-1e40151f4310-tls-certs\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:45:28.045446 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:28.045446 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/133cc38b-b9bb-45f9-bec5-1e40151f4310-kserve-provision-location\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:45:28.045446 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:28.045457 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/133cc38b-b9bb-45f9-bec5-1e40151f4310-home\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:45:28.045864 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:28.045468 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qrrwr\" (UniqueName: \"kubernetes.io/projected/133cc38b-b9bb-45f9-bec5-1e40151f4310-kube-api-access-qrrwr\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:45:28.194834 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:28.194792 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx"] Apr 16 18:45:28.200878 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:28.200849 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b8bf57799-m6zmx"] Apr 16 18:45:28.507096 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:28.507063 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="133cc38b-b9bb-45f9-bec5-1e40151f4310" path="/var/lib/kubelet/pods/133cc38b-b9bb-45f9-bec5-1e40151f4310/volumes" Apr 16 18:45:29.487275 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:29.487227 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" podUID="26ce4724-d491-43c5-8c98-df0fd25c7359" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused" Apr 16 18:45:35.010769 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:35.010694 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs" podUID="1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 16 18:45:39.487681 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:39.487621 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" podUID="26ce4724-d491-43c5-8c98-df0fd25c7359" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused" Apr 16 18:45:45.010890 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:45.010838 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs" podUID="1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 16 18:45:49.487186 
ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:49.487131 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" podUID="26ce4724-d491-43c5-8c98-df0fd25c7359" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused" Apr 16 18:45:55.020432 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:55.020397 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs" Apr 16 18:45:55.028334 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:55.028304 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs" Apr 16 18:45:56.107142 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:45:56.107096 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 16 18:45:56.107690 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:45:56.107199 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-tls-certs podName:1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f nodeName:}" failed. No retries permitted until 2026-04-16 18:45:56.607178744 +0000 UTC m=+882.731706396 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-tls-certs") pod "stop-feature-test-kserve-85568b7f4f-fbdfs" (UID: "1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 16 18:45:56.153125 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:56.153087 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs"] Apr 16 18:45:56.433613 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:56.433519 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c"] Apr 16 18:45:56.433886 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:56.433854 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c" podUID="91c99df0-8354-46fd-b778-fc6b37fcd877" containerName="main" containerID="cri-o://bf77f7bd98a582ac6d255476c65928ac26e0fb170dafa48d5c13137a4f38e9dc" gracePeriod=30 Apr 16 18:45:56.434095 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:56.433887 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c" podUID="91c99df0-8354-46fd-b778-fc6b37fcd877" containerName="tokenizer" containerID="cri-o://06203ae3f453099da4bff9a441de3784bea8fa07623992b76808d2b286cb4075" gracePeriod=30 Apr 16 18:45:56.548601 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:45:56.548570 2576 logging.go:55] [core] [Channel #101 SubChannel #102]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.47:9003", ServerName: "10.134.0.47:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.47:9003: connect: connection refused" Apr 16 18:45:56.613117 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:45:56.613079 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 16 18:45:56.613316 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:45:56.613179 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-tls-certs podName:1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f nodeName:}" failed. No retries permitted until 2026-04-16 18:45:57.613159535 +0000 UTC m=+883.737687177 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-tls-certs") pod "stop-feature-test-kserve-85568b7f4f-fbdfs" (UID: "1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 16 18:45:57.010457 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:57.010419 2576 generic.go:358] "Generic (PLEG): container finished" podID="91c99df0-8354-46fd-b778-fc6b37fcd877" containerID="bf77f7bd98a582ac6d255476c65928ac26e0fb170dafa48d5c13137a4f38e9dc" exitCode=0 Apr 16 18:45:57.010685 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:57.010496 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c" event={"ID":"91c99df0-8354-46fd-b778-fc6b37fcd877","Type":"ContainerDied","Data":"bf77f7bd98a582ac6d255476c65928ac26e0fb170dafa48d5c13137a4f38e9dc"} Apr 16 18:45:57.010804 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:57.010782 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs" podUID="1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f" containerName="main" 
containerID="cri-o://af02c6daa4c715ee59410fe05c023d1ad8bb4942e541231bf3068d6797eed757" gracePeriod=30 Apr 16 18:45:57.548731 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:57.548676 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c" podUID="91c99df0-8354-46fd-b778-fc6b37fcd877" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.47:9003\" within 1s: context deadline exceeded" Apr 16 18:45:57.621657 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:45:57.621622 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 16 18:45:57.621891 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:45:57.621717 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-tls-certs podName:1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f nodeName:}" failed. No retries permitted until 2026-04-16 18:45:59.621698843 +0000 UTC m=+885.746226487 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-tls-certs") pod "stop-feature-test-kserve-85568b7f4f-fbdfs" (UID: "1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 16 18:45:57.894343 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:57.894314 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c" Apr 16 18:45:57.922859 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:57.922824 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/91c99df0-8354-46fd-b778-fc6b37fcd877-kserve-provision-location\") pod \"91c99df0-8354-46fd-b778-fc6b37fcd877\" (UID: \"91c99df0-8354-46fd-b778-fc6b37fcd877\") " Apr 16 18:45:57.923045 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:57.922882 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsxsv\" (UniqueName: \"kubernetes.io/projected/91c99df0-8354-46fd-b778-fc6b37fcd877-kube-api-access-nsxsv\") pod \"91c99df0-8354-46fd-b778-fc6b37fcd877\" (UID: \"91c99df0-8354-46fd-b778-fc6b37fcd877\") " Apr 16 18:45:57.923045 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:57.922912 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/91c99df0-8354-46fd-b778-fc6b37fcd877-tokenizer-tmp\") pod \"91c99df0-8354-46fd-b778-fc6b37fcd877\" (UID: \"91c99df0-8354-46fd-b778-fc6b37fcd877\") " Apr 16 18:45:57.923045 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:57.922934 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/91c99df0-8354-46fd-b778-fc6b37fcd877-tokenizer-cache\") pod \"91c99df0-8354-46fd-b778-fc6b37fcd877\" (UID: \"91c99df0-8354-46fd-b778-fc6b37fcd877\") " Apr 16 18:45:57.923045 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:57.922973 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/91c99df0-8354-46fd-b778-fc6b37fcd877-tls-certs\") pod \"91c99df0-8354-46fd-b778-fc6b37fcd877\" (UID: \"91c99df0-8354-46fd-b778-fc6b37fcd877\") " 
Apr 16 18:45:57.923045 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:57.923027 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/91c99df0-8354-46fd-b778-fc6b37fcd877-tokenizer-uds\") pod \"91c99df0-8354-46fd-b778-fc6b37fcd877\" (UID: \"91c99df0-8354-46fd-b778-fc6b37fcd877\") " Apr 16 18:45:57.923318 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:57.923141 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91c99df0-8354-46fd-b778-fc6b37fcd877-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "91c99df0-8354-46fd-b778-fc6b37fcd877" (UID: "91c99df0-8354-46fd-b778-fc6b37fcd877"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:45:57.923318 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:57.923244 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91c99df0-8354-46fd-b778-fc6b37fcd877-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "91c99df0-8354-46fd-b778-fc6b37fcd877" (UID: "91c99df0-8354-46fd-b778-fc6b37fcd877"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:45:57.923318 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:57.923307 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/91c99df0-8354-46fd-b778-fc6b37fcd877-tokenizer-tmp\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:45:57.923478 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:57.923325 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/91c99df0-8354-46fd-b778-fc6b37fcd877-tokenizer-cache\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:45:57.923478 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:57.923357 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91c99df0-8354-46fd-b778-fc6b37fcd877-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "91c99df0-8354-46fd-b778-fc6b37fcd877" (UID: "91c99df0-8354-46fd-b778-fc6b37fcd877"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:45:57.923601 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:57.923587 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91c99df0-8354-46fd-b778-fc6b37fcd877-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "91c99df0-8354-46fd-b778-fc6b37fcd877" (UID: "91c99df0-8354-46fd-b778-fc6b37fcd877"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:45:57.925256 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:57.925226 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91c99df0-8354-46fd-b778-fc6b37fcd877-kube-api-access-nsxsv" (OuterVolumeSpecName: "kube-api-access-nsxsv") pod "91c99df0-8354-46fd-b778-fc6b37fcd877" (UID: "91c99df0-8354-46fd-b778-fc6b37fcd877"). InnerVolumeSpecName "kube-api-access-nsxsv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:45:57.925378 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:57.925270 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91c99df0-8354-46fd-b778-fc6b37fcd877-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "91c99df0-8354-46fd-b778-fc6b37fcd877" (UID: "91c99df0-8354-46fd-b778-fc6b37fcd877"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:45:58.016019 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:58.015988 2576 generic.go:358] "Generic (PLEG): container finished" podID="91c99df0-8354-46fd-b778-fc6b37fcd877" containerID="06203ae3f453099da4bff9a441de3784bea8fa07623992b76808d2b286cb4075" exitCode=0 Apr 16 18:45:58.016179 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:58.016072 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c" Apr 16 18:45:58.016179 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:58.016070 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c" event={"ID":"91c99df0-8354-46fd-b778-fc6b37fcd877","Type":"ContainerDied","Data":"06203ae3f453099da4bff9a441de3784bea8fa07623992b76808d2b286cb4075"} Apr 16 18:45:58.016179 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:58.016112 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c" event={"ID":"91c99df0-8354-46fd-b778-fc6b37fcd877","Type":"ContainerDied","Data":"e5e32d3899022f4bb77593f22f8ee9b503e4bd24749aa7cc9c316dcf9d9f6899"} Apr 16 18:45:58.016179 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:58.016137 2576 scope.go:117] "RemoveContainer" containerID="06203ae3f453099da4bff9a441de3784bea8fa07623992b76808d2b286cb4075" Apr 16 18:45:58.023992 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:58.023962 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nsxsv\" (UniqueName: \"kubernetes.io/projected/91c99df0-8354-46fd-b778-fc6b37fcd877-kube-api-access-nsxsv\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:45:58.023992 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:58.023994 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/91c99df0-8354-46fd-b778-fc6b37fcd877-tls-certs\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:45:58.024205 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:58.024007 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/91c99df0-8354-46fd-b778-fc6b37fcd877-tokenizer-uds\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 
18:45:58.024205 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:58.024017 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/91c99df0-8354-46fd-b778-fc6b37fcd877-kserve-provision-location\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:45:58.028964 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:58.028934 2576 scope.go:117] "RemoveContainer" containerID="bf77f7bd98a582ac6d255476c65928ac26e0fb170dafa48d5c13137a4f38e9dc" Apr 16 18:45:58.037861 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:58.037839 2576 scope.go:117] "RemoveContainer" containerID="b792daeaf35af591e403ad5f6f96db685eb1c4cdff6f69a4d45dc4bcc1d9bbd6" Apr 16 18:45:58.044338 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:58.044304 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c"] Apr 16 18:45:58.048867 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:58.048775 2576 scope.go:117] "RemoveContainer" containerID="06203ae3f453099da4bff9a441de3784bea8fa07623992b76808d2b286cb4075" Apr 16 18:45:58.049124 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:58.049102 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m9r5c"] Apr 16 18:45:58.049195 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:45:58.049124 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06203ae3f453099da4bff9a441de3784bea8fa07623992b76808d2b286cb4075\": container with ID starting with 06203ae3f453099da4bff9a441de3784bea8fa07623992b76808d2b286cb4075 not found: ID does not exist" containerID="06203ae3f453099da4bff9a441de3784bea8fa07623992b76808d2b286cb4075" Apr 16 18:45:58.049195 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:58.049159 2576 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"06203ae3f453099da4bff9a441de3784bea8fa07623992b76808d2b286cb4075"} err="failed to get container status \"06203ae3f453099da4bff9a441de3784bea8fa07623992b76808d2b286cb4075\": rpc error: code = NotFound desc = could not find container \"06203ae3f453099da4bff9a441de3784bea8fa07623992b76808d2b286cb4075\": container with ID starting with 06203ae3f453099da4bff9a441de3784bea8fa07623992b76808d2b286cb4075 not found: ID does not exist" Apr 16 18:45:58.049195 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:58.049189 2576 scope.go:117] "RemoveContainer" containerID="bf77f7bd98a582ac6d255476c65928ac26e0fb170dafa48d5c13137a4f38e9dc" Apr 16 18:45:58.049504 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:45:58.049487 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf77f7bd98a582ac6d255476c65928ac26e0fb170dafa48d5c13137a4f38e9dc\": container with ID starting with bf77f7bd98a582ac6d255476c65928ac26e0fb170dafa48d5c13137a4f38e9dc not found: ID does not exist" containerID="bf77f7bd98a582ac6d255476c65928ac26e0fb170dafa48d5c13137a4f38e9dc" Apr 16 18:45:58.049544 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:58.049510 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf77f7bd98a582ac6d255476c65928ac26e0fb170dafa48d5c13137a4f38e9dc"} err="failed to get container status \"bf77f7bd98a582ac6d255476c65928ac26e0fb170dafa48d5c13137a4f38e9dc\": rpc error: code = NotFound desc = could not find container \"bf77f7bd98a582ac6d255476c65928ac26e0fb170dafa48d5c13137a4f38e9dc\": container with ID starting with bf77f7bd98a582ac6d255476c65928ac26e0fb170dafa48d5c13137a4f38e9dc not found: ID does not exist" Apr 16 18:45:58.049544 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:58.049528 2576 scope.go:117] "RemoveContainer" containerID="b792daeaf35af591e403ad5f6f96db685eb1c4cdff6f69a4d45dc4bcc1d9bbd6" Apr 16 18:45:58.049785 ip-10-0-140-154 
kubenswrapper[2576]: E0416 18:45:58.049769 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b792daeaf35af591e403ad5f6f96db685eb1c4cdff6f69a4d45dc4bcc1d9bbd6\": container with ID starting with b792daeaf35af591e403ad5f6f96db685eb1c4cdff6f69a4d45dc4bcc1d9bbd6 not found: ID does not exist" containerID="b792daeaf35af591e403ad5f6f96db685eb1c4cdff6f69a4d45dc4bcc1d9bbd6"
Apr 16 18:45:58.049841 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:58.049791 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b792daeaf35af591e403ad5f6f96db685eb1c4cdff6f69a4d45dc4bcc1d9bbd6"} err="failed to get container status \"b792daeaf35af591e403ad5f6f96db685eb1c4cdff6f69a4d45dc4bcc1d9bbd6\": rpc error: code = NotFound desc = could not find container \"b792daeaf35af591e403ad5f6f96db685eb1c4cdff6f69a4d45dc4bcc1d9bbd6\": container with ID starting with b792daeaf35af591e403ad5f6f96db685eb1c4cdff6f69a4d45dc4bcc1d9bbd6 not found: ID does not exist"
Apr 16 18:45:58.506541 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:58.506512 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91c99df0-8354-46fd-b778-fc6b37fcd877" path="/var/lib/kubelet/pods/91c99df0-8354-46fd-b778-fc6b37fcd877/volumes"
Apr 16 18:45:59.487470 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:45:59.487426 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" podUID="26ce4724-d491-43c5-8c98-df0fd25c7359" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused"
Apr 16 18:45:59.636504 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:45:59.636472 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found
Apr 16
18:45:59.636692 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:45:59.636557 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-tls-certs podName:1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f nodeName:}" failed. No retries permitted until 2026-04-16 18:46:03.63653329 +0000 UTC m=+889.761060935 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-tls-certs") pod "stop-feature-test-kserve-85568b7f4f-fbdfs" (UID: "1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f") : secret "stop-feature-test-kserve-self-signed-certs" not found
Apr 16 18:46:03.671050 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:46:03.670966 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found
Apr 16 18:46:03.671050 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:46:03.671043 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-tls-certs podName:1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f nodeName:}" failed. No retries permitted until 2026-04-16 18:46:11.671023339 +0000 UTC m=+897.795550980 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-tls-certs") pod "stop-feature-test-kserve-85568b7f4f-fbdfs" (UID: "1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f") : secret "stop-feature-test-kserve-self-signed-certs" not found
Apr 16 18:46:05.487293 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:05.487257 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv"]
Apr 16 18:46:05.487808 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:05.487790 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="91c99df0-8354-46fd-b778-fc6b37fcd877" containerName="storage-initializer"
Apr 16 18:46:05.487907 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:05.487811 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="91c99df0-8354-46fd-b778-fc6b37fcd877" containerName="storage-initializer"
Apr 16 18:46:05.487907 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:05.487822 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="91c99df0-8354-46fd-b778-fc6b37fcd877" containerName="main"
Apr 16 18:46:05.487907 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:05.487831 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="91c99df0-8354-46fd-b778-fc6b37fcd877" containerName="main"
Apr 16 18:46:05.487907 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:05.487849 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="133cc38b-b9bb-45f9-bec5-1e40151f4310" containerName="main"
Apr 16 18:46:05.487907 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:05.487857 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="133cc38b-b9bb-45f9-bec5-1e40151f4310" containerName="main"
Apr 16 18:46:05.487907 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:05.487878 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container"
podUID="91c99df0-8354-46fd-b778-fc6b37fcd877" containerName="tokenizer"
Apr 16 18:46:05.487907 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:05.487887 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="91c99df0-8354-46fd-b778-fc6b37fcd877" containerName="tokenizer"
Apr 16 18:46:05.487907 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:05.487900 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="133cc38b-b9bb-45f9-bec5-1e40151f4310" containerName="storage-initializer"
Apr 16 18:46:05.487907 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:05.487908 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="133cc38b-b9bb-45f9-bec5-1e40151f4310" containerName="storage-initializer"
Apr 16 18:46:05.488378 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:05.488028 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="133cc38b-b9bb-45f9-bec5-1e40151f4310" containerName="main"
Apr 16 18:46:05.488378 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:05.488043 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="91c99df0-8354-46fd-b778-fc6b37fcd877" containerName="main"
Apr 16 18:46:05.488378 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:05.488053 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="91c99df0-8354-46fd-b778-fc6b37fcd877" containerName="tokenizer"
Apr 16 18:46:05.493102 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:05.493076 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv"
Apr 16 18:46:05.505576 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:05.505548 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv"]
Apr 16 18:46:05.588316 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:05.588279 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv8hc\" (UniqueName: \"kubernetes.io/projected/e964dec8-c773-44cd-88f7-2d82b12a10bf-kube-api-access-cv8hc\") pod \"stop-feature-test-kserve-85568b7f4f-jhlxv\" (UID: \"e964dec8-c773-44cd-88f7-2d82b12a10bf\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv"
Apr 16 18:46:05.588481 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:05.588447 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e964dec8-c773-44cd-88f7-2d82b12a10bf-tls-certs\") pod \"stop-feature-test-kserve-85568b7f4f-jhlxv\" (UID: \"e964dec8-c773-44cd-88f7-2d82b12a10bf\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv"
Apr 16 18:46:05.588583 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:05.588520 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e964dec8-c773-44cd-88f7-2d82b12a10bf-home\") pod \"stop-feature-test-kserve-85568b7f4f-jhlxv\" (UID: \"e964dec8-c773-44cd-88f7-2d82b12a10bf\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv"
Apr 16 18:46:05.588649 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:05.588581 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e964dec8-c773-44cd-88f7-2d82b12a10bf-kserve-provision-location\") pod
\"stop-feature-test-kserve-85568b7f4f-jhlxv\" (UID: \"e964dec8-c773-44cd-88f7-2d82b12a10bf\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv"
Apr 16 18:46:05.588649 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:05.588614 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e964dec8-c773-44cd-88f7-2d82b12a10bf-dshm\") pod \"stop-feature-test-kserve-85568b7f4f-jhlxv\" (UID: \"e964dec8-c773-44cd-88f7-2d82b12a10bf\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv"
Apr 16 18:46:05.588786 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:05.588644 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e964dec8-c773-44cd-88f7-2d82b12a10bf-model-cache\") pod \"stop-feature-test-kserve-85568b7f4f-jhlxv\" (UID: \"e964dec8-c773-44cd-88f7-2d82b12a10bf\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv"
Apr 16 18:46:05.690274 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:05.690226 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e964dec8-c773-44cd-88f7-2d82b12a10bf-tls-certs\") pod \"stop-feature-test-kserve-85568b7f4f-jhlxv\" (UID: \"e964dec8-c773-44cd-88f7-2d82b12a10bf\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv"
Apr 16 18:46:05.690452 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:05.690409 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e964dec8-c773-44cd-88f7-2d82b12a10bf-home\") pod \"stop-feature-test-kserve-85568b7f4f-jhlxv\" (UID: \"e964dec8-c773-44cd-88f7-2d82b12a10bf\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv"
Apr 16 18:46:05.690536 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:05.690451
2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e964dec8-c773-44cd-88f7-2d82b12a10bf-kserve-provision-location\") pod \"stop-feature-test-kserve-85568b7f4f-jhlxv\" (UID: \"e964dec8-c773-44cd-88f7-2d82b12a10bf\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv"
Apr 16 18:46:05.690536 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:05.690481 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e964dec8-c773-44cd-88f7-2d82b12a10bf-dshm\") pod \"stop-feature-test-kserve-85568b7f4f-jhlxv\" (UID: \"e964dec8-c773-44cd-88f7-2d82b12a10bf\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv"
Apr 16 18:46:05.690536 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:05.690507 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e964dec8-c773-44cd-88f7-2d82b12a10bf-model-cache\") pod \"stop-feature-test-kserve-85568b7f4f-jhlxv\" (UID: \"e964dec8-c773-44cd-88f7-2d82b12a10bf\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv"
Apr 16 18:46:05.690704 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:05.690593 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cv8hc\" (UniqueName: \"kubernetes.io/projected/e964dec8-c773-44cd-88f7-2d82b12a10bf-kube-api-access-cv8hc\") pod \"stop-feature-test-kserve-85568b7f4f-jhlxv\" (UID: \"e964dec8-c773-44cd-88f7-2d82b12a10bf\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv"
Apr 16 18:46:05.691067 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:05.691043 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e964dec8-c773-44cd-88f7-2d82b12a10bf-kserve-provision-location\") pod
\"stop-feature-test-kserve-85568b7f4f-jhlxv\" (UID: \"e964dec8-c773-44cd-88f7-2d82b12a10bf\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv"
Apr 16 18:46:05.691261 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:05.691077 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e964dec8-c773-44cd-88f7-2d82b12a10bf-home\") pod \"stop-feature-test-kserve-85568b7f4f-jhlxv\" (UID: \"e964dec8-c773-44cd-88f7-2d82b12a10bf\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv"
Apr 16 18:46:05.691261 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:05.691118 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e964dec8-c773-44cd-88f7-2d82b12a10bf-model-cache\") pod \"stop-feature-test-kserve-85568b7f4f-jhlxv\" (UID: \"e964dec8-c773-44cd-88f7-2d82b12a10bf\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv"
Apr 16 18:46:05.693123 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:05.693095 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e964dec8-c773-44cd-88f7-2d82b12a10bf-dshm\") pod \"stop-feature-test-kserve-85568b7f4f-jhlxv\" (UID: \"e964dec8-c773-44cd-88f7-2d82b12a10bf\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv"
Apr 16 18:46:05.693261 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:05.693239 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e964dec8-c773-44cd-88f7-2d82b12a10bf-tls-certs\") pod \"stop-feature-test-kserve-85568b7f4f-jhlxv\" (UID: \"e964dec8-c773-44cd-88f7-2d82b12a10bf\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv"
Apr 16 18:46:05.711164 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:05.711133 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume
\"kube-api-access-cv8hc\" (UniqueName: \"kubernetes.io/projected/e964dec8-c773-44cd-88f7-2d82b12a10bf-kube-api-access-cv8hc\") pod \"stop-feature-test-kserve-85568b7f4f-jhlxv\" (UID: \"e964dec8-c773-44cd-88f7-2d82b12a10bf\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv"
Apr 16 18:46:05.810801 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:05.810686 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv"
Apr 16 18:46:05.957542 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:05.957505 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv"]
Apr 16 18:46:05.958570 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:46:05.958544 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode964dec8_c773_44cd_88f7_2d82b12a10bf.slice/crio-c601509dea98cfb8a860e0f47f072d8306a9c666d787aa9610d5d08a4d8fab38 WatchSource:0}: Error finding container c601509dea98cfb8a860e0f47f072d8306a9c666d787aa9610d5d08a4d8fab38: Status 404 returned error can't find the container with id c601509dea98cfb8a860e0f47f072d8306a9c666d787aa9610d5d08a4d8fab38
Apr 16 18:46:06.052078 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:06.052036 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv" event={"ID":"e964dec8-c773-44cd-88f7-2d82b12a10bf","Type":"ContainerStarted","Data":"66f50e9fb54f6ead8fea1352e9b3a8f31b7b5285555cd45c8f1f8ffba277f58b"}
Apr 16 18:46:06.052078 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:06.052079 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv" event={"ID":"e964dec8-c773-44cd-88f7-2d82b12a10bf","Type":"ContainerStarted","Data":"c601509dea98cfb8a860e0f47f072d8306a9c666d787aa9610d5d08a4d8fab38"}
Apr
16 18:46:09.487696 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:09.487641 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" podUID="26ce4724-d491-43c5-8c98-df0fd25c7359" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused"
Apr 16 18:46:11.072946 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:11.072912 2576 generic.go:358] "Generic (PLEG): container finished" podID="e964dec8-c773-44cd-88f7-2d82b12a10bf" containerID="66f50e9fb54f6ead8fea1352e9b3a8f31b7b5285555cd45c8f1f8ffba277f58b" exitCode=0
Apr 16 18:46:11.073303 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:11.072998 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv" event={"ID":"e964dec8-c773-44cd-88f7-2d82b12a10bf","Type":"ContainerDied","Data":"66f50e9fb54f6ead8fea1352e9b3a8f31b7b5285555cd45c8f1f8ffba277f58b"}
Apr 16 18:46:12.079593 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:12.079556 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv" event={"ID":"e964dec8-c773-44cd-88f7-2d82b12a10bf","Type":"ContainerStarted","Data":"33ff1e9818803867610e23e97c1d29a25b5de33b8de209927839330c0161fa20"}
Apr 16 18:46:12.103988 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:12.103938 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv" podStartSLOduration=7.103917122 podStartE2EDuration="7.103917122s" podCreationTimestamp="2026-04-16 18:46:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:46:12.1016013 +0000 UTC m=+898.226128964" watchObservedRunningTime="2026-04-16 18:46:12.103917122 +0000 UTC m=+898.228444821"
Apr 16
18:46:14.477954 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:14.477916 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal_28c07a342a30c0e354482d7284dcbb2c/kube-rbac-proxy-crio/2.log"
Apr 16 18:46:14.480847 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:14.480823 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal_28c07a342a30c0e354482d7284dcbb2c/kube-rbac-proxy-crio/2.log"
Apr 16 18:46:15.811046 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:15.810991 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv"
Apr 16 18:46:15.811046 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:15.811046 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv"
Apr 16 18:46:15.812890 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:15.812854 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv" podUID="e964dec8-c773-44cd-88f7-2d82b12a10bf" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8000/health\": dial tcp 10.134.0.49:8000: connect: connection refused"
Apr 16 18:46:19.488041 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:19.487985 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" podUID="26ce4724-d491-43c5-8c98-df0fd25c7359" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused"
Apr 16 18:46:25.811981 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:25.811923 2576 prober.go:120] "Probe failed" probeType="Startup"
pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv" podUID="e964dec8-c773-44cd-88f7-2d82b12a10bf" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8000/health\": dial tcp 10.134.0.49:8000: connect: connection refused"
Apr 16 18:46:27.295352 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:27.295325 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-85568b7f4f-fbdfs_1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f/main/0.log"
Apr 16 18:46:27.295859 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:27.295757 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs"
Apr 16 18:46:27.403006 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:27.402964 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-model-cache\") pod \"1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f\" (UID: \"1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f\") "
Apr 16 18:46:27.403177 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:27.403032 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-dshm\") pod \"1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f\" (UID: \"1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f\") "
Apr 16 18:46:27.403177 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:27.403064 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-kserve-provision-location\") pod \"1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f\" (UID: \"1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f\") "
Apr 16 18:46:27.403177 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:27.403127 2576 reconciler_common.go:162]
"operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-home\") pod \"1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f\" (UID: \"1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f\") "
Apr 16 18:46:27.403347 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:27.403188 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2l6r\" (UniqueName: \"kubernetes.io/projected/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-kube-api-access-m2l6r\") pod \"1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f\" (UID: \"1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f\") "
Apr 16 18:46:27.403347 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:27.403303 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-model-cache" (OuterVolumeSpecName: "model-cache") pod "1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f" (UID: "1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:46:27.403521 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:27.403488 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-home" (OuterVolumeSpecName: "home") pod "1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f" (UID: "1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f"). InnerVolumeSpecName "home".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:46:27.403686 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:27.403666 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-tls-certs\") pod \"1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f\" (UID: \"1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f\") "
Apr 16 18:46:27.404043 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:27.404019 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-home\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:46:27.404153 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:27.404049 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-model-cache\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:46:27.405728 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:27.405702 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-dshm" (OuterVolumeSpecName: "dshm") pod "1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f" (UID: "1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:46:27.405969 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:27.405945 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-kube-api-access-m2l6r" (OuterVolumeSpecName: "kube-api-access-m2l6r") pod "1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f" (UID: "1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f"). InnerVolumeSpecName "kube-api-access-m2l6r".
PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:46:27.406051 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:27.405973 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f" (UID: "1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:46:27.463172 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:27.463127 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f" (UID: "1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:46:27.504810 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:27.504771 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-tls-certs\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:46:27.504810 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:27.504804 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-dshm\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:46:27.504810 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:27.504814 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-kserve-provision-location\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:46:27.505033 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:27.504823
2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m2l6r\" (UniqueName: \"kubernetes.io/projected/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f-kube-api-access-m2l6r\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:46:28.152501 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:28.152474 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-85568b7f4f-fbdfs_1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f/main/0.log"
Apr 16 18:46:28.152894 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:28.152865 2576 generic.go:358] "Generic (PLEG): container finished" podID="1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f" containerID="af02c6daa4c715ee59410fe05c023d1ad8bb4942e541231bf3068d6797eed757" exitCode=137
Apr 16 18:46:28.153012 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:28.152996 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs"
Apr 16 18:46:28.153095 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:28.152997 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs" event={"ID":"1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f","Type":"ContainerDied","Data":"af02c6daa4c715ee59410fe05c023d1ad8bb4942e541231bf3068d6797eed757"}
Apr 16 18:46:28.153146 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:28.153117 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs" event={"ID":"1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f","Type":"ContainerDied","Data":"be1dc19d3f27df60fae2d995114421cd36c5ea3301db8d7cade6fdc59bf46a8c"}
Apr 16 18:46:28.153146 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:28.153140 2576 scope.go:117] "RemoveContainer" containerID="af02c6daa4c715ee59410fe05c023d1ad8bb4942e541231bf3068d6797eed757"
Apr 16 18:46:28.176289 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:28.176252 2576
scope.go:117] "RemoveContainer" containerID="fea347aad9629974d7918f06a6d6c39510382f806552804a4b1187dcec6363e4"
Apr 16 18:46:28.179650 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:28.179620 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs"]
Apr 16 18:46:28.184594 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:28.184563 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-fbdfs"]
Apr 16 18:46:28.252188 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:28.252163 2576 scope.go:117] "RemoveContainer" containerID="af02c6daa4c715ee59410fe05c023d1ad8bb4942e541231bf3068d6797eed757"
Apr 16 18:46:28.252520 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:46:28.252500 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af02c6daa4c715ee59410fe05c023d1ad8bb4942e541231bf3068d6797eed757\": container with ID starting with af02c6daa4c715ee59410fe05c023d1ad8bb4942e541231bf3068d6797eed757 not found: ID does not exist" containerID="af02c6daa4c715ee59410fe05c023d1ad8bb4942e541231bf3068d6797eed757"
Apr 16 18:46:28.252616 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:28.252529 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af02c6daa4c715ee59410fe05c023d1ad8bb4942e541231bf3068d6797eed757"} err="failed to get container status \"af02c6daa4c715ee59410fe05c023d1ad8bb4942e541231bf3068d6797eed757\": rpc error: code = NotFound desc = could not find container \"af02c6daa4c715ee59410fe05c023d1ad8bb4942e541231bf3068d6797eed757\": container with ID starting with af02c6daa4c715ee59410fe05c023d1ad8bb4942e541231bf3068d6797eed757 not found: ID does not exist"
Apr 16 18:46:28.252616 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:28.252557 2576 scope.go:117] "RemoveContainer"
containerID="fea347aad9629974d7918f06a6d6c39510382f806552804a4b1187dcec6363e4"
Apr 16 18:46:28.252923 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:46:28.252900 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fea347aad9629974d7918f06a6d6c39510382f806552804a4b1187dcec6363e4\": container with ID starting with fea347aad9629974d7918f06a6d6c39510382f806552804a4b1187dcec6363e4 not found: ID does not exist" containerID="fea347aad9629974d7918f06a6d6c39510382f806552804a4b1187dcec6363e4"
Apr 16 18:46:28.253027 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:28.252930 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fea347aad9629974d7918f06a6d6c39510382f806552804a4b1187dcec6363e4"} err="failed to get container status \"fea347aad9629974d7918f06a6d6c39510382f806552804a4b1187dcec6363e4\": rpc error: code = NotFound desc = could not find container \"fea347aad9629974d7918f06a6d6c39510382f806552804a4b1187dcec6363e4\": container with ID starting with fea347aad9629974d7918f06a6d6c39510382f806552804a4b1187dcec6363e4 not found: ID does not exist"
Apr 16 18:46:28.509234 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:28.509196 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f" path="/var/lib/kubelet/pods/1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f/volumes"
Apr 16 18:46:29.487476 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:29.487428 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" podUID="26ce4724-d491-43c5-8c98-df0fd25c7359" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused"
Apr 16 18:46:35.811441 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:35.811391 2576 prober.go:120] "Probe failed" probeType="Startup"
pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv" podUID="e964dec8-c773-44cd-88f7-2d82b12a10bf" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8000/health\": dial tcp 10.134.0.49:8000: connect: connection refused" Apr 16 18:46:39.487376 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:39.487322 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" podUID="26ce4724-d491-43c5-8c98-df0fd25c7359" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused" Apr 16 18:46:45.812156 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:45.812104 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv" podUID="e964dec8-c773-44cd-88f7-2d82b12a10bf" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8000/health\": dial tcp 10.134.0.49:8000: connect: connection refused" Apr 16 18:46:49.487890 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:49.487821 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" podUID="26ce4724-d491-43c5-8c98-df0fd25c7359" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused" Apr 16 18:46:55.811801 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:55.811726 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv" podUID="e964dec8-c773-44cd-88f7-2d82b12a10bf" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8000/health\": dial tcp 10.134.0.49:8000: connect: connection refused" Apr 16 18:46:59.501775 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:59.501723 2576 kubelet.go:2658] "SyncLoop (probe)" 
probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" Apr 16 18:46:59.512035 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:46:59.511998 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" Apr 16 18:47:05.811927 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:05.811872 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv" podUID="e964dec8-c773-44cd-88f7-2d82b12a10bf" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8000/health\": dial tcp 10.134.0.49:8000: connect: connection refused" Apr 16 18:47:06.026321 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:06.026278 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm"] Apr 16 18:47:06.026695 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:06.026644 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" podUID="26ce4724-d491-43c5-8c98-df0fd25c7359" containerName="main" containerID="cri-o://802668de23ee0a8564f4e6e5fcd9a5c62d4becc4ea630487addafeef72bf5668" gracePeriod=30 Apr 16 18:47:15.812301 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:15.812242 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv" podUID="e964dec8-c773-44cd-88f7-2d82b12a10bf" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8000/health\": dial tcp 10.134.0.49:8000: connect: connection refused" Apr 16 18:47:22.782426 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:22.782386 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49"] Apr 16 18:47:22.782970 ip-10-0-140-154 
kubenswrapper[2576]: I0416 18:47:22.782948 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f" containerName="storage-initializer" Apr 16 18:47:22.783024 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:22.782975 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f" containerName="storage-initializer" Apr 16 18:47:22.783024 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:22.783012 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f" containerName="main" Apr 16 18:47:22.783024 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:22.783021 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f" containerName="main" Apr 16 18:47:22.783145 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:22.783131 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="1407a00a-ee0d-4dbd-a4ef-266fa7c7fc8f" containerName="main" Apr 16 18:47:22.788187 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:22.788163 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49" Apr 16 18:47:22.791098 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:22.791075 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\"" Apr 16 18:47:22.794889 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:22.794864 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49"] Apr 16 18:47:22.924722 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:22.924675 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6-dshm\") pod \"router-with-refs-test-kserve-9bbdcd449-9kx49\" (UID: \"aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49" Apr 16 18:47:22.924722 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:22.924727 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8qwr\" (UniqueName: \"kubernetes.io/projected/aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6-kube-api-access-h8qwr\") pod \"router-with-refs-test-kserve-9bbdcd449-9kx49\" (UID: \"aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49" Apr 16 18:47:22.925018 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:22.924804 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6-home\") pod \"router-with-refs-test-kserve-9bbdcd449-9kx49\" (UID: \"aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49" Apr 16 18:47:22.925018 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:22.924829 
2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6-model-cache\") pod \"router-with-refs-test-kserve-9bbdcd449-9kx49\" (UID: \"aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49" Apr 16 18:47:22.925018 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:22.924894 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6-tls-certs\") pod \"router-with-refs-test-kserve-9bbdcd449-9kx49\" (UID: \"aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49" Apr 16 18:47:22.925018 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:22.924940 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6-kserve-provision-location\") pod \"router-with-refs-test-kserve-9bbdcd449-9kx49\" (UID: \"aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49" Apr 16 18:47:23.025436 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:23.025397 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6-tls-certs\") pod \"router-with-refs-test-kserve-9bbdcd449-9kx49\" (UID: \"aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49" Apr 16 18:47:23.025436 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:23.025440 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6-kserve-provision-location\") pod \"router-with-refs-test-kserve-9bbdcd449-9kx49\" (UID: \"aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49" Apr 16 18:47:23.025698 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:23.025527 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6-dshm\") pod \"router-with-refs-test-kserve-9bbdcd449-9kx49\" (UID: \"aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49" Apr 16 18:47:23.025698 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:23.025556 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h8qwr\" (UniqueName: \"kubernetes.io/projected/aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6-kube-api-access-h8qwr\") pod \"router-with-refs-test-kserve-9bbdcd449-9kx49\" (UID: \"aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49" Apr 16 18:47:23.025698 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:23.025594 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6-home\") pod \"router-with-refs-test-kserve-9bbdcd449-9kx49\" (UID: \"aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49" Apr 16 18:47:23.025882 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:23.025791 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6-model-cache\") pod \"router-with-refs-test-kserve-9bbdcd449-9kx49\" (UID: \"aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49" Apr 16 18:47:23.025969 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:23.025939 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6-kserve-provision-location\") pod \"router-with-refs-test-kserve-9bbdcd449-9kx49\" (UID: \"aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49" Apr 16 18:47:23.026104 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:23.025973 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6-home\") pod \"router-with-refs-test-kserve-9bbdcd449-9kx49\" (UID: \"aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49" Apr 16 18:47:23.026180 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:23.026158 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6-model-cache\") pod \"router-with-refs-test-kserve-9bbdcd449-9kx49\" (UID: \"aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49" Apr 16 18:47:23.027945 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:23.027922 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6-dshm\") pod \"router-with-refs-test-kserve-9bbdcd449-9kx49\" (UID: \"aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49" Apr 16 18:47:23.028055 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:23.028018 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6-tls-certs\") pod \"router-with-refs-test-kserve-9bbdcd449-9kx49\" (UID: \"aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49" Apr 16 18:47:23.035990 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:23.035930 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8qwr\" (UniqueName: \"kubernetes.io/projected/aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6-kube-api-access-h8qwr\") pod \"router-with-refs-test-kserve-9bbdcd449-9kx49\" (UID: \"aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49" Apr 16 18:47:23.099963 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:23.099911 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49" Apr 16 18:47:23.279020 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:23.278991 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49"] Apr 16 18:47:23.281610 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:47:23.281580 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa5ee4f1_adeb_4a17_8a24_2f6125ff44a6.slice/crio-145e2128dcb748ade3ee1110a35ef38f39a1e72d8c0fe72b91510127bb858a85 WatchSource:0}: Error finding container 145e2128dcb748ade3ee1110a35ef38f39a1e72d8c0fe72b91510127bb858a85: Status 404 returned error can't find the container with id 145e2128dcb748ade3ee1110a35ef38f39a1e72d8c0fe72b91510127bb858a85 Apr 16 18:47:23.283416 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:23.283396 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:47:23.398345 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:23.398297 2576 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49" event={"ID":"aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6","Type":"ContainerStarted","Data":"6dead1401aeda2a563153917d0b7ab9800c6f30504f1c43df0887ab6087336d6"} Apr 16 18:47:23.398345 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:23.398349 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49" event={"ID":"aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6","Type":"ContainerStarted","Data":"145e2128dcb748ade3ee1110a35ef38f39a1e72d8c0fe72b91510127bb858a85"} Apr 16 18:47:25.811985 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:25.811942 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv" podUID="e964dec8-c773-44cd-88f7-2d82b12a10bf" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8000/health\": dial tcp 10.134.0.49:8000: connect: connection refused" Apr 16 18:47:28.425504 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:28.425464 2576 generic.go:358] "Generic (PLEG): container finished" podID="aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6" containerID="6dead1401aeda2a563153917d0b7ab9800c6f30504f1c43df0887ab6087336d6" exitCode=0 Apr 16 18:47:28.426017 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:28.425516 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49" event={"ID":"aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6","Type":"ContainerDied","Data":"6dead1401aeda2a563153917d0b7ab9800c6f30504f1c43df0887ab6087336d6"} Apr 16 18:47:29.432334 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:29.432298 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49" event={"ID":"aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6","Type":"ContainerStarted","Data":"5147a3a992ef5111f86a4c78e6a27066dd56fa36298b31c97430aae2e03a36bc"} Apr 16 
18:47:29.459372 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:29.459308 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49" podStartSLOduration=7.459284072 podStartE2EDuration="7.459284072s" podCreationTimestamp="2026-04-16 18:47:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:47:29.455582898 +0000 UTC m=+975.580110568" watchObservedRunningTime="2026-04-16 18:47:29.459284072 +0000 UTC m=+975.583811737" Apr 16 18:47:33.100853 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:33.100759 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49" Apr 16 18:47:33.100853 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:33.100822 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49" Apr 16 18:47:33.102592 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:33.102563 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49" podUID="aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 16 18:47:35.811772 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:35.811704 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv" podUID="e964dec8-c773-44cd-88f7-2d82b12a10bf" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8000/health\": dial tcp 10.134.0.49:8000: connect: connection refused" Apr 16 18:47:36.323309 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:36.323277 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-55c77bfb77-t4ltm_26ce4724-d491-43c5-8c98-df0fd25c7359/main/0.log" Apr 16 18:47:36.323715 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:36.323698 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" Apr 16 18:47:36.459301 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:36.459227 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/26ce4724-d491-43c5-8c98-df0fd25c7359-home\") pod \"26ce4724-d491-43c5-8c98-df0fd25c7359\" (UID: \"26ce4724-d491-43c5-8c98-df0fd25c7359\") " Apr 16 18:47:36.459301 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:36.459276 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txkxv\" (UniqueName: \"kubernetes.io/projected/26ce4724-d491-43c5-8c98-df0fd25c7359-kube-api-access-txkxv\") pod \"26ce4724-d491-43c5-8c98-df0fd25c7359\" (UID: \"26ce4724-d491-43c5-8c98-df0fd25c7359\") " Apr 16 18:47:36.459582 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:36.459340 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/26ce4724-d491-43c5-8c98-df0fd25c7359-model-cache\") pod \"26ce4724-d491-43c5-8c98-df0fd25c7359\" (UID: \"26ce4724-d491-43c5-8c98-df0fd25c7359\") " Apr 16 18:47:36.459582 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:36.459404 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/26ce4724-d491-43c5-8c98-df0fd25c7359-dshm\") pod \"26ce4724-d491-43c5-8c98-df0fd25c7359\" (UID: \"26ce4724-d491-43c5-8c98-df0fd25c7359\") " Apr 16 18:47:36.459582 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:36.459453 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for 
volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/26ce4724-d491-43c5-8c98-df0fd25c7359-tls-certs\") pod \"26ce4724-d491-43c5-8c98-df0fd25c7359\" (UID: \"26ce4724-d491-43c5-8c98-df0fd25c7359\") " Apr 16 18:47:36.459582 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:36.459477 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/26ce4724-d491-43c5-8c98-df0fd25c7359-kserve-provision-location\") pod \"26ce4724-d491-43c5-8c98-df0fd25c7359\" (UID: \"26ce4724-d491-43c5-8c98-df0fd25c7359\") " Apr 16 18:47:36.459828 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:36.459619 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26ce4724-d491-43c5-8c98-df0fd25c7359-home" (OuterVolumeSpecName: "home") pod "26ce4724-d491-43c5-8c98-df0fd25c7359" (UID: "26ce4724-d491-43c5-8c98-df0fd25c7359"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:47:36.459828 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:36.459798 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/26ce4724-d491-43c5-8c98-df0fd25c7359-home\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:47:36.460622 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:36.460492 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26ce4724-d491-43c5-8c98-df0fd25c7359-model-cache" (OuterVolumeSpecName: "model-cache") pod "26ce4724-d491-43c5-8c98-df0fd25c7359" (UID: "26ce4724-d491-43c5-8c98-df0fd25c7359"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:47:36.462393 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:36.462237 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26ce4724-d491-43c5-8c98-df0fd25c7359-kube-api-access-txkxv" (OuterVolumeSpecName: "kube-api-access-txkxv") pod "26ce4724-d491-43c5-8c98-df0fd25c7359" (UID: "26ce4724-d491-43c5-8c98-df0fd25c7359"). InnerVolumeSpecName "kube-api-access-txkxv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:47:36.462593 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:36.462552 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26ce4724-d491-43c5-8c98-df0fd25c7359-dshm" (OuterVolumeSpecName: "dshm") pod "26ce4724-d491-43c5-8c98-df0fd25c7359" (UID: "26ce4724-d491-43c5-8c98-df0fd25c7359"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:47:36.462593 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:36.462557 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26ce4724-d491-43c5-8c98-df0fd25c7359-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "26ce4724-d491-43c5-8c98-df0fd25c7359" (UID: "26ce4724-d491-43c5-8c98-df0fd25c7359"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:47:36.465677 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:36.465649 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-55c77bfb77-t4ltm_26ce4724-d491-43c5-8c98-df0fd25c7359/main/0.log" Apr 16 18:47:36.466105 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:36.466076 2576 generic.go:358] "Generic (PLEG): container finished" podID="26ce4724-d491-43c5-8c98-df0fd25c7359" containerID="802668de23ee0a8564f4e6e5fcd9a5c62d4becc4ea630487addafeef72bf5668" exitCode=137 Apr 16 18:47:36.466211 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:36.466173 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" Apr 16 18:47:36.466211 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:36.466171 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" event={"ID":"26ce4724-d491-43c5-8c98-df0fd25c7359","Type":"ContainerDied","Data":"802668de23ee0a8564f4e6e5fcd9a5c62d4becc4ea630487addafeef72bf5668"} Apr 16 18:47:36.466329 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:36.466219 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm" event={"ID":"26ce4724-d491-43c5-8c98-df0fd25c7359","Type":"ContainerDied","Data":"d26bc1341c43c6fc49c3fcdf6cafdce9843a007cd0d2be60011c01ed59cf962f"} Apr 16 18:47:36.466329 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:36.466241 2576 scope.go:117] "RemoveContainer" containerID="802668de23ee0a8564f4e6e5fcd9a5c62d4becc4ea630487addafeef72bf5668" Apr 16 18:47:36.507600 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:36.507475 2576 scope.go:117] "RemoveContainer" containerID="c62c8b9351df5bcf549f77031abfcc843ce60eff6a71c1d00c9a8b086547dc45" Apr 16 18:47:36.527425 ip-10-0-140-154 
kubenswrapper[2576]: I0416 18:47:36.527381 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26ce4724-d491-43c5-8c98-df0fd25c7359-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "26ce4724-d491-43c5-8c98-df0fd25c7359" (UID: "26ce4724-d491-43c5-8c98-df0fd25c7359"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:47:36.561266 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:36.561224 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/26ce4724-d491-43c5-8c98-df0fd25c7359-dshm\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:47:36.561266 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:36.561253 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/26ce4724-d491-43c5-8c98-df0fd25c7359-tls-certs\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:47:36.561266 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:36.561265 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/26ce4724-d491-43c5-8c98-df0fd25c7359-kserve-provision-location\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:47:36.561266 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:36.561279 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-txkxv\" (UniqueName: \"kubernetes.io/projected/26ce4724-d491-43c5-8c98-df0fd25c7359-kube-api-access-txkxv\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:47:36.561632 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:36.561292 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/26ce4724-d491-43c5-8c98-df0fd25c7359-model-cache\") on node \"ip-10-0-140-154.ec2.internal\" 
DevicePath \"\"" Apr 16 18:47:36.573148 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:36.573123 2576 scope.go:117] "RemoveContainer" containerID="802668de23ee0a8564f4e6e5fcd9a5c62d4becc4ea630487addafeef72bf5668" Apr 16 18:47:36.573516 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:47:36.573481 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"802668de23ee0a8564f4e6e5fcd9a5c62d4becc4ea630487addafeef72bf5668\": container with ID starting with 802668de23ee0a8564f4e6e5fcd9a5c62d4becc4ea630487addafeef72bf5668 not found: ID does not exist" containerID="802668de23ee0a8564f4e6e5fcd9a5c62d4becc4ea630487addafeef72bf5668" Apr 16 18:47:36.573639 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:36.573523 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"802668de23ee0a8564f4e6e5fcd9a5c62d4becc4ea630487addafeef72bf5668"} err="failed to get container status \"802668de23ee0a8564f4e6e5fcd9a5c62d4becc4ea630487addafeef72bf5668\": rpc error: code = NotFound desc = could not find container \"802668de23ee0a8564f4e6e5fcd9a5c62d4becc4ea630487addafeef72bf5668\": container with ID starting with 802668de23ee0a8564f4e6e5fcd9a5c62d4becc4ea630487addafeef72bf5668 not found: ID does not exist" Apr 16 18:47:36.573639 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:36.573543 2576 scope.go:117] "RemoveContainer" containerID="c62c8b9351df5bcf549f77031abfcc843ce60eff6a71c1d00c9a8b086547dc45" Apr 16 18:47:36.573936 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:47:36.573809 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c62c8b9351df5bcf549f77031abfcc843ce60eff6a71c1d00c9a8b086547dc45\": container with ID starting with c62c8b9351df5bcf549f77031abfcc843ce60eff6a71c1d00c9a8b086547dc45 not found: ID does not exist" containerID="c62c8b9351df5bcf549f77031abfcc843ce60eff6a71c1d00c9a8b086547dc45" Apr 16 
18:47:36.573936 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:36.573829 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c62c8b9351df5bcf549f77031abfcc843ce60eff6a71c1d00c9a8b086547dc45"} err="failed to get container status \"c62c8b9351df5bcf549f77031abfcc843ce60eff6a71c1d00c9a8b086547dc45\": rpc error: code = NotFound desc = could not find container \"c62c8b9351df5bcf549f77031abfcc843ce60eff6a71c1d00c9a8b086547dc45\": container with ID starting with c62c8b9351df5bcf549f77031abfcc843ce60eff6a71c1d00c9a8b086547dc45 not found: ID does not exist" Apr 16 18:47:36.792659 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:36.792612 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm"] Apr 16 18:47:36.796477 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:36.796444 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-55c77bfb77-t4ltm"] Apr 16 18:47:38.506260 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:38.506226 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26ce4724-d491-43c5-8c98-df0fd25c7359" path="/var/lib/kubelet/pods/26ce4724-d491-43c5-8c98-df0fd25c7359/volumes" Apr 16 18:47:43.101414 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:43.101367 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49" podUID="aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 16 18:47:45.811525 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:45.811463 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv" podUID="e964dec8-c773-44cd-88f7-2d82b12a10bf" containerName="main" probeResult="failure" 
output="Get \"https://10.134.0.49:8000/health\": dial tcp 10.134.0.49:8000: connect: connection refused" Apr 16 18:47:53.100953 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:53.100910 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49" podUID="aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 16 18:47:55.821210 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:55.821174 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv" Apr 16 18:47:55.829888 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:55.829862 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv" Apr 16 18:47:56.970204 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:56.970172 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv"] Apr 16 18:47:57.563669 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:47:57.563627 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv" podUID="e964dec8-c773-44cd-88f7-2d82b12a10bf" containerName="main" containerID="cri-o://33ff1e9818803867610e23e97c1d29a25b5de33b8de209927839330c0161fa20" gracePeriod=30 Apr 16 18:48:03.101400 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:03.101356 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49" podUID="aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 16 18:48:13.101111 ip-10-0-140-154 
kubenswrapper[2576]: I0416 18:48:13.101062 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49" podUID="aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 16 18:48:23.100838 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:23.100793 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49" podUID="aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 16 18:48:27.863690 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:27.863663 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-85568b7f4f-jhlxv_e964dec8-c773-44cd-88f7-2d82b12a10bf/main/0.log" Apr 16 18:48:27.864115 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:27.864099 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv" Apr 16 18:48:27.947720 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:27.947676 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e964dec8-c773-44cd-88f7-2d82b12a10bf-model-cache\") pod \"e964dec8-c773-44cd-88f7-2d82b12a10bf\" (UID: \"e964dec8-c773-44cd-88f7-2d82b12a10bf\") " Apr 16 18:48:27.947945 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:27.947803 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cv8hc\" (UniqueName: \"kubernetes.io/projected/e964dec8-c773-44cd-88f7-2d82b12a10bf-kube-api-access-cv8hc\") pod \"e964dec8-c773-44cd-88f7-2d82b12a10bf\" (UID: \"e964dec8-c773-44cd-88f7-2d82b12a10bf\") " Apr 16 18:48:27.947945 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:27.947836 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e964dec8-c773-44cd-88f7-2d82b12a10bf-kserve-provision-location\") pod \"e964dec8-c773-44cd-88f7-2d82b12a10bf\" (UID: \"e964dec8-c773-44cd-88f7-2d82b12a10bf\") " Apr 16 18:48:27.947945 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:27.947865 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e964dec8-c773-44cd-88f7-2d82b12a10bf-home\") pod \"e964dec8-c773-44cd-88f7-2d82b12a10bf\" (UID: \"e964dec8-c773-44cd-88f7-2d82b12a10bf\") " Apr 16 18:48:27.947945 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:27.947917 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e964dec8-c773-44cd-88f7-2d82b12a10bf-dshm\") pod \"e964dec8-c773-44cd-88f7-2d82b12a10bf\" (UID: \"e964dec8-c773-44cd-88f7-2d82b12a10bf\") " Apr 16 18:48:27.948165 ip-10-0-140-154 
kubenswrapper[2576]: I0416 18:48:27.947963 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e964dec8-c773-44cd-88f7-2d82b12a10bf-tls-certs\") pod \"e964dec8-c773-44cd-88f7-2d82b12a10bf\" (UID: \"e964dec8-c773-44cd-88f7-2d82b12a10bf\") " Apr 16 18:48:27.948165 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:27.947969 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e964dec8-c773-44cd-88f7-2d82b12a10bf-model-cache" (OuterVolumeSpecName: "model-cache") pod "e964dec8-c773-44cd-88f7-2d82b12a10bf" (UID: "e964dec8-c773-44cd-88f7-2d82b12a10bf"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:48:27.948276 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:27.948232 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e964dec8-c773-44cd-88f7-2d82b12a10bf-model-cache\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:48:27.948632 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:27.948580 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e964dec8-c773-44cd-88f7-2d82b12a10bf-home" (OuterVolumeSpecName: "home") pod "e964dec8-c773-44cd-88f7-2d82b12a10bf" (UID: "e964dec8-c773-44cd-88f7-2d82b12a10bf"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:48:27.950665 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:27.950633 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e964dec8-c773-44cd-88f7-2d82b12a10bf-kube-api-access-cv8hc" (OuterVolumeSpecName: "kube-api-access-cv8hc") pod "e964dec8-c773-44cd-88f7-2d82b12a10bf" (UID: "e964dec8-c773-44cd-88f7-2d82b12a10bf"). InnerVolumeSpecName "kube-api-access-cv8hc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:48:27.951265 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:27.951230 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e964dec8-c773-44cd-88f7-2d82b12a10bf-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e964dec8-c773-44cd-88f7-2d82b12a10bf" (UID: "e964dec8-c773-44cd-88f7-2d82b12a10bf"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:48:27.951780 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:27.951731 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e964dec8-c773-44cd-88f7-2d82b12a10bf-dshm" (OuterVolumeSpecName: "dshm") pod "e964dec8-c773-44cd-88f7-2d82b12a10bf" (UID: "e964dec8-c773-44cd-88f7-2d82b12a10bf"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:48:28.014550 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:28.014508 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e964dec8-c773-44cd-88f7-2d82b12a10bf-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e964dec8-c773-44cd-88f7-2d82b12a10bf" (UID: "e964dec8-c773-44cd-88f7-2d82b12a10bf"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:48:28.048880 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:28.048850 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cv8hc\" (UniqueName: \"kubernetes.io/projected/e964dec8-c773-44cd-88f7-2d82b12a10bf-kube-api-access-cv8hc\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:48:28.048880 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:28.048879 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e964dec8-c773-44cd-88f7-2d82b12a10bf-kserve-provision-location\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:48:28.049034 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:28.048889 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e964dec8-c773-44cd-88f7-2d82b12a10bf-home\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:48:28.049034 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:28.048900 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e964dec8-c773-44cd-88f7-2d82b12a10bf-dshm\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:48:28.049034 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:28.048908 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e964dec8-c773-44cd-88f7-2d82b12a10bf-tls-certs\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:48:28.691820 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:28.691797 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-85568b7f4f-jhlxv_e964dec8-c773-44cd-88f7-2d82b12a10bf/main/0.log" Apr 16 18:48:28.692193 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:28.692169 2576 generic.go:358] "Generic (PLEG): container finished" 
podID="e964dec8-c773-44cd-88f7-2d82b12a10bf" containerID="33ff1e9818803867610e23e97c1d29a25b5de33b8de209927839330c0161fa20" exitCode=137 Apr 16 18:48:28.692280 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:28.692234 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv" event={"ID":"e964dec8-c773-44cd-88f7-2d82b12a10bf","Type":"ContainerDied","Data":"33ff1e9818803867610e23e97c1d29a25b5de33b8de209927839330c0161fa20"} Apr 16 18:48:28.692321 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:28.692272 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv" Apr 16 18:48:28.692321 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:28.692287 2576 scope.go:117] "RemoveContainer" containerID="33ff1e9818803867610e23e97c1d29a25b5de33b8de209927839330c0161fa20" Apr 16 18:48:28.692403 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:28.692276 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv" event={"ID":"e964dec8-c773-44cd-88f7-2d82b12a10bf","Type":"ContainerDied","Data":"c601509dea98cfb8a860e0f47f072d8306a9c666d787aa9610d5d08a4d8fab38"} Apr 16 18:48:28.713669 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:28.713640 2576 scope.go:117] "RemoveContainer" containerID="66f50e9fb54f6ead8fea1352e9b3a8f31b7b5285555cd45c8f1f8ffba277f58b" Apr 16 18:48:28.722320 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:28.722290 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv"] Apr 16 18:48:28.724614 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:28.724583 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-85568b7f4f-jhlxv"] Apr 16 18:48:28.782337 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:28.782311 2576 scope.go:117] "RemoveContainer" 
containerID="33ff1e9818803867610e23e97c1d29a25b5de33b8de209927839330c0161fa20" Apr 16 18:48:28.782674 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:48:28.782653 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33ff1e9818803867610e23e97c1d29a25b5de33b8de209927839330c0161fa20\": container with ID starting with 33ff1e9818803867610e23e97c1d29a25b5de33b8de209927839330c0161fa20 not found: ID does not exist" containerID="33ff1e9818803867610e23e97c1d29a25b5de33b8de209927839330c0161fa20" Apr 16 18:48:28.782773 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:28.782691 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33ff1e9818803867610e23e97c1d29a25b5de33b8de209927839330c0161fa20"} err="failed to get container status \"33ff1e9818803867610e23e97c1d29a25b5de33b8de209927839330c0161fa20\": rpc error: code = NotFound desc = could not find container \"33ff1e9818803867610e23e97c1d29a25b5de33b8de209927839330c0161fa20\": container with ID starting with 33ff1e9818803867610e23e97c1d29a25b5de33b8de209927839330c0161fa20 not found: ID does not exist" Apr 16 18:48:28.782773 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:28.782720 2576 scope.go:117] "RemoveContainer" containerID="66f50e9fb54f6ead8fea1352e9b3a8f31b7b5285555cd45c8f1f8ffba277f58b" Apr 16 18:48:28.783027 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:48:28.783003 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66f50e9fb54f6ead8fea1352e9b3a8f31b7b5285555cd45c8f1f8ffba277f58b\": container with ID starting with 66f50e9fb54f6ead8fea1352e9b3a8f31b7b5285555cd45c8f1f8ffba277f58b not found: ID does not exist" containerID="66f50e9fb54f6ead8fea1352e9b3a8f31b7b5285555cd45c8f1f8ffba277f58b" Apr 16 18:48:28.783077 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:28.783033 2576 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"66f50e9fb54f6ead8fea1352e9b3a8f31b7b5285555cd45c8f1f8ffba277f58b"} err="failed to get container status \"66f50e9fb54f6ead8fea1352e9b3a8f31b7b5285555cd45c8f1f8ffba277f58b\": rpc error: code = NotFound desc = could not find container \"66f50e9fb54f6ead8fea1352e9b3a8f31b7b5285555cd45c8f1f8ffba277f58b\": container with ID starting with 66f50e9fb54f6ead8fea1352e9b3a8f31b7b5285555cd45c8f1f8ffba277f58b not found: ID does not exist" Apr 16 18:48:30.507043 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:30.506998 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e964dec8-c773-44cd-88f7-2d82b12a10bf" path="/var/lib/kubelet/pods/e964dec8-c773-44cd-88f7-2d82b12a10bf/volumes" Apr 16 18:48:33.100798 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:33.100711 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49" podUID="aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 16 18:48:34.112752 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:34.112695 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-fc44f49f-xtln4"] Apr 16 18:48:34.113194 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:34.113064 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/llmisvc-controller-manager-fc44f49f-xtln4" podUID="4504c8c4-c68b-49e0-a8e0-67d7445674f9" containerName="manager" containerID="cri-o://bb464e48ad053c98756474189b99f3f835a517d79b0fdfd02f7d30ffb79e3bf9" gracePeriod=30 Apr 16 18:48:34.250596 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:34.250549 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve/llmisvc-controller-manager-fc44f49f-xtln4" podUID="4504c8c4-c68b-49e0-a8e0-67d7445674f9" containerName="manager" 
probeResult="failure" output="Get \"http://10.134.0.35:8081/readyz\": dial tcp 10.134.0.35:8081: connect: connection refused" Apr 16 18:48:34.363276 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:34.363194 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-fc44f49f-xtln4" Apr 16 18:48:34.512885 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:34.512854 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4504c8c4-c68b-49e0-a8e0-67d7445674f9-cert\") pod \"4504c8c4-c68b-49e0-a8e0-67d7445674f9\" (UID: \"4504c8c4-c68b-49e0-a8e0-67d7445674f9\") " Apr 16 18:48:34.513064 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:34.512966 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtpz2\" (UniqueName: \"kubernetes.io/projected/4504c8c4-c68b-49e0-a8e0-67d7445674f9-kube-api-access-gtpz2\") pod \"4504c8c4-c68b-49e0-a8e0-67d7445674f9\" (UID: \"4504c8c4-c68b-49e0-a8e0-67d7445674f9\") " Apr 16 18:48:34.515159 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:34.515118 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4504c8c4-c68b-49e0-a8e0-67d7445674f9-cert" (OuterVolumeSpecName: "cert") pod "4504c8c4-c68b-49e0-a8e0-67d7445674f9" (UID: "4504c8c4-c68b-49e0-a8e0-67d7445674f9"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:48:34.515159 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:34.515142 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4504c8c4-c68b-49e0-a8e0-67d7445674f9-kube-api-access-gtpz2" (OuterVolumeSpecName: "kube-api-access-gtpz2") pod "4504c8c4-c68b-49e0-a8e0-67d7445674f9" (UID: "4504c8c4-c68b-49e0-a8e0-67d7445674f9"). InnerVolumeSpecName "kube-api-access-gtpz2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:48:34.614608 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:34.614509 2576 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4504c8c4-c68b-49e0-a8e0-67d7445674f9-cert\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:48:34.614608 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:34.614545 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gtpz2\" (UniqueName: \"kubernetes.io/projected/4504c8c4-c68b-49e0-a8e0-67d7445674f9-kube-api-access-gtpz2\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:48:34.719065 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:34.719027 2576 generic.go:358] "Generic (PLEG): container finished" podID="4504c8c4-c68b-49e0-a8e0-67d7445674f9" containerID="bb464e48ad053c98756474189b99f3f835a517d79b0fdfd02f7d30ffb79e3bf9" exitCode=0 Apr 16 18:48:34.719249 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:34.719080 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-fc44f49f-xtln4" event={"ID":"4504c8c4-c68b-49e0-a8e0-67d7445674f9","Type":"ContainerDied","Data":"bb464e48ad053c98756474189b99f3f835a517d79b0fdfd02f7d30ffb79e3bf9"} Apr 16 18:48:34.719249 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:34.719086 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-fc44f49f-xtln4" Apr 16 18:48:34.719249 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:34.719114 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-fc44f49f-xtln4" event={"ID":"4504c8c4-c68b-49e0-a8e0-67d7445674f9","Type":"ContainerDied","Data":"73ade4c7872643f0be573814de78eb0f832c107339dac707d2878d48ea3ac98c"} Apr 16 18:48:34.719249 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:34.719130 2576 scope.go:117] "RemoveContainer" containerID="bb464e48ad053c98756474189b99f3f835a517d79b0fdfd02f7d30ffb79e3bf9" Apr 16 18:48:34.728831 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:34.728808 2576 scope.go:117] "RemoveContainer" containerID="bb464e48ad053c98756474189b99f3f835a517d79b0fdfd02f7d30ffb79e3bf9" Apr 16 18:48:34.729109 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:48:34.729088 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb464e48ad053c98756474189b99f3f835a517d79b0fdfd02f7d30ffb79e3bf9\": container with ID starting with bb464e48ad053c98756474189b99f3f835a517d79b0fdfd02f7d30ffb79e3bf9 not found: ID does not exist" containerID="bb464e48ad053c98756474189b99f3f835a517d79b0fdfd02f7d30ffb79e3bf9" Apr 16 18:48:34.729182 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:34.729116 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb464e48ad053c98756474189b99f3f835a517d79b0fdfd02f7d30ffb79e3bf9"} err="failed to get container status \"bb464e48ad053c98756474189b99f3f835a517d79b0fdfd02f7d30ffb79e3bf9\": rpc error: code = NotFound desc = could not find container \"bb464e48ad053c98756474189b99f3f835a517d79b0fdfd02f7d30ffb79e3bf9\": container with ID starting with bb464e48ad053c98756474189b99f3f835a517d79b0fdfd02f7d30ffb79e3bf9 not found: ID does not exist" Apr 16 18:48:34.747384 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:34.747348 2576 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-fc44f49f-xtln4"] Apr 16 18:48:34.754618 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:34.754587 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/llmisvc-controller-manager-fc44f49f-xtln4"] Apr 16 18:48:36.506244 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:36.506201 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4504c8c4-c68b-49e0-a8e0-67d7445674f9" path="/var/lib/kubelet/pods/4504c8c4-c68b-49e0-a8e0-67d7445674f9/volumes" Apr 16 18:48:43.100974 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:43.100927 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49" podUID="aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 16 18:48:53.100840 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:48:53.100783 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49" podUID="aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 16 18:49:03.100859 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:03.100765 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49" podUID="aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 16 18:49:10.841789 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:10.841736 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6"] Apr 
16 18:49:10.842157 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:10.842115 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e964dec8-c773-44cd-88f7-2d82b12a10bf" containerName="main" Apr 16 18:49:10.842157 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:10.842131 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e964dec8-c773-44cd-88f7-2d82b12a10bf" containerName="main" Apr 16 18:49:10.842157 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:10.842141 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="26ce4724-d491-43c5-8c98-df0fd25c7359" containerName="main" Apr 16 18:49:10.842157 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:10.842147 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="26ce4724-d491-43c5-8c98-df0fd25c7359" containerName="main" Apr 16 18:49:10.842157 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:10.842154 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4504c8c4-c68b-49e0-a8e0-67d7445674f9" containerName="manager" Apr 16 18:49:10.842157 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:10.842160 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4504c8c4-c68b-49e0-a8e0-67d7445674f9" containerName="manager" Apr 16 18:49:10.842370 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:10.842172 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e964dec8-c773-44cd-88f7-2d82b12a10bf" containerName="storage-initializer" Apr 16 18:49:10.842370 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:10.842178 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e964dec8-c773-44cd-88f7-2d82b12a10bf" containerName="storage-initializer" Apr 16 18:49:10.842370 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:10.842191 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="26ce4724-d491-43c5-8c98-df0fd25c7359" containerName="storage-initializer" Apr 16 
18:49:10.842370 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:10.842199 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="26ce4724-d491-43c5-8c98-df0fd25c7359" containerName="storage-initializer" Apr 16 18:49:10.842370 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:10.842259 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e964dec8-c773-44cd-88f7-2d82b12a10bf" containerName="main" Apr 16 18:49:10.842370 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:10.842276 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="4504c8c4-c68b-49e0-a8e0-67d7445674f9" containerName="manager" Apr 16 18:49:10.842370 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:10.842286 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="26ce4724-d491-43c5-8c98-df0fd25c7359" containerName="main" Apr 16 18:49:10.844324 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:10.844306 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" Apr 16 18:49:10.846987 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:10.846960 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8de1d74aab16d9cabd8b5aafeb5248e8-kserve-self-signed-certs\"" Apr 16 18:49:10.861307 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:10.861274 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6"] Apr 16 18:49:10.940704 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:10.940651 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnl27\" (UniqueName: \"kubernetes.io/projected/a78027f9-c11c-42d4-b5c2-029753f83484-kube-api-access-dnl27\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6\" (UID: \"a78027f9-c11c-42d4-b5c2-029753f83484\") 
" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" Apr 16 18:49:10.940940 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:10.940715 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a78027f9-c11c-42d4-b5c2-029753f83484-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6\" (UID: \"a78027f9-c11c-42d4-b5c2-029753f83484\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" Apr 16 18:49:10.940940 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:10.940851 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a78027f9-c11c-42d4-b5c2-029753f83484-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6\" (UID: \"a78027f9-c11c-42d4-b5c2-029753f83484\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" Apr 16 18:49:10.940940 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:10.940887 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a78027f9-c11c-42d4-b5c2-029753f83484-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6\" (UID: \"a78027f9-c11c-42d4-b5c2-029753f83484\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" Apr 16 18:49:10.940940 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:10.940939 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a78027f9-c11c-42d4-b5c2-029753f83484-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6\" (UID: 
\"a78027f9-c11c-42d4-b5c2-029753f83484\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" Apr 16 18:49:10.941166 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:10.941063 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a78027f9-c11c-42d4-b5c2-029753f83484-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6\" (UID: \"a78027f9-c11c-42d4-b5c2-029753f83484\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" Apr 16 18:49:11.042275 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:11.042219 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a78027f9-c11c-42d4-b5c2-029753f83484-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6\" (UID: \"a78027f9-c11c-42d4-b5c2-029753f83484\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" Apr 16 18:49:11.042275 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:11.042282 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a78027f9-c11c-42d4-b5c2-029753f83484-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6\" (UID: \"a78027f9-c11c-42d4-b5c2-029753f83484\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" Apr 16 18:49:11.042535 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:11.042310 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a78027f9-c11c-42d4-b5c2-029753f83484-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6\" (UID: \"a78027f9-c11c-42d4-b5c2-029753f83484\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" Apr 16 18:49:11.042535 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:11.042403 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a78027f9-c11c-42d4-b5c2-029753f83484-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6\" (UID: \"a78027f9-c11c-42d4-b5c2-029753f83484\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" Apr 16 18:49:11.042535 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:11.042481 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a78027f9-c11c-42d4-b5c2-029753f83484-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6\" (UID: \"a78027f9-c11c-42d4-b5c2-029753f83484\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" Apr 16 18:49:11.042688 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:11.042550 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dnl27\" (UniqueName: \"kubernetes.io/projected/a78027f9-c11c-42d4-b5c2-029753f83484-kube-api-access-dnl27\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6\" (UID: \"a78027f9-c11c-42d4-b5c2-029753f83484\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" Apr 16 18:49:11.043682 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:11.042848 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a78027f9-c11c-42d4-b5c2-029753f83484-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6\" (UID: \"a78027f9-c11c-42d4-b5c2-029753f83484\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" Apr 16 18:49:11.043682 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:11.042921 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a78027f9-c11c-42d4-b5c2-029753f83484-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6\" (UID: \"a78027f9-c11c-42d4-b5c2-029753f83484\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" Apr 16 18:49:11.043682 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:11.043220 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a78027f9-c11c-42d4-b5c2-029753f83484-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6\" (UID: \"a78027f9-c11c-42d4-b5c2-029753f83484\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" Apr 16 18:49:11.045267 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:11.045242 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a78027f9-c11c-42d4-b5c2-029753f83484-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6\" (UID: \"a78027f9-c11c-42d4-b5c2-029753f83484\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" Apr 16 18:49:11.045520 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:11.045485 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a78027f9-c11c-42d4-b5c2-029753f83484-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6\" (UID: \"a78027f9-c11c-42d4-b5c2-029753f83484\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" Apr 16 18:49:11.055321 
ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:11.055293 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnl27\" (UniqueName: \"kubernetes.io/projected/a78027f9-c11c-42d4-b5c2-029753f83484-kube-api-access-dnl27\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6\" (UID: \"a78027f9-c11c-42d4-b5c2-029753f83484\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" Apr 16 18:49:11.154617 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:11.154506 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" Apr 16 18:49:11.335874 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:11.335823 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6"] Apr 16 18:49:11.339792 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:49:11.339733 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda78027f9_c11c_42d4_b5c2_029753f83484.slice/crio-3b57fe4f0a945db436110b8d40d1372ddb3ab90efb235c60c9b638d2ea7e5d1a WatchSource:0}: Error finding container 3b57fe4f0a945db436110b8d40d1372ddb3ab90efb235c60c9b638d2ea7e5d1a: Status 404 returned error can't find the container with id 3b57fe4f0a945db436110b8d40d1372ddb3ab90efb235c60c9b638d2ea7e5d1a Apr 16 18:49:11.870997 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:11.870958 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" event={"ID":"a78027f9-c11c-42d4-b5c2-029753f83484","Type":"ContainerStarted","Data":"b30656f64555ae5c0028fb0ec069c398106ed98b9cd8e15864caf71ea633be38"} Apr 16 18:49:11.870997 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:11.871001 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" event={"ID":"a78027f9-c11c-42d4-b5c2-029753f83484","Type":"ContainerStarted","Data":"3b57fe4f0a945db436110b8d40d1372ddb3ab90efb235c60c9b638d2ea7e5d1a"} Apr 16 18:49:13.120894 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:13.120853 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49" Apr 16 18:49:13.138989 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:13.138951 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49" Apr 16 18:49:15.888060 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:15.888011 2576 generic.go:358] "Generic (PLEG): container finished" podID="a78027f9-c11c-42d4-b5c2-029753f83484" containerID="b30656f64555ae5c0028fb0ec069c398106ed98b9cd8e15864caf71ea633be38" exitCode=0 Apr 16 18:49:15.888433 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:15.888097 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" event={"ID":"a78027f9-c11c-42d4-b5c2-029753f83484","Type":"ContainerDied","Data":"b30656f64555ae5c0028fb0ec069c398106ed98b9cd8e15864caf71ea633be38"} Apr 16 18:49:16.894147 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:16.894105 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" event={"ID":"a78027f9-c11c-42d4-b5c2-029753f83484","Type":"ContainerStarted","Data":"55fd007168d1a584abd3aa97b56415c24859db757c39af2a8946f321d1186007"} Apr 16 18:49:16.923103 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:16.923038 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" 
podStartSLOduration=6.923016684 podStartE2EDuration="6.923016684s" podCreationTimestamp="2026-04-16 18:49:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:49:16.920006968 +0000 UTC m=+1083.044534631" watchObservedRunningTime="2026-04-16 18:49:16.923016684 +0000 UTC m=+1083.047544348" Apr 16 18:49:21.155068 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:21.155025 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" Apr 16 18:49:21.155561 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:21.155154 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" Apr 16 18:49:21.157025 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:21.156989 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" podUID="a78027f9-c11c-42d4-b5c2-029753f83484" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused" Apr 16 18:49:24.671979 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:24.671935 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49"] Apr 16 18:49:24.672461 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:24.672315 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49" podUID="aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6" containerName="main" containerID="cri-o://5147a3a992ef5111f86a4c78e6a27066dd56fa36298b31c97430aae2e03a36bc" gracePeriod=30 Apr 16 18:49:29.874559 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:29.874521 2576 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r"] Apr 16 18:49:29.911577 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:29.911544 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf"] Apr 16 18:49:29.911813 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:29.911718 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" Apr 16 18:49:29.918374 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:29.918345 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 16 18:49:29.919716 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:29.919687 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-dockercfg-bs9kp\"" Apr 16 18:49:29.938987 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:29.938950 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r"] Apr 16 18:49:29.938987 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:29.938981 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf"] Apr 16 18:49:29.939209 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:29.939124 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" Apr 16 18:49:30.021094 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:30.021050 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/131857a1-bbae-4961-9aad-dbf5d35f2f7a-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r\" (UID: \"131857a1-bbae-4961-9aad-dbf5d35f2f7a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" Apr 16 18:49:30.021295 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:30.021102 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3f5b23b3-20a2-4783-8213-bc6d78b40f1e-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf\" (UID: \"3f5b23b3-20a2-4783-8213-bc6d78b40f1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" Apr 16 18:49:30.021295 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:30.021130 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3f5b23b3-20a2-4783-8213-bc6d78b40f1e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf\" (UID: \"3f5b23b3-20a2-4783-8213-bc6d78b40f1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" Apr 16 18:49:30.021295 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:30.021189 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzddw\" (UniqueName: \"kubernetes.io/projected/3f5b23b3-20a2-4783-8213-bc6d78b40f1e-kube-api-access-qzddw\") pod 
\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf\" (UID: \"3f5b23b3-20a2-4783-8213-bc6d78b40f1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" Apr 16 18:49:30.021295 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:30.021215 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3f5b23b3-20a2-4783-8213-bc6d78b40f1e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf\" (UID: \"3f5b23b3-20a2-4783-8213-bc6d78b40f1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" Apr 16 18:49:30.021295 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:30.021238 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3f5b23b3-20a2-4783-8213-bc6d78b40f1e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf\" (UID: \"3f5b23b3-20a2-4783-8213-bc6d78b40f1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" Apr 16 18:49:30.021526 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:30.021298 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/131857a1-bbae-4961-9aad-dbf5d35f2f7a-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r\" (UID: \"131857a1-bbae-4961-9aad-dbf5d35f2f7a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" Apr 16 18:49:30.021526 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:30.021325 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/131857a1-bbae-4961-9aad-dbf5d35f2f7a-kserve-provision-location\") pod 
\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r\" (UID: \"131857a1-bbae-4961-9aad-dbf5d35f2f7a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" Apr 16 18:49:30.021526 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:30.021369 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/131857a1-bbae-4961-9aad-dbf5d35f2f7a-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r\" (UID: \"131857a1-bbae-4961-9aad-dbf5d35f2f7a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" Apr 16 18:49:30.021526 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:30.021397 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/131857a1-bbae-4961-9aad-dbf5d35f2f7a-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r\" (UID: \"131857a1-bbae-4961-9aad-dbf5d35f2f7a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" Apr 16 18:49:30.021526 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:30.021459 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk4xs\" (UniqueName: \"kubernetes.io/projected/131857a1-bbae-4961-9aad-dbf5d35f2f7a-kube-api-access-mk4xs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r\" (UID: \"131857a1-bbae-4961-9aad-dbf5d35f2f7a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" Apr 16 18:49:30.021526 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:30.021488 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3f5b23b3-20a2-4783-8213-bc6d78b40f1e-tls-certs\") pod 
\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf\" (UID: \"3f5b23b3-20a2-4783-8213-bc6d78b40f1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" Apr 16 18:49:30.122571 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:30.122534 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mk4xs\" (UniqueName: \"kubernetes.io/projected/131857a1-bbae-4961-9aad-dbf5d35f2f7a-kube-api-access-mk4xs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r\" (UID: \"131857a1-bbae-4961-9aad-dbf5d35f2f7a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" Apr 16 18:49:30.122571 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:30.122570 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3f5b23b3-20a2-4783-8213-bc6d78b40f1e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf\" (UID: \"3f5b23b3-20a2-4783-8213-bc6d78b40f1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" Apr 16 18:49:30.122853 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:30.122597 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/131857a1-bbae-4961-9aad-dbf5d35f2f7a-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r\" (UID: \"131857a1-bbae-4961-9aad-dbf5d35f2f7a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" Apr 16 18:49:30.122853 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:30.122635 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3f5b23b3-20a2-4783-8213-bc6d78b40f1e-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf\" (UID: 
\"3f5b23b3-20a2-4783-8213-bc6d78b40f1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" Apr 16 18:49:30.122853 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:30.122787 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3f5b23b3-20a2-4783-8213-bc6d78b40f1e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf\" (UID: \"3f5b23b3-20a2-4783-8213-bc6d78b40f1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" Apr 16 18:49:30.123019 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:30.122919 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qzddw\" (UniqueName: \"kubernetes.io/projected/3f5b23b3-20a2-4783-8213-bc6d78b40f1e-kube-api-access-qzddw\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf\" (UID: \"3f5b23b3-20a2-4783-8213-bc6d78b40f1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" Apr 16 18:49:30.123123 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:30.123090 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/131857a1-bbae-4961-9aad-dbf5d35f2f7a-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r\" (UID: \"131857a1-bbae-4961-9aad-dbf5d35f2f7a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" Apr 16 18:49:30.123252 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:30.123092 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3f5b23b3-20a2-4783-8213-bc6d78b40f1e-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf\" (UID: \"3f5b23b3-20a2-4783-8213-bc6d78b40f1e\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" Apr 16 18:49:30.123252 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:30.123155 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3f5b23b3-20a2-4783-8213-bc6d78b40f1e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf\" (UID: \"3f5b23b3-20a2-4783-8213-bc6d78b40f1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" Apr 16 18:49:30.123252 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:30.123101 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3f5b23b3-20a2-4783-8213-bc6d78b40f1e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf\" (UID: \"3f5b23b3-20a2-4783-8213-bc6d78b40f1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" Apr 16 18:49:30.123252 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:30.123226 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3f5b23b3-20a2-4783-8213-bc6d78b40f1e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf\" (UID: \"3f5b23b3-20a2-4783-8213-bc6d78b40f1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" Apr 16 18:49:30.123676 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:30.123271 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/131857a1-bbae-4961-9aad-dbf5d35f2f7a-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r\" (UID: \"131857a1-bbae-4961-9aad-dbf5d35f2f7a\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" Apr 16 18:49:30.123676 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:30.123306 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/131857a1-bbae-4961-9aad-dbf5d35f2f7a-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r\" (UID: \"131857a1-bbae-4961-9aad-dbf5d35f2f7a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" Apr 16 18:49:30.123676 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:30.123309 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3f5b23b3-20a2-4783-8213-bc6d78b40f1e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf\" (UID: \"3f5b23b3-20a2-4783-8213-bc6d78b40f1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" Apr 16 18:49:30.123676 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:30.123375 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/131857a1-bbae-4961-9aad-dbf5d35f2f7a-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r\" (UID: \"131857a1-bbae-4961-9aad-dbf5d35f2f7a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" Apr 16 18:49:30.123676 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:30.123410 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/131857a1-bbae-4961-9aad-dbf5d35f2f7a-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r\" (UID: \"131857a1-bbae-4961-9aad-dbf5d35f2f7a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" 
Apr 16 18:49:30.123676 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:30.123611 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/131857a1-bbae-4961-9aad-dbf5d35f2f7a-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r\" (UID: \"131857a1-bbae-4961-9aad-dbf5d35f2f7a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" Apr 16 18:49:30.123676 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:30.123660 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/131857a1-bbae-4961-9aad-dbf5d35f2f7a-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r\" (UID: \"131857a1-bbae-4961-9aad-dbf5d35f2f7a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" Apr 16 18:49:30.126100 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:30.126012 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/131857a1-bbae-4961-9aad-dbf5d35f2f7a-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r\" (UID: \"131857a1-bbae-4961-9aad-dbf5d35f2f7a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" Apr 16 18:49:30.126100 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:30.126046 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3f5b23b3-20a2-4783-8213-bc6d78b40f1e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf\" (UID: \"3f5b23b3-20a2-4783-8213-bc6d78b40f1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" Apr 16 18:49:30.126100 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:30.126070 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3f5b23b3-20a2-4783-8213-bc6d78b40f1e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf\" (UID: \"3f5b23b3-20a2-4783-8213-bc6d78b40f1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" Apr 16 18:49:30.126320 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:30.126288 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/131857a1-bbae-4961-9aad-dbf5d35f2f7a-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r\" (UID: \"131857a1-bbae-4961-9aad-dbf5d35f2f7a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" Apr 16 18:49:30.132443 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:30.132412 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzddw\" (UniqueName: \"kubernetes.io/projected/3f5b23b3-20a2-4783-8213-bc6d78b40f1e-kube-api-access-qzddw\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf\" (UID: \"3f5b23b3-20a2-4783-8213-bc6d78b40f1e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" Apr 16 18:49:30.132612 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:30.132588 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk4xs\" (UniqueName: \"kubernetes.io/projected/131857a1-bbae-4961-9aad-dbf5d35f2f7a-kube-api-access-mk4xs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r\" (UID: \"131857a1-bbae-4961-9aad-dbf5d35f2f7a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" Apr 16 18:49:30.223306 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:30.223265 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" Apr 16 18:49:30.252261 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:30.252222 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" Apr 16 18:49:30.393920 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:30.393873 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r"] Apr 16 18:49:30.394315 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:49:30.394289 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod131857a1_bbae_4961_9aad_dbf5d35f2f7a.slice/crio-248a1ced5d5888e3a5066f775873a5759aaea07a391fab1ff8cfe7553d9b810c WatchSource:0}: Error finding container 248a1ced5d5888e3a5066f775873a5759aaea07a391fab1ff8cfe7553d9b810c: Status 404 returned error can't find the container with id 248a1ced5d5888e3a5066f775873a5759aaea07a391fab1ff8cfe7553d9b810c Apr 16 18:49:30.421769 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:30.421723 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf"] Apr 16 18:49:30.424156 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:49:30.424123 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f5b23b3_20a2_4783_8213_bc6d78b40f1e.slice/crio-9151c495e53f65ffd1fdb9f68020dd4735a4b6ace2208003d1a5b1ca08192a27 WatchSource:0}: Error finding container 9151c495e53f65ffd1fdb9f68020dd4735a4b6ace2208003d1a5b1ca08192a27: Status 404 returned error can't find the container with id 9151c495e53f65ffd1fdb9f68020dd4735a4b6ace2208003d1a5b1ca08192a27 Apr 16 18:49:30.955865 ip-10-0-140-154 kubenswrapper[2576]: I0416 
18:49:30.955787 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" event={"ID":"3f5b23b3-20a2-4783-8213-bc6d78b40f1e","Type":"ContainerStarted","Data":"7e4829fade3a4652b85490a0c17a85381b1644383ad0b11d9351f19e8165c9bd"}
Apr 16 18:49:30.955865 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:30.955839 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" event={"ID":"3f5b23b3-20a2-4783-8213-bc6d78b40f1e","Type":"ContainerStarted","Data":"9151c495e53f65ffd1fdb9f68020dd4735a4b6ace2208003d1a5b1ca08192a27"}
Apr 16 18:49:30.957585 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:30.957551 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" event={"ID":"131857a1-bbae-4961-9aad-dbf5d35f2f7a","Type":"ContainerStarted","Data":"248a1ced5d5888e3a5066f775873a5759aaea07a391fab1ff8cfe7553d9b810c"}
Apr 16 18:49:31.155257 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:31.155184 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" podUID="a78027f9-c11c-42d4-b5c2-029753f83484" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused"
Apr 16 18:49:31.964252 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:31.964202 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" event={"ID":"131857a1-bbae-4961-9aad-dbf5d35f2f7a","Type":"ContainerStarted","Data":"b76954e1e4b2bb4cfd5537e71815c470386303b96b76bcd31136c71cfb34e765"}
Apr 16 18:49:32.969821 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:32.969781 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" event={"ID":"131857a1-bbae-4961-9aad-dbf5d35f2f7a","Type":"ContainerStarted","Data":"379fa597695df33d4416d786cbe473c09599175ef6e8a301c5e837eb10480f46"}
Apr 16 18:49:32.970217 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:32.970034 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r"
Apr 16 18:49:35.987357 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:35.987321 2576 generic.go:358] "Generic (PLEG): container finished" podID="3f5b23b3-20a2-4783-8213-bc6d78b40f1e" containerID="7e4829fade3a4652b85490a0c17a85381b1644383ad0b11d9351f19e8165c9bd" exitCode=0
Apr 16 18:49:35.987920 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:35.987392 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" event={"ID":"3f5b23b3-20a2-4783-8213-bc6d78b40f1e","Type":"ContainerDied","Data":"7e4829fade3a4652b85490a0c17a85381b1644383ad0b11d9351f19e8165c9bd"}
Apr 16 18:49:36.994274 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:36.994225 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" event={"ID":"3f5b23b3-20a2-4783-8213-bc6d78b40f1e","Type":"ContainerStarted","Data":"65aa909332d6e80a3cc4985043643fc37a0838468c0d92bb464427ec21f95e1f"}
Apr 16 18:49:36.996239 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:36.996212 2576 generic.go:358] "Generic (PLEG): container finished" podID="131857a1-bbae-4961-9aad-dbf5d35f2f7a" containerID="379fa597695df33d4416d786cbe473c09599175ef6e8a301c5e837eb10480f46" exitCode=0
Apr 16 18:49:36.996380 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:36.996285 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" event={"ID":"131857a1-bbae-4961-9aad-dbf5d35f2f7a","Type":"ContainerDied","Data":"379fa597695df33d4416d786cbe473c09599175ef6e8a301c5e837eb10480f46"}
Apr 16 18:49:37.016695 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:37.016630 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" podStartSLOduration=8.016609462 podStartE2EDuration="8.016609462s" podCreationTimestamp="2026-04-16 18:49:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:49:37.012841622 +0000 UTC m=+1103.137369286" watchObservedRunningTime="2026-04-16 18:49:37.016609462 +0000 UTC m=+1103.141137126"
Apr 16 18:49:38.003578 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:38.003545 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" event={"ID":"131857a1-bbae-4961-9aad-dbf5d35f2f7a","Type":"ContainerStarted","Data":"a1601a7fe1a9cd3d7f06301b20c477f2bb094428ee65329ac3e82b9d95796d08"}
Apr 16 18:49:38.027296 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:38.027234 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" podStartSLOduration=7.6441024429999995 podStartE2EDuration="9.027213428s" podCreationTimestamp="2026-04-16 18:49:29 +0000 UTC" firstStartedPulling="2026-04-16 18:49:30.396423339 +0000 UTC m=+1096.520950996" lastFinishedPulling="2026-04-16 18:49:31.779534336 +0000 UTC m=+1097.904061981" observedRunningTime="2026-04-16 18:49:38.026140903 +0000 UTC m=+1104.150668611" watchObservedRunningTime="2026-04-16 18:49:38.027213428 +0000 UTC m=+1104.151741092"
Apr 16 18:49:40.223859 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:40.223806 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r"
Apr 16 18:49:40.223859 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:40.223860 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r"
Apr 16 18:49:40.225584 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:40.225543 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" podUID="131857a1-bbae-4961-9aad-dbf5d35f2f7a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8001/health\": dial tcp 10.134.0.52:8001: connect: connection refused"
Apr 16 18:49:40.253186 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:40.253127 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf"
Apr 16 18:49:40.253362 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:40.253204 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf"
Apr 16 18:49:40.254852 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:40.254818 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" podUID="3f5b23b3-20a2-4783-8213-bc6d78b40f1e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused"
Apr 16 18:49:41.156021 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:41.155967 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" podUID="a78027f9-c11c-42d4-b5c2-029753f83484" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused"
Apr 16 18:49:50.224681 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:50.224355 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" podUID="131857a1-bbae-4961-9aad-dbf5d35f2f7a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8001/health\": dial tcp 10.134.0.52:8001: connect: connection refused"
Apr 16 18:49:50.254073 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:50.254025 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" podUID="3f5b23b3-20a2-4783-8213-bc6d78b40f1e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused"
Apr 16 18:49:50.254461 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:50.254437 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r"
Apr 16 18:49:51.155963 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:51.155914 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" podUID="a78027f9-c11c-42d4-b5c2-029753f83484" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused"
Apr 16 18:49:55.049914 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:55.049805 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-9bbdcd449-9kx49_aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6/main/0.log"
Apr 16 18:49:55.050892 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:55.050869 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49"
Apr 16 18:49:55.090286 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:55.090254 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-9bbdcd449-9kx49_aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6/main/0.log"
Apr 16 18:49:55.090887 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:55.090844 2576 generic.go:358] "Generic (PLEG): container finished" podID="aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6" containerID="5147a3a992ef5111f86a4c78e6a27066dd56fa36298b31c97430aae2e03a36bc" exitCode=137
Apr 16 18:49:55.091046 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:55.091005 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49" event={"ID":"aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6","Type":"ContainerDied","Data":"5147a3a992ef5111f86a4c78e6a27066dd56fa36298b31c97430aae2e03a36bc"}
Apr 16 18:49:55.091046 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:55.091037 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49" event={"ID":"aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6","Type":"ContainerDied","Data":"145e2128dcb748ade3ee1110a35ef38f39a1e72d8c0fe72b91510127bb858a85"}
Apr 16 18:49:55.091178 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:55.091058 2576 scope.go:117] "RemoveContainer" containerID="5147a3a992ef5111f86a4c78e6a27066dd56fa36298b31c97430aae2e03a36bc"
Apr 16 18:49:55.091395 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:55.091374 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49"
Apr 16 18:49:55.137115 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:55.137070 2576 scope.go:117] "RemoveContainer" containerID="6dead1401aeda2a563153917d0b7ab9800c6f30504f1c43df0887ab6087336d6"
Apr 16 18:49:55.165578 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:55.165539 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6-kserve-provision-location\") pod \"aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6\" (UID: \"aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6\") "
Apr 16 18:49:55.165813 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:55.165634 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6-tls-certs\") pod \"aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6\" (UID: \"aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6\") "
Apr 16 18:49:55.166141 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:55.166117 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6-dshm\") pod \"aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6\" (UID: \"aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6\") "
Apr 16 18:49:55.166513 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:55.166494 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6-home\") pod \"aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6\" (UID: \"aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6\") "
Apr 16 18:49:55.166974 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:55.166943 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6-home" (OuterVolumeSpecName: "home") pod "aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6" (UID: "aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:49:55.167130 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:55.167112 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8qwr\" (UniqueName: \"kubernetes.io/projected/aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6-kube-api-access-h8qwr\") pod \"aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6\" (UID: \"aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6\") "
Apr 16 18:49:55.167347 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:55.167332 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6-model-cache\") pod \"aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6\" (UID: \"aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6\") "
Apr 16 18:49:55.167760 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:55.167643 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6-model-cache" (OuterVolumeSpecName: "model-cache") pod "aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6" (UID: "aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:49:55.168133 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:55.168115 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6-home\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:49:55.169499 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:55.169340 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6-model-cache\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:49:55.170134 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:55.170096 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6-dshm" (OuterVolumeSpecName: "dshm") pod "aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6" (UID: "aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:49:55.170389 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:55.170346 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6" (UID: "aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:49:55.171858 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:55.171834 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6-kube-api-access-h8qwr" (OuterVolumeSpecName: "kube-api-access-h8qwr") pod "aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6" (UID: "aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6"). InnerVolumeSpecName "kube-api-access-h8qwr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:49:55.246027 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:55.245971 2576 scope.go:117] "RemoveContainer" containerID="5147a3a992ef5111f86a4c78e6a27066dd56fa36298b31c97430aae2e03a36bc"
Apr 16 18:49:55.246447 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:49:55.246416 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5147a3a992ef5111f86a4c78e6a27066dd56fa36298b31c97430aae2e03a36bc\": container with ID starting with 5147a3a992ef5111f86a4c78e6a27066dd56fa36298b31c97430aae2e03a36bc not found: ID does not exist" containerID="5147a3a992ef5111f86a4c78e6a27066dd56fa36298b31c97430aae2e03a36bc"
Apr 16 18:49:55.246629 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:55.246590 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5147a3a992ef5111f86a4c78e6a27066dd56fa36298b31c97430aae2e03a36bc"} err="failed to get container status \"5147a3a992ef5111f86a4c78e6a27066dd56fa36298b31c97430aae2e03a36bc\": rpc error: code = NotFound desc = could not find container \"5147a3a992ef5111f86a4c78e6a27066dd56fa36298b31c97430aae2e03a36bc\": container with ID starting with 5147a3a992ef5111f86a4c78e6a27066dd56fa36298b31c97430aae2e03a36bc not found: ID does not exist"
Apr 16 18:49:55.246629 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:55.246627 2576 scope.go:117] "RemoveContainer" containerID="6dead1401aeda2a563153917d0b7ab9800c6f30504f1c43df0887ab6087336d6"
Apr 16 18:49:55.247002 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:49:55.246974 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dead1401aeda2a563153917d0b7ab9800c6f30504f1c43df0887ab6087336d6\": container with ID starting with 6dead1401aeda2a563153917d0b7ab9800c6f30504f1c43df0887ab6087336d6 not found: ID does not exist" containerID="6dead1401aeda2a563153917d0b7ab9800c6f30504f1c43df0887ab6087336d6"
Apr 16 18:49:55.247116 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:55.247010 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dead1401aeda2a563153917d0b7ab9800c6f30504f1c43df0887ab6087336d6"} err="failed to get container status \"6dead1401aeda2a563153917d0b7ab9800c6f30504f1c43df0887ab6087336d6\": rpc error: code = NotFound desc = could not find container \"6dead1401aeda2a563153917d0b7ab9800c6f30504f1c43df0887ab6087336d6\": container with ID starting with 6dead1401aeda2a563153917d0b7ab9800c6f30504f1c43df0887ab6087336d6 not found: ID does not exist"
Apr 16 18:49:55.257825 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:55.257289 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6" (UID: "aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:49:55.270766 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:55.270551 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6-tls-certs\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:49:55.270766 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:55.270592 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6-dshm\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:49:55.270766 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:55.270608 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h8qwr\" (UniqueName: \"kubernetes.io/projected/aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6-kube-api-access-h8qwr\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:49:55.270766 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:55.270624 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6-kserve-provision-location\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:49:55.430925 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:55.430882 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49"]
Apr 16 18:49:55.435443 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:55.435401 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-9bbdcd449-9kx49"]
Apr 16 18:49:56.509146 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:49:56.509097 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6" path="/var/lib/kubelet/pods/aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6/volumes"
Apr 16 18:50:00.224659 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:50:00.224606 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" podUID="131857a1-bbae-4961-9aad-dbf5d35f2f7a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8001/health\": dial tcp 10.134.0.52:8001: connect: connection refused"
Apr 16 18:50:00.253322 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:50:00.253261 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" podUID="3f5b23b3-20a2-4783-8213-bc6d78b40f1e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused"
Apr 16 18:50:01.155706 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:50:01.155658 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" podUID="a78027f9-c11c-42d4-b5c2-029753f83484" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused"
Apr 16 18:50:10.223789 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:50:10.223688 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" podUID="131857a1-bbae-4961-9aad-dbf5d35f2f7a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8001/health\": dial tcp 10.134.0.52:8001: connect: connection refused"
Apr 16 18:50:10.253391 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:50:10.253344 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" podUID="3f5b23b3-20a2-4783-8213-bc6d78b40f1e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused"
Apr 16 18:50:11.156014 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:50:11.155957 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" podUID="a78027f9-c11c-42d4-b5c2-029753f83484" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused"
Apr 16 18:50:20.225015 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:50:20.224608 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" podUID="131857a1-bbae-4961-9aad-dbf5d35f2f7a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8001/health\": dial tcp 10.134.0.52:8001: connect: connection refused"
Apr 16 18:50:20.253518 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:50:20.253468 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" podUID="3f5b23b3-20a2-4783-8213-bc6d78b40f1e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused"
Apr 16 18:50:21.155371 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:50:21.155314 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" podUID="a78027f9-c11c-42d4-b5c2-029753f83484" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused"
Apr 16 18:50:30.224444 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:50:30.224375 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" podUID="131857a1-bbae-4961-9aad-dbf5d35f2f7a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8001/health\": dial tcp 10.134.0.52:8001: connect: connection refused"
Apr 16 18:50:30.253666 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:50:30.253617 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" podUID="3f5b23b3-20a2-4783-8213-bc6d78b40f1e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused"
Apr 16 18:50:31.155083 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:50:31.155024 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" podUID="a78027f9-c11c-42d4-b5c2-029753f83484" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused"
Apr 16 18:50:40.223852 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:50:40.223729 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" podUID="131857a1-bbae-4961-9aad-dbf5d35f2f7a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8001/health\": dial tcp 10.134.0.52:8001: connect: connection refused"
Apr 16 18:50:40.253376 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:50:40.253323 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" podUID="3f5b23b3-20a2-4783-8213-bc6d78b40f1e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused"
Apr 16 18:50:41.155738 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:50:41.155686 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" podUID="a78027f9-c11c-42d4-b5c2-029753f83484" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused"
Apr 16 18:50:50.224655 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:50:50.224594 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" podUID="131857a1-bbae-4961-9aad-dbf5d35f2f7a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8001/health\": dial tcp 10.134.0.52:8001: connect: connection refused"
Apr 16 18:50:50.252961 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:50:50.252917 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" podUID="3f5b23b3-20a2-4783-8213-bc6d78b40f1e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused"
Apr 16 18:50:51.155121 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:50:51.155063 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" podUID="a78027f9-c11c-42d4-b5c2-029753f83484" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused"
Apr 16 18:51:00.223850 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:51:00.223788 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" podUID="131857a1-bbae-4961-9aad-dbf5d35f2f7a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8001/health\": dial tcp 10.134.0.52:8001: connect: connection refused"
Apr 16 18:51:00.253087 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:51:00.253049 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" podUID="3f5b23b3-20a2-4783-8213-bc6d78b40f1e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused"
Apr 16 18:51:01.154967 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:51:01.154913 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" podUID="a78027f9-c11c-42d4-b5c2-029753f83484" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused"
Apr 16 18:51:10.224191 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:51:10.224141 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" podUID="131857a1-bbae-4961-9aad-dbf5d35f2f7a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8001/health\": dial tcp 10.134.0.52:8001: connect: connection refused"
Apr 16 18:51:10.253489 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:51:10.253449 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" podUID="3f5b23b3-20a2-4783-8213-bc6d78b40f1e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused"
Apr 16 18:51:11.155385 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:51:11.155326 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" podUID="a78027f9-c11c-42d4-b5c2-029753f83484" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused"
Apr 16 18:51:14.510894 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:51:14.510865 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal_28c07a342a30c0e354482d7284dcbb2c/kube-rbac-proxy-crio/2.log"
Apr 16 18:51:14.515952 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:51:14.515927 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal_28c07a342a30c0e354482d7284dcbb2c/kube-rbac-proxy-crio/2.log"
Apr 16 18:51:20.224653 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:51:20.224604 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" podUID="131857a1-bbae-4961-9aad-dbf5d35f2f7a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8001/health\": dial tcp 10.134.0.52:8001: connect: connection refused"
Apr 16 18:51:20.253574 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:51:20.253515 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" podUID="3f5b23b3-20a2-4783-8213-bc6d78b40f1e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused"
Apr 16 18:51:21.155367 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:51:21.155318 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" podUID="a78027f9-c11c-42d4-b5c2-029753f83484" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused"
Apr 16 18:51:30.224372 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:51:30.224303 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" podUID="131857a1-bbae-4961-9aad-dbf5d35f2f7a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8001/health\": dial tcp 10.134.0.52:8001: connect: connection refused"
Apr 16 18:51:30.253472 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:51:30.253433 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" podUID="3f5b23b3-20a2-4783-8213-bc6d78b40f1e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused"
Apr 16 18:51:31.155448 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:51:31.155400 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" podUID="a78027f9-c11c-42d4-b5c2-029753f83484" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused"
Apr 16 18:51:40.224619 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:51:40.224568 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" podUID="131857a1-bbae-4961-9aad-dbf5d35f2f7a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8001/health\": dial tcp 10.134.0.52:8001: connect: connection refused"
Apr 16 18:51:40.253338 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:51:40.253290 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" podUID="3f5b23b3-20a2-4783-8213-bc6d78b40f1e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused"
Apr 16 18:51:41.155229 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:51:41.155180 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" podUID="a78027f9-c11c-42d4-b5c2-029753f83484" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused"
Apr 16 18:51:50.224199 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:51:50.224148 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" podUID="131857a1-bbae-4961-9aad-dbf5d35f2f7a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8001/health\": dial tcp 10.134.0.52:8001: connect: connection refused"
Apr 16 18:51:50.253563 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:51:50.253512 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" podUID="3f5b23b3-20a2-4783-8213-bc6d78b40f1e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused"
Apr 16 18:51:51.165074 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:51:51.165033 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6"
Apr 16 18:51:51.173170 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:51:51.173136 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6"
Apr 16 18:52:00.223863 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:00.223815 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" podUID="131857a1-bbae-4961-9aad-dbf5d35f2f7a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8001/health\": dial tcp 10.134.0.52:8001: connect: connection refused"
Apr 16 18:52:00.253614 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:00.253575 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" podUID="3f5b23b3-20a2-4783-8213-bc6d78b40f1e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused"
Apr 16 18:52:03.180844 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:03.180735 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6"]
Apr 16 18:52:03.181286 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:03.181072 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" podUID="a78027f9-c11c-42d4-b5c2-029753f83484" containerName="main" containerID="cri-o://55fd007168d1a584abd3aa97b56415c24859db757c39af2a8946f321d1186007" gracePeriod=30
Apr 16 18:52:10.223827 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:10.223771 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" podUID="131857a1-bbae-4961-9aad-dbf5d35f2f7a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8001/health\": dial tcp 10.134.0.52:8001: connect: connection refused"
Apr 16 18:52:10.253346 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:10.253295 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf"
podUID="3f5b23b3-20a2-4783-8213-bc6d78b40f1e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused" Apr 16 18:52:13.002835 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.002800 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 16 18:52:13.003222 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.003154 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6" containerName="main" Apr 16 18:52:13.003222 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.003165 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6" containerName="main" Apr 16 18:52:13.003222 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.003190 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6" containerName="storage-initializer" Apr 16 18:52:13.003222 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.003195 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6" containerName="storage-initializer" Apr 16 18:52:13.003355 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.003249 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa5ee4f1-adeb-4a17-8a24-2f6125ff44a6" containerName="main" Apr 16 18:52:13.007060 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.007041 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:52:13.010495 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.010462 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 16 18:52:13.010613 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.010462 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-qqmnd\"" Apr 16 18:52:13.017822 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.017792 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 16 18:52:13.043193 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.043147 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8e5697f0-bd88-45a8-a73b-2b20ff90bd29-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"8e5697f0-bd88-45a8-a73b-2b20ff90bd29\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:52:13.043356 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.043264 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8e5697f0-bd88-45a8-a73b-2b20ff90bd29-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"8e5697f0-bd88-45a8-a73b-2b20ff90bd29\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:52:13.043356 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.043331 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" 
(UniqueName: \"kubernetes.io/secret/8e5697f0-bd88-45a8-a73b-2b20ff90bd29-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"8e5697f0-bd88-45a8-a73b-2b20ff90bd29\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:52:13.043444 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.043390 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5k6c\" (UniqueName: \"kubernetes.io/projected/8e5697f0-bd88-45a8-a73b-2b20ff90bd29-kube-api-access-q5k6c\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"8e5697f0-bd88-45a8-a73b-2b20ff90bd29\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:52:13.043444 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.043433 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8e5697f0-bd88-45a8-a73b-2b20ff90bd29-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"8e5697f0-bd88-45a8-a73b-2b20ff90bd29\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:52:13.043539 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.043493 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8e5697f0-bd88-45a8-a73b-2b20ff90bd29-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"8e5697f0-bd88-45a8-a73b-2b20ff90bd29\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:52:13.061335 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.061300 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 18:52:13.065772 ip-10-0-140-154 kubenswrapper[2576]: I0416 
18:52:13.065754 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 18:52:13.073902 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.073875 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 18:52:13.144286 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.144246 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8e5697f0-bd88-45a8-a73b-2b20ff90bd29-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"8e5697f0-bd88-45a8-a73b-2b20ff90bd29\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:52:13.144486 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.144299 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/71e8f634-277f-41a1-9c62-78279d199716-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"71e8f634-277f-41a1-9c62-78279d199716\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 18:52:13.144486 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.144343 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8e5697f0-bd88-45a8-a73b-2b20ff90bd29-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"8e5697f0-bd88-45a8-a73b-2b20ff90bd29\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:52:13.144486 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.144371 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/71e8f634-277f-41a1-9c62-78279d199716-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"71e8f634-277f-41a1-9c62-78279d199716\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 18:52:13.144486 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.144441 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8e5697f0-bd88-45a8-a73b-2b20ff90bd29-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"8e5697f0-bd88-45a8-a73b-2b20ff90bd29\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:52:13.144486 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.144481 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8e5697f0-bd88-45a8-a73b-2b20ff90bd29-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"8e5697f0-bd88-45a8-a73b-2b20ff90bd29\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:52:13.144813 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.144502 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71e8f634-277f-41a1-9c62-78279d199716-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"71e8f634-277f-41a1-9c62-78279d199716\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 18:52:13.144813 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.144532 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/71e8f634-277f-41a1-9c62-78279d199716-dshm\") pod 
\"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"71e8f634-277f-41a1-9c62-78279d199716\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 18:52:13.144813 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.144575 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q5k6c\" (UniqueName: \"kubernetes.io/projected/8e5697f0-bd88-45a8-a73b-2b20ff90bd29-kube-api-access-q5k6c\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"8e5697f0-bd88-45a8-a73b-2b20ff90bd29\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:52:13.144813 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.144620 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8e5697f0-bd88-45a8-a73b-2b20ff90bd29-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"8e5697f0-bd88-45a8-a73b-2b20ff90bd29\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:52:13.144813 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.144666 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/71e8f634-277f-41a1-9c62-78279d199716-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"71e8f634-277f-41a1-9c62-78279d199716\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 18:52:13.144813 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.144683 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2fmp\" (UniqueName: \"kubernetes.io/projected/71e8f634-277f-41a1-9c62-78279d199716-kube-api-access-b2fmp\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: 
\"71e8f634-277f-41a1-9c62-78279d199716\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 18:52:13.144813 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.144732 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8e5697f0-bd88-45a8-a73b-2b20ff90bd29-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"8e5697f0-bd88-45a8-a73b-2b20ff90bd29\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:52:13.144813 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.144801 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8e5697f0-bd88-45a8-a73b-2b20ff90bd29-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"8e5697f0-bd88-45a8-a73b-2b20ff90bd29\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:52:13.145204 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.144985 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8e5697f0-bd88-45a8-a73b-2b20ff90bd29-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"8e5697f0-bd88-45a8-a73b-2b20ff90bd29\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:52:13.146724 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.146693 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8e5697f0-bd88-45a8-a73b-2b20ff90bd29-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"8e5697f0-bd88-45a8-a73b-2b20ff90bd29\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:52:13.146930 ip-10-0-140-154 kubenswrapper[2576]: I0416 
18:52:13.146915 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8e5697f0-bd88-45a8-a73b-2b20ff90bd29-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"8e5697f0-bd88-45a8-a73b-2b20ff90bd29\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:52:13.156380 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.156353 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5k6c\" (UniqueName: \"kubernetes.io/projected/8e5697f0-bd88-45a8-a73b-2b20ff90bd29-kube-api-access-q5k6c\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"8e5697f0-bd88-45a8-a73b-2b20ff90bd29\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:52:13.245916 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.245877 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/71e8f634-277f-41a1-9c62-78279d199716-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"71e8f634-277f-41a1-9c62-78279d199716\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 18:52:13.245916 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.245915 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2fmp\" (UniqueName: \"kubernetes.io/projected/71e8f634-277f-41a1-9c62-78279d199716-kube-api-access-b2fmp\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"71e8f634-277f-41a1-9c62-78279d199716\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 18:52:13.246180 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.245952 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/71e8f634-277f-41a1-9c62-78279d199716-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"71e8f634-277f-41a1-9c62-78279d199716\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 18:52:13.246180 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.245985 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/71e8f634-277f-41a1-9c62-78279d199716-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"71e8f634-277f-41a1-9c62-78279d199716\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 18:52:13.246180 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.246029 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71e8f634-277f-41a1-9c62-78279d199716-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"71e8f634-277f-41a1-9c62-78279d199716\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 18:52:13.246180 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.246058 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/71e8f634-277f-41a1-9c62-78279d199716-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"71e8f634-277f-41a1-9c62-78279d199716\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 18:52:13.246404 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.246371 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/71e8f634-277f-41a1-9c62-78279d199716-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: 
\"71e8f634-277f-41a1-9c62-78279d199716\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 18:52:13.246458 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.246419 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/71e8f634-277f-41a1-9c62-78279d199716-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"71e8f634-277f-41a1-9c62-78279d199716\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 18:52:13.246524 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.246495 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71e8f634-277f-41a1-9c62-78279d199716-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"71e8f634-277f-41a1-9c62-78279d199716\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 18:52:13.248397 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.248375 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/71e8f634-277f-41a1-9c62-78279d199716-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"71e8f634-277f-41a1-9c62-78279d199716\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 18:52:13.248509 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.248491 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/71e8f634-277f-41a1-9c62-78279d199716-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"71e8f634-277f-41a1-9c62-78279d199716\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 18:52:13.254825 ip-10-0-140-154 
kubenswrapper[2576]: I0416 18:52:13.254734 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2fmp\" (UniqueName: \"kubernetes.io/projected/71e8f634-277f-41a1-9c62-78279d199716-kube-api-access-b2fmp\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"71e8f634-277f-41a1-9c62-78279d199716\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 18:52:13.320082 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.320041 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:52:13.378775 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.378712 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 18:52:13.472849 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.472765 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 16 18:52:13.474262 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:52:13.474206 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e5697f0_bd88_45a8_a73b_2b20ff90bd29.slice/crio-802ef8bfdb4d30fc010b988457be4193834f591bc79858c9bb21d3d396122396 WatchSource:0}: Error finding container 802ef8bfdb4d30fc010b988457be4193834f591bc79858c9bb21d3d396122396: Status 404 returned error can't find the container with id 802ef8bfdb4d30fc010b988457be4193834f591bc79858c9bb21d3d396122396 Apr 16 18:52:13.530270 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.530218 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 18:52:13.538952 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:52:13.538919 2576 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71e8f634_277f_41a1_9c62_78279d199716.slice/crio-86d7aa5361092873fe5834ef59bec53dadb1642bb37cf5ff93a6b112ed094473 WatchSource:0}: Error finding container 86d7aa5361092873fe5834ef59bec53dadb1642bb37cf5ff93a6b112ed094473: Status 404 returned error can't find the container with id 86d7aa5361092873fe5834ef59bec53dadb1642bb37cf5ff93a6b112ed094473 Apr 16 18:52:13.717780 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.717707 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"71e8f634-277f-41a1-9c62-78279d199716","Type":"ContainerStarted","Data":"787d43c0b22a235203bcd35fb634d83e6cbfc28ec0628a7dc9322d46dc289fb2"} Apr 16 18:52:13.717780 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.717784 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"71e8f634-277f-41a1-9c62-78279d199716","Type":"ContainerStarted","Data":"86d7aa5361092873fe5834ef59bec53dadb1642bb37cf5ff93a6b112ed094473"} Apr 16 18:52:13.719482 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.719441 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"8e5697f0-bd88-45a8-a73b-2b20ff90bd29","Type":"ContainerStarted","Data":"600833f62324e0968b744b8522d97b5ec7872fd5381940aadacaa759850986a7"} Apr 16 18:52:13.719482 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:13.719475 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"8e5697f0-bd88-45a8-a73b-2b20ff90bd29","Type":"ContainerStarted","Data":"802ef8bfdb4d30fc010b988457be4193834f591bc79858c9bb21d3d396122396"} Apr 16 18:52:18.745140 ip-10-0-140-154 
kubenswrapper[2576]: I0416 18:52:18.745100 2576 generic.go:358] "Generic (PLEG): container finished" podID="71e8f634-277f-41a1-9c62-78279d199716" containerID="787d43c0b22a235203bcd35fb634d83e6cbfc28ec0628a7dc9322d46dc289fb2" exitCode=0 Apr 16 18:52:18.745700 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:18.745196 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"71e8f634-277f-41a1-9c62-78279d199716","Type":"ContainerDied","Data":"787d43c0b22a235203bcd35fb634d83e6cbfc28ec0628a7dc9322d46dc289fb2"} Apr 16 18:52:18.746711 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:18.746687 2576 generic.go:358] "Generic (PLEG): container finished" podID="8e5697f0-bd88-45a8-a73b-2b20ff90bd29" containerID="600833f62324e0968b744b8522d97b5ec7872fd5381940aadacaa759850986a7" exitCode=0 Apr 16 18:52:18.746838 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:18.746778 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"8e5697f0-bd88-45a8-a73b-2b20ff90bd29","Type":"ContainerDied","Data":"600833f62324e0968b744b8522d97b5ec7872fd5381940aadacaa759850986a7"} Apr 16 18:52:19.753985 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:19.753935 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"71e8f634-277f-41a1-9c62-78279d199716","Type":"ContainerStarted","Data":"5a99dc081a5cadc239b43ade7cc4e3176772f82abe9bf180220e6c12e029d50e"} Apr 16 18:52:19.756201 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:19.756175 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"8e5697f0-bd88-45a8-a73b-2b20ff90bd29","Type":"ContainerStarted","Data":"8c4ce71fdad3ccf6521fbe2aebbaae7ed22a7435d0ac4dff0bd9fdbac975628b"} Apr 16 18:52:19.772115 
ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:19.772047 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podStartSLOduration=6.77202528 podStartE2EDuration="6.77202528s" podCreationTimestamp="2026-04-16 18:52:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:52:19.771113381 +0000 UTC m=+1265.895641054" watchObservedRunningTime="2026-04-16 18:52:19.77202528 +0000 UTC m=+1265.896552952" Apr 16 18:52:19.789588 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:19.789527 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podStartSLOduration=7.78951048 podStartE2EDuration="7.78951048s" podCreationTimestamp="2026-04-16 18:52:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:52:19.787939038 +0000 UTC m=+1265.912466717" watchObservedRunningTime="2026-04-16 18:52:19.78951048 +0000 UTC m=+1265.914038144" Apr 16 18:52:20.223976 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:20.223925 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" podUID="131857a1-bbae-4961-9aad-dbf5d35f2f7a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8001/health\": dial tcp 10.134.0.52:8001: connect: connection refused" Apr 16 18:52:20.253615 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:20.253561 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" podUID="3f5b23b3-20a2-4783-8213-bc6d78b40f1e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": 
dial tcp 10.134.0.53:8000: connect: connection refused" Apr 16 18:52:23.320426 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:23.320377 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:52:23.322396 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:23.322361 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="8e5697f0-bd88-45a8-a73b-2b20ff90bd29" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8000/health\": dial tcp 10.134.0.54:8000: connect: connection refused" Apr 16 18:52:30.234837 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:30.234797 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" Apr 16 18:52:30.256734 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:30.256698 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" Apr 16 18:52:30.265849 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:30.265816 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" Apr 16 18:52:30.273871 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:30.273836 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" Apr 16 18:52:33.320996 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:33.320952 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="8e5697f0-bd88-45a8-a73b-2b20ff90bd29" containerName="main" probeResult="failure" output="Get 
\"https://10.134.0.54:8000/health\": dial tcp 10.134.0.54:8000: connect: connection refused" Apr 16 18:52:33.557862 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:33.557832 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6_a78027f9-c11c-42d4-b5c2-029753f83484/main/0.log" Apr 16 18:52:33.558302 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:33.558277 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" Apr 16 18:52:33.656696 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:33.656656 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a78027f9-c11c-42d4-b5c2-029753f83484-home\") pod \"a78027f9-c11c-42d4-b5c2-029753f83484\" (UID: \"a78027f9-c11c-42d4-b5c2-029753f83484\") " Apr 16 18:52:33.656896 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:33.656716 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnl27\" (UniqueName: \"kubernetes.io/projected/a78027f9-c11c-42d4-b5c2-029753f83484-kube-api-access-dnl27\") pod \"a78027f9-c11c-42d4-b5c2-029753f83484\" (UID: \"a78027f9-c11c-42d4-b5c2-029753f83484\") " Apr 16 18:52:33.656896 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:33.656845 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a78027f9-c11c-42d4-b5c2-029753f83484-dshm\") pod \"a78027f9-c11c-42d4-b5c2-029753f83484\" (UID: \"a78027f9-c11c-42d4-b5c2-029753f83484\") " Apr 16 18:52:33.656971 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:33.656901 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/a78027f9-c11c-42d4-b5c2-029753f83484-kserve-provision-location\") pod \"a78027f9-c11c-42d4-b5c2-029753f83484\" (UID: \"a78027f9-c11c-42d4-b5c2-029753f83484\") " Apr 16 18:52:33.656971 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:33.656932 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a78027f9-c11c-42d4-b5c2-029753f83484-tls-certs\") pod \"a78027f9-c11c-42d4-b5c2-029753f83484\" (UID: \"a78027f9-c11c-42d4-b5c2-029753f83484\") " Apr 16 18:52:33.657058 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:33.656985 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a78027f9-c11c-42d4-b5c2-029753f83484-model-cache\") pod \"a78027f9-c11c-42d4-b5c2-029753f83484\" (UID: \"a78027f9-c11c-42d4-b5c2-029753f83484\") " Apr 16 18:52:33.657114 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:33.657083 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a78027f9-c11c-42d4-b5c2-029753f83484-home" (OuterVolumeSpecName: "home") pod "a78027f9-c11c-42d4-b5c2-029753f83484" (UID: "a78027f9-c11c-42d4-b5c2-029753f83484"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:52:33.657303 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:33.657272 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a78027f9-c11c-42d4-b5c2-029753f83484-model-cache" (OuterVolumeSpecName: "model-cache") pod "a78027f9-c11c-42d4-b5c2-029753f83484" (UID: "a78027f9-c11c-42d4-b5c2-029753f83484"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:52:33.657422 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:33.657362 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a78027f9-c11c-42d4-b5c2-029753f83484-model-cache\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:52:33.657422 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:33.657380 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a78027f9-c11c-42d4-b5c2-029753f83484-home\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:52:33.659600 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:33.659519 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a78027f9-c11c-42d4-b5c2-029753f83484-kube-api-access-dnl27" (OuterVolumeSpecName: "kube-api-access-dnl27") pod "a78027f9-c11c-42d4-b5c2-029753f83484" (UID: "a78027f9-c11c-42d4-b5c2-029753f83484"). InnerVolumeSpecName "kube-api-access-dnl27". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:52:33.660726 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:33.660701 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a78027f9-c11c-42d4-b5c2-029753f83484-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "a78027f9-c11c-42d4-b5c2-029753f83484" (UID: "a78027f9-c11c-42d4-b5c2-029753f83484"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:52:33.661404 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:33.661367 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a78027f9-c11c-42d4-b5c2-029753f83484-dshm" (OuterVolumeSpecName: "dshm") pod "a78027f9-c11c-42d4-b5c2-029753f83484" (UID: "a78027f9-c11c-42d4-b5c2-029753f83484"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:52:33.709327 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:33.709281 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a78027f9-c11c-42d4-b5c2-029753f83484-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a78027f9-c11c-42d4-b5c2-029753f83484" (UID: "a78027f9-c11c-42d4-b5c2-029753f83484"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:52:33.758226 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:33.758165 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dnl27\" (UniqueName: \"kubernetes.io/projected/a78027f9-c11c-42d4-b5c2-029753f83484-kube-api-access-dnl27\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:52:33.758226 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:33.758218 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a78027f9-c11c-42d4-b5c2-029753f83484-dshm\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:52:33.758473 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:33.758238 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a78027f9-c11c-42d4-b5c2-029753f83484-kserve-provision-location\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:52:33.758473 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:33.758255 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a78027f9-c11c-42d4-b5c2-029753f83484-tls-certs\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:52:33.829391 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:33.829298 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6_a78027f9-c11c-42d4-b5c2-029753f83484/main/0.log" Apr 16 18:52:33.829727 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:33.829701 2576 generic.go:358] "Generic (PLEG): container finished" podID="a78027f9-c11c-42d4-b5c2-029753f83484" containerID="55fd007168d1a584abd3aa97b56415c24859db757c39af2a8946f321d1186007" exitCode=137 Apr 16 18:52:33.829850 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:33.829796 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" event={"ID":"a78027f9-c11c-42d4-b5c2-029753f83484","Type":"ContainerDied","Data":"55fd007168d1a584abd3aa97b56415c24859db757c39af2a8946f321d1186007"} Apr 16 18:52:33.829850 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:33.829818 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" Apr 16 18:52:33.829850 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:33.829837 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6" event={"ID":"a78027f9-c11c-42d4-b5c2-029753f83484","Type":"ContainerDied","Data":"3b57fe4f0a945db436110b8d40d1372ddb3ab90efb235c60c9b638d2ea7e5d1a"} Apr 16 18:52:33.830011 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:33.829856 2576 scope.go:117] "RemoveContainer" containerID="55fd007168d1a584abd3aa97b56415c24859db757c39af2a8946f321d1186007" Apr 16 18:52:33.859002 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:33.858565 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6"] Apr 16 18:52:33.859002 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:33.858760 2576 scope.go:117] "RemoveContainer" 
containerID="b30656f64555ae5c0028fb0ec069c398106ed98b9cd8e15864caf71ea633be38" Apr 16 18:52:33.863437 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:33.863410 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-667d7bdbbcqwjt6"] Apr 16 18:52:33.915843 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:33.915798 2576 scope.go:117] "RemoveContainer" containerID="55fd007168d1a584abd3aa97b56415c24859db757c39af2a8946f321d1186007" Apr 16 18:52:33.916226 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:52:33.916202 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55fd007168d1a584abd3aa97b56415c24859db757c39af2a8946f321d1186007\": container with ID starting with 55fd007168d1a584abd3aa97b56415c24859db757c39af2a8946f321d1186007 not found: ID does not exist" containerID="55fd007168d1a584abd3aa97b56415c24859db757c39af2a8946f321d1186007" Apr 16 18:52:33.916357 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:33.916234 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55fd007168d1a584abd3aa97b56415c24859db757c39af2a8946f321d1186007"} err="failed to get container status \"55fd007168d1a584abd3aa97b56415c24859db757c39af2a8946f321d1186007\": rpc error: code = NotFound desc = could not find container \"55fd007168d1a584abd3aa97b56415c24859db757c39af2a8946f321d1186007\": container with ID starting with 55fd007168d1a584abd3aa97b56415c24859db757c39af2a8946f321d1186007 not found: ID does not exist" Apr 16 18:52:33.916357 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:33.916254 2576 scope.go:117] "RemoveContainer" containerID="b30656f64555ae5c0028fb0ec069c398106ed98b9cd8e15864caf71ea633be38" Apr 16 18:52:33.916545 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:52:33.916510 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"b30656f64555ae5c0028fb0ec069c398106ed98b9cd8e15864caf71ea633be38\": container with ID starting with b30656f64555ae5c0028fb0ec069c398106ed98b9cd8e15864caf71ea633be38 not found: ID does not exist" containerID="b30656f64555ae5c0028fb0ec069c398106ed98b9cd8e15864caf71ea633be38" Apr 16 18:52:33.916622 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:33.916567 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b30656f64555ae5c0028fb0ec069c398106ed98b9cd8e15864caf71ea633be38"} err="failed to get container status \"b30656f64555ae5c0028fb0ec069c398106ed98b9cd8e15864caf71ea633be38\": rpc error: code = NotFound desc = could not find container \"b30656f64555ae5c0028fb0ec069c398106ed98b9cd8e15864caf71ea633be38\": container with ID starting with b30656f64555ae5c0028fb0ec069c398106ed98b9cd8e15864caf71ea633be38 not found: ID does not exist" Apr 16 18:52:34.508977 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:34.508927 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a78027f9-c11c-42d4-b5c2-029753f83484" path="/var/lib/kubelet/pods/a78027f9-c11c-42d4-b5c2-029753f83484/volumes" Apr 16 18:52:43.320685 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:43.320646 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:52:43.321288 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:43.321138 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="8e5697f0-bd88-45a8-a73b-2b20ff90bd29" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8000/health\": dial tcp 10.134.0.54:8000: connect: connection refused" Apr 16 18:52:43.361453 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:43.361415 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r"] Apr 16 18:52:43.361904 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:43.361863 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" podUID="131857a1-bbae-4961-9aad-dbf5d35f2f7a" containerName="main" containerID="cri-o://a1601a7fe1a9cd3d7f06301b20c477f2bb094428ee65329ac3e82b9d95796d08" gracePeriod=30 Apr 16 18:52:43.371500 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:43.371465 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf"] Apr 16 18:52:43.371785 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:43.371758 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" podUID="3f5b23b3-20a2-4783-8213-bc6d78b40f1e" containerName="main" containerID="cri-o://65aa909332d6e80a3cc4985043643fc37a0838468c0d92bb464427ec21f95e1f" gracePeriod=30 Apr 16 18:52:53.320991 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:52:53.320949 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="8e5697f0-bd88-45a8-a73b-2b20ff90bd29" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8000/health\": dial tcp 10.134.0.54:8000: connect: connection refused" Apr 16 18:53:03.321350 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:03.321304 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="8e5697f0-bd88-45a8-a73b-2b20ff90bd29" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8000/health\": dial tcp 10.134.0.54:8000: connect: connection refused" Apr 16 18:53:04.736697 
ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.736664 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk"] Apr 16 18:53:04.737236 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.737216 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a78027f9-c11c-42d4-b5c2-029753f83484" containerName="main" Apr 16 18:53:04.737319 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.737239 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a78027f9-c11c-42d4-b5c2-029753f83484" containerName="main" Apr 16 18:53:04.737319 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.737260 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a78027f9-c11c-42d4-b5c2-029753f83484" containerName="storage-initializer" Apr 16 18:53:04.737319 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.737269 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a78027f9-c11c-42d4-b5c2-029753f83484" containerName="storage-initializer" Apr 16 18:53:04.737482 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.737367 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a78027f9-c11c-42d4-b5c2-029753f83484" containerName="main" Apr 16 18:53:04.742305 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.742284 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" Apr 16 18:53:04.744944 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.744921 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-dockercfg-4r4pr\"" Apr 16 18:53:04.745121 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.745103 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 16 18:53:04.750430 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.750404 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk"] Apr 16 18:53:04.761933 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.761890 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t"] Apr 16 18:53:04.767459 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.766920 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" Apr 16 18:53:04.775575 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.775546 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t"] Apr 16 18:53:04.859171 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.859124 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c2aca90b-c1f9-4902-bc21-d2d6fdb992b7-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-996756845-ljdqk\" (UID: \"c2aca90b-c1f9-4902-bc21-d2d6fdb992b7\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" Apr 16 18:53:04.859373 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.859234 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1ac7c66-cb5a-4c19-b56c-8e797b883e5c-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t\" (UID: \"f1ac7c66-cb5a-4c19-b56c-8e797b883e5c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" Apr 16 18:53:04.859373 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.859274 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2aca90b-c1f9-4902-bc21-d2d6fdb992b7-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-996756845-ljdqk\" (UID: \"c2aca90b-c1f9-4902-bc21-d2d6fdb992b7\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" Apr 16 18:53:04.859373 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.859322 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f1ac7c66-cb5a-4c19-b56c-8e797b883e5c-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t\" (UID: \"f1ac7c66-cb5a-4c19-b56c-8e797b883e5c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" Apr 16 18:53:04.859504 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.859368 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c2aca90b-c1f9-4902-bc21-d2d6fdb992b7-home\") pod \"custom-route-timeout-pd-test-kserve-996756845-ljdqk\" (UID: \"c2aca90b-c1f9-4902-bc21-d2d6fdb992b7\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" Apr 16 18:53:04.859504 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.859394 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts4nd\" (UniqueName: \"kubernetes.io/projected/c2aca90b-c1f9-4902-bc21-d2d6fdb992b7-kube-api-access-ts4nd\") pod \"custom-route-timeout-pd-test-kserve-996756845-ljdqk\" (UID: \"c2aca90b-c1f9-4902-bc21-d2d6fdb992b7\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" Apr 16 18:53:04.859504 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.859475 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f1ac7c66-cb5a-4c19-b56c-8e797b883e5c-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t\" (UID: \"f1ac7c66-cb5a-4c19-b56c-8e797b883e5c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" Apr 16 18:53:04.859631 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.859509 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdrkz\" (UniqueName: 
\"kubernetes.io/projected/f1ac7c66-cb5a-4c19-b56c-8e797b883e5c-kube-api-access-pdrkz\") pod \"custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t\" (UID: \"f1ac7c66-cb5a-4c19-b56c-8e797b883e5c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" Apr 16 18:53:04.859631 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.859549 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f1ac7c66-cb5a-4c19-b56c-8e797b883e5c-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t\" (UID: \"f1ac7c66-cb5a-4c19-b56c-8e797b883e5c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" Apr 16 18:53:04.859631 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.859569 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f1ac7c66-cb5a-4c19-b56c-8e797b883e5c-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t\" (UID: \"f1ac7c66-cb5a-4c19-b56c-8e797b883e5c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" Apr 16 18:53:04.859631 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.859584 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c2aca90b-c1f9-4902-bc21-d2d6fdb992b7-dshm\") pod \"custom-route-timeout-pd-test-kserve-996756845-ljdqk\" (UID: \"c2aca90b-c1f9-4902-bc21-d2d6fdb992b7\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" Apr 16 18:53:04.859846 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.859676 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/c2aca90b-c1f9-4902-bc21-d2d6fdb992b7-model-cache\") pod \"custom-route-timeout-pd-test-kserve-996756845-ljdqk\" (UID: \"c2aca90b-c1f9-4902-bc21-d2d6fdb992b7\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" Apr 16 18:53:04.961183 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.961149 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c2aca90b-c1f9-4902-bc21-d2d6fdb992b7-model-cache\") pod \"custom-route-timeout-pd-test-kserve-996756845-ljdqk\" (UID: \"c2aca90b-c1f9-4902-bc21-d2d6fdb992b7\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" Apr 16 18:53:04.961403 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.961222 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c2aca90b-c1f9-4902-bc21-d2d6fdb992b7-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-996756845-ljdqk\" (UID: \"c2aca90b-c1f9-4902-bc21-d2d6fdb992b7\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" Apr 16 18:53:04.961403 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.961258 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1ac7c66-cb5a-4c19-b56c-8e797b883e5c-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t\" (UID: \"f1ac7c66-cb5a-4c19-b56c-8e797b883e5c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" Apr 16 18:53:04.961403 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.961284 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2aca90b-c1f9-4902-bc21-d2d6fdb992b7-kserve-provision-location\") pod 
\"custom-route-timeout-pd-test-kserve-996756845-ljdqk\" (UID: \"c2aca90b-c1f9-4902-bc21-d2d6fdb992b7\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" Apr 16 18:53:04.961403 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.961327 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f1ac7c66-cb5a-4c19-b56c-8e797b883e5c-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t\" (UID: \"f1ac7c66-cb5a-4c19-b56c-8e797b883e5c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" Apr 16 18:53:04.961403 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.961348 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c2aca90b-c1f9-4902-bc21-d2d6fdb992b7-home\") pod \"custom-route-timeout-pd-test-kserve-996756845-ljdqk\" (UID: \"c2aca90b-c1f9-4902-bc21-d2d6fdb992b7\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" Apr 16 18:53:04.961403 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.961375 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ts4nd\" (UniqueName: \"kubernetes.io/projected/c2aca90b-c1f9-4902-bc21-d2d6fdb992b7-kube-api-access-ts4nd\") pod \"custom-route-timeout-pd-test-kserve-996756845-ljdqk\" (UID: \"c2aca90b-c1f9-4902-bc21-d2d6fdb992b7\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" Apr 16 18:53:04.961719 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.961409 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f1ac7c66-cb5a-4c19-b56c-8e797b883e5c-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t\" (UID: \"f1ac7c66-cb5a-4c19-b56c-8e797b883e5c\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" Apr 16 18:53:04.961719 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.961436 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pdrkz\" (UniqueName: \"kubernetes.io/projected/f1ac7c66-cb5a-4c19-b56c-8e797b883e5c-kube-api-access-pdrkz\") pod \"custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t\" (UID: \"f1ac7c66-cb5a-4c19-b56c-8e797b883e5c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" Apr 16 18:53:04.961719 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.961481 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f1ac7c66-cb5a-4c19-b56c-8e797b883e5c-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t\" (UID: \"f1ac7c66-cb5a-4c19-b56c-8e797b883e5c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" Apr 16 18:53:04.961719 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.961508 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f1ac7c66-cb5a-4c19-b56c-8e797b883e5c-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t\" (UID: \"f1ac7c66-cb5a-4c19-b56c-8e797b883e5c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" Apr 16 18:53:04.961719 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.961548 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c2aca90b-c1f9-4902-bc21-d2d6fdb992b7-dshm\") pod \"custom-route-timeout-pd-test-kserve-996756845-ljdqk\" (UID: \"c2aca90b-c1f9-4902-bc21-d2d6fdb992b7\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" Apr 16 18:53:04.961719 ip-10-0-140-154 
kubenswrapper[2576]: I0416 18:53:04.961636 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c2aca90b-c1f9-4902-bc21-d2d6fdb992b7-model-cache\") pod \"custom-route-timeout-pd-test-kserve-996756845-ljdqk\" (UID: \"c2aca90b-c1f9-4902-bc21-d2d6fdb992b7\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" Apr 16 18:53:04.961719 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.961683 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1ac7c66-cb5a-4c19-b56c-8e797b883e5c-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t\" (UID: \"f1ac7c66-cb5a-4c19-b56c-8e797b883e5c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" Apr 16 18:53:04.961719 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.961681 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2aca90b-c1f9-4902-bc21-d2d6fdb992b7-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-996756845-ljdqk\" (UID: \"c2aca90b-c1f9-4902-bc21-d2d6fdb992b7\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" Apr 16 18:53:04.962163 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.961995 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f1ac7c66-cb5a-4c19-b56c-8e797b883e5c-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t\" (UID: \"f1ac7c66-cb5a-4c19-b56c-8e797b883e5c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" Apr 16 18:53:04.962163 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.962055 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" 
(UniqueName: \"kubernetes.io/empty-dir/c2aca90b-c1f9-4902-bc21-d2d6fdb992b7-home\") pod \"custom-route-timeout-pd-test-kserve-996756845-ljdqk\" (UID: \"c2aca90b-c1f9-4902-bc21-d2d6fdb992b7\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk"
Apr 16 18:53:04.962598 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.962327 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f1ac7c66-cb5a-4c19-b56c-8e797b883e5c-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t\" (UID: \"f1ac7c66-cb5a-4c19-b56c-8e797b883e5c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t"
Apr 16 18:53:04.964504 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.964476 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c2aca90b-c1f9-4902-bc21-d2d6fdb992b7-dshm\") pod \"custom-route-timeout-pd-test-kserve-996756845-ljdqk\" (UID: \"c2aca90b-c1f9-4902-bc21-d2d6fdb992b7\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk"
Apr 16 18:53:04.964640 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.964481 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f1ac7c66-cb5a-4c19-b56c-8e797b883e5c-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t\" (UID: \"f1ac7c66-cb5a-4c19-b56c-8e797b883e5c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t"
Apr 16 18:53:04.964774 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.964709 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f1ac7c66-cb5a-4c19-b56c-8e797b883e5c-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t\" (UID: \"f1ac7c66-cb5a-4c19-b56c-8e797b883e5c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t"
Apr 16 18:53:04.964923 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.964903 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c2aca90b-c1f9-4902-bc21-d2d6fdb992b7-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-996756845-ljdqk\" (UID: \"c2aca90b-c1f9-4902-bc21-d2d6fdb992b7\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk"
Apr 16 18:53:04.977536 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.977493 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts4nd\" (UniqueName: \"kubernetes.io/projected/c2aca90b-c1f9-4902-bc21-d2d6fdb992b7-kube-api-access-ts4nd\") pod \"custom-route-timeout-pd-test-kserve-996756845-ljdqk\" (UID: \"c2aca90b-c1f9-4902-bc21-d2d6fdb992b7\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk"
Apr 16 18:53:04.981029 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:04.980966 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdrkz\" (UniqueName: \"kubernetes.io/projected/f1ac7c66-cb5a-4c19-b56c-8e797b883e5c-kube-api-access-pdrkz\") pod \"custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t\" (UID: \"f1ac7c66-cb5a-4c19-b56c-8e797b883e5c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t"
Apr 16 18:53:05.055197 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:05.055092 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk"
Apr 16 18:53:05.085047 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:05.085009 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t"
Apr 16 18:53:05.263627 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:05.263094 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk"]
Apr 16 18:53:05.264373 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:53:05.264331 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2aca90b_c1f9_4902_bc21_d2d6fdb992b7.slice/crio-5d404b1d631a95f8515dddd51341e425e6f40265bea965185759fc762b45341a WatchSource:0}: Error finding container 5d404b1d631a95f8515dddd51341e425e6f40265bea965185759fc762b45341a: Status 404 returned error can't find the container with id 5d404b1d631a95f8515dddd51341e425e6f40265bea965185759fc762b45341a
Apr 16 18:53:05.266936 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:05.266910 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:53:05.306868 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:05.306841 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t"]
Apr 16 18:53:05.308943 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:53:05.308913 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1ac7c66_cb5a_4c19_b56c_8e797b883e5c.slice/crio-b9a5bb598c096fcbe37130739f1802424004f95949cb411289cb9d14eabc32cd WatchSource:0}: Error finding container b9a5bb598c096fcbe37130739f1802424004f95949cb411289cb9d14eabc32cd: Status 404 returned error can't find the container with id b9a5bb598c096fcbe37130739f1802424004f95949cb411289cb9d14eabc32cd
Apr 16 18:53:05.993640 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:05.993597 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" event={"ID":"c2aca90b-c1f9-4902-bc21-d2d6fdb992b7","Type":"ContainerStarted","Data":"c7c9531344e9629619fc8783aed021702cc9fe1ba7592c8011080106c4d8c52b"}
Apr 16 18:53:05.993640 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:05.993647 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" event={"ID":"c2aca90b-c1f9-4902-bc21-d2d6fdb992b7","Type":"ContainerStarted","Data":"5d404b1d631a95f8515dddd51341e425e6f40265bea965185759fc762b45341a"}
Apr 16 18:53:05.994251 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:05.993702 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk"
Apr 16 18:53:05.995331 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:05.995306 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" event={"ID":"f1ac7c66-cb5a-4c19-b56c-8e797b883e5c","Type":"ContainerStarted","Data":"c835358d5140a8b00b0fa20d0ebda004ff67634e0ce17902c1be153279aad22d"}
Apr 16 18:53:05.995416 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:05.995342 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" event={"ID":"f1ac7c66-cb5a-4c19-b56c-8e797b883e5c","Type":"ContainerStarted","Data":"b9a5bb598c096fcbe37130739f1802424004f95949cb411289cb9d14eabc32cd"}
Apr 16 18:53:07.005255 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:07.005212 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" event={"ID":"c2aca90b-c1f9-4902-bc21-d2d6fdb992b7","Type":"ContainerStarted","Data":"36e1fd131f1db01e0e74404c8960937bc7c31f2772a15d28f2618cacfcfcfd56"}
Apr 16 18:53:10.027757 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:10.027698 2576 generic.go:358] "Generic (PLEG): container finished" podID="f1ac7c66-cb5a-4c19-b56c-8e797b883e5c" containerID="c835358d5140a8b00b0fa20d0ebda004ff67634e0ce17902c1be153279aad22d" exitCode=0
Apr 16 18:53:10.028275 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:10.027779 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" event={"ID":"f1ac7c66-cb5a-4c19-b56c-8e797b883e5c","Type":"ContainerDied","Data":"c835358d5140a8b00b0fa20d0ebda004ff67634e0ce17902c1be153279aad22d"}
Apr 16 18:53:11.037323 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:11.037233 2576 generic.go:358] "Generic (PLEG): container finished" podID="c2aca90b-c1f9-4902-bc21-d2d6fdb992b7" containerID="36e1fd131f1db01e0e74404c8960937bc7c31f2772a15d28f2618cacfcfcfd56" exitCode=0
Apr 16 18:53:11.038061 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:11.038034 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" event={"ID":"c2aca90b-c1f9-4902-bc21-d2d6fdb992b7","Type":"ContainerDied","Data":"36e1fd131f1db01e0e74404c8960937bc7c31f2772a15d28f2618cacfcfcfd56"}
Apr 16 18:53:11.042396 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:11.041724 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" event={"ID":"f1ac7c66-cb5a-4c19-b56c-8e797b883e5c","Type":"ContainerStarted","Data":"5cab1c4239a39c1cce4e58bdff9b71adc09e0e9a8d530e0f23579ab4849f7d8b"}
Apr 16 18:53:11.079694 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:11.079636 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" podStartSLOduration=7.079617857 podStartE2EDuration="7.079617857s" podCreationTimestamp="2026-04-16 18:53:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:53:11.075549911 +0000 UTC m=+1317.200077578" watchObservedRunningTime="2026-04-16 18:53:11.079617857 +0000 UTC m=+1317.204145521"
Apr 16 18:53:12.049923 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:12.049879 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" event={"ID":"c2aca90b-c1f9-4902-bc21-d2d6fdb992b7","Type":"ContainerStarted","Data":"6fa46587cc664b3a0d7a9e85bc8437e7e5e60477666b9f6723979a3579955a10"}
Apr 16 18:53:12.076189 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:12.076122 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" podStartSLOduration=8.07610242 podStartE2EDuration="8.07610242s" podCreationTimestamp="2026-04-16 18:53:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:53:12.07219009 +0000 UTC m=+1318.196717756" watchObservedRunningTime="2026-04-16 18:53:12.07610242 +0000 UTC m=+1318.200630084"
Apr 16 18:53:13.321545 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:13.321496 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="8e5697f0-bd88-45a8-a73b-2b20ff90bd29" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8000/health\": dial tcp 10.134.0.54:8000: connect: connection refused"
Apr 16 18:53:13.362216 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:13.362148 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" podUID="131857a1-bbae-4961-9aad-dbf5d35f2f7a" containerName="llm-d-routing-sidecar" containerID="cri-o://b76954e1e4b2bb4cfd5537e71815c470386303b96b76bcd31136c71cfb34e765" gracePeriod=2
Apr 16 18:53:13.865303 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:13.865272 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf"
Apr 16 18:53:13.870029 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:13.869997 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r_131857a1-bbae-4961-9aad-dbf5d35f2f7a/main/0.log"
Apr 16 18:53:13.871023 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:13.871002 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r"
Apr 16 18:53:13.964569 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:13.964518 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3f5b23b3-20a2-4783-8213-bc6d78b40f1e-kserve-provision-location\") pod \"3f5b23b3-20a2-4783-8213-bc6d78b40f1e\" (UID: \"3f5b23b3-20a2-4783-8213-bc6d78b40f1e\") "
Apr 16 18:53:13.964799 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:13.964611 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/131857a1-bbae-4961-9aad-dbf5d35f2f7a-dshm\") pod \"131857a1-bbae-4961-9aad-dbf5d35f2f7a\" (UID: \"131857a1-bbae-4961-9aad-dbf5d35f2f7a\") "
Apr 16 18:53:13.964799 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:13.964657 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/131857a1-bbae-4961-9aad-dbf5d35f2f7a-kserve-provision-location\") pod \"131857a1-bbae-4961-9aad-dbf5d35f2f7a\" (UID: \"131857a1-bbae-4961-9aad-dbf5d35f2f7a\") "
Apr 16 18:53:13.964799 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:13.964698 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3f5b23b3-20a2-4783-8213-bc6d78b40f1e-home\") pod \"3f5b23b3-20a2-4783-8213-bc6d78b40f1e\" (UID: \"3f5b23b3-20a2-4783-8213-bc6d78b40f1e\") "
Apr 16 18:53:13.964799 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:13.964735 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3f5b23b3-20a2-4783-8213-bc6d78b40f1e-model-cache\") pod \"3f5b23b3-20a2-4783-8213-bc6d78b40f1e\" (UID: \"3f5b23b3-20a2-4783-8213-bc6d78b40f1e\") "
Apr 16 18:53:13.965042 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:13.964811 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mk4xs\" (UniqueName: \"kubernetes.io/projected/131857a1-bbae-4961-9aad-dbf5d35f2f7a-kube-api-access-mk4xs\") pod \"131857a1-bbae-4961-9aad-dbf5d35f2f7a\" (UID: \"131857a1-bbae-4961-9aad-dbf5d35f2f7a\") "
Apr 16 18:53:13.965042 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:13.964839 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/131857a1-bbae-4961-9aad-dbf5d35f2f7a-model-cache\") pod \"131857a1-bbae-4961-9aad-dbf5d35f2f7a\" (UID: \"131857a1-bbae-4961-9aad-dbf5d35f2f7a\") "
Apr 16 18:53:13.965042 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:13.964865 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/131857a1-bbae-4961-9aad-dbf5d35f2f7a-tls-certs\") pod \"131857a1-bbae-4961-9aad-dbf5d35f2f7a\" (UID: \"131857a1-bbae-4961-9aad-dbf5d35f2f7a\") "
Apr 16 18:53:13.965042 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:13.964925 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzddw\" (UniqueName: \"kubernetes.io/projected/3f5b23b3-20a2-4783-8213-bc6d78b40f1e-kube-api-access-qzddw\") pod \"3f5b23b3-20a2-4783-8213-bc6d78b40f1e\" (UID: \"3f5b23b3-20a2-4783-8213-bc6d78b40f1e\") "
Apr 16 18:53:13.965042 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:13.964948 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3f5b23b3-20a2-4783-8213-bc6d78b40f1e-tls-certs\") pod \"3f5b23b3-20a2-4783-8213-bc6d78b40f1e\" (UID: \"3f5b23b3-20a2-4783-8213-bc6d78b40f1e\") "
Apr 16 18:53:13.965042 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:13.964976 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/131857a1-bbae-4961-9aad-dbf5d35f2f7a-home\") pod \"131857a1-bbae-4961-9aad-dbf5d35f2f7a\" (UID: \"131857a1-bbae-4961-9aad-dbf5d35f2f7a\") "
Apr 16 18:53:13.965042 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:13.965035 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3f5b23b3-20a2-4783-8213-bc6d78b40f1e-dshm\") pod \"3f5b23b3-20a2-4783-8213-bc6d78b40f1e\" (UID: \"3f5b23b3-20a2-4783-8213-bc6d78b40f1e\") "
Apr 16 18:53:13.965392 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:13.965094 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f5b23b3-20a2-4783-8213-bc6d78b40f1e-home" (OuterVolumeSpecName: "home") pod "3f5b23b3-20a2-4783-8213-bc6d78b40f1e" (UID: "3f5b23b3-20a2-4783-8213-bc6d78b40f1e"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:53:13.965392 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:13.965372 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3f5b23b3-20a2-4783-8213-bc6d78b40f1e-home\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:53:13.968309 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:13.968113 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f5b23b3-20a2-4783-8213-bc6d78b40f1e-dshm" (OuterVolumeSpecName: "dshm") pod "3f5b23b3-20a2-4783-8213-bc6d78b40f1e" (UID: "3f5b23b3-20a2-4783-8213-bc6d78b40f1e"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:53:13.968643 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:13.968473 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f5b23b3-20a2-4783-8213-bc6d78b40f1e-model-cache" (OuterVolumeSpecName: "model-cache") pod "3f5b23b3-20a2-4783-8213-bc6d78b40f1e" (UID: "3f5b23b3-20a2-4783-8213-bc6d78b40f1e"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:53:13.968826 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:13.968781 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/131857a1-bbae-4961-9aad-dbf5d35f2f7a-model-cache" (OuterVolumeSpecName: "model-cache") pod "131857a1-bbae-4961-9aad-dbf5d35f2f7a" (UID: "131857a1-bbae-4961-9aad-dbf5d35f2f7a"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:53:13.969321 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:13.969278 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f5b23b3-20a2-4783-8213-bc6d78b40f1e-kube-api-access-qzddw" (OuterVolumeSpecName: "kube-api-access-qzddw") pod "3f5b23b3-20a2-4783-8213-bc6d78b40f1e" (UID: "3f5b23b3-20a2-4783-8213-bc6d78b40f1e"). InnerVolumeSpecName "kube-api-access-qzddw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:53:13.969823 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:13.969775 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f5b23b3-20a2-4783-8213-bc6d78b40f1e-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "3f5b23b3-20a2-4783-8213-bc6d78b40f1e" (UID: "3f5b23b3-20a2-4783-8213-bc6d78b40f1e"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:53:13.970258 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:13.970233 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/131857a1-bbae-4961-9aad-dbf5d35f2f7a-home" (OuterVolumeSpecName: "home") pod "131857a1-bbae-4961-9aad-dbf5d35f2f7a" (UID: "131857a1-bbae-4961-9aad-dbf5d35f2f7a"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:53:13.972081 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:13.972050 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/131857a1-bbae-4961-9aad-dbf5d35f2f7a-dshm" (OuterVolumeSpecName: "dshm") pod "131857a1-bbae-4961-9aad-dbf5d35f2f7a" (UID: "131857a1-bbae-4961-9aad-dbf5d35f2f7a"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:53:13.972593 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:13.972562 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/131857a1-bbae-4961-9aad-dbf5d35f2f7a-kube-api-access-mk4xs" (OuterVolumeSpecName: "kube-api-access-mk4xs") pod "131857a1-bbae-4961-9aad-dbf5d35f2f7a" (UID: "131857a1-bbae-4961-9aad-dbf5d35f2f7a"). InnerVolumeSpecName "kube-api-access-mk4xs". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:53:13.973504 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:13.973467 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/131857a1-bbae-4961-9aad-dbf5d35f2f7a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "131857a1-bbae-4961-9aad-dbf5d35f2f7a" (UID: "131857a1-bbae-4961-9aad-dbf5d35f2f7a"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:53:13.990486 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:13.990410 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/131857a1-bbae-4961-9aad-dbf5d35f2f7a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "131857a1-bbae-4961-9aad-dbf5d35f2f7a" (UID: "131857a1-bbae-4961-9aad-dbf5d35f2f7a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:53:14.046330 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.046281 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f5b23b3-20a2-4783-8213-bc6d78b40f1e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3f5b23b3-20a2-4783-8213-bc6d78b40f1e" (UID: "3f5b23b3-20a2-4783-8213-bc6d78b40f1e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:53:14.063945 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.063900 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r_131857a1-bbae-4961-9aad-dbf5d35f2f7a/main/0.log"
Apr 16 18:53:14.064948 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.064730 2576 generic.go:358] "Generic (PLEG): container finished" podID="131857a1-bbae-4961-9aad-dbf5d35f2f7a" containerID="a1601a7fe1a9cd3d7f06301b20c477f2bb094428ee65329ac3e82b9d95796d08" exitCode=137
Apr 16 18:53:14.064948 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.064780 2576 generic.go:358] "Generic (PLEG): container finished" podID="131857a1-bbae-4961-9aad-dbf5d35f2f7a" containerID="b76954e1e4b2bb4cfd5537e71815c470386303b96b76bcd31136c71cfb34e765" exitCode=0
Apr 16 18:53:14.064948 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.064788 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" event={"ID":"131857a1-bbae-4961-9aad-dbf5d35f2f7a","Type":"ContainerDied","Data":"a1601a7fe1a9cd3d7f06301b20c477f2bb094428ee65329ac3e82b9d95796d08"}
Apr 16 18:53:14.064948 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.064843 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r"
Apr 16 18:53:14.064948 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.064861 2576 scope.go:117] "RemoveContainer" containerID="a1601a7fe1a9cd3d7f06301b20c477f2bb094428ee65329ac3e82b9d95796d08"
Apr 16 18:53:14.065317 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.064843 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" event={"ID":"131857a1-bbae-4961-9aad-dbf5d35f2f7a","Type":"ContainerDied","Data":"b76954e1e4b2bb4cfd5537e71815c470386303b96b76bcd31136c71cfb34e765"}
Apr 16 18:53:14.065317 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.065006 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r" event={"ID":"131857a1-bbae-4961-9aad-dbf5d35f2f7a","Type":"ContainerDied","Data":"248a1ced5d5888e3a5066f775873a5759aaea07a391fab1ff8cfe7553d9b810c"}
Apr 16 18:53:14.066605 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.066442 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qzddw\" (UniqueName: \"kubernetes.io/projected/3f5b23b3-20a2-4783-8213-bc6d78b40f1e-kube-api-access-qzddw\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:53:14.066605 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.066469 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3f5b23b3-20a2-4783-8213-bc6d78b40f1e-tls-certs\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:53:14.066605 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.066483 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/131857a1-bbae-4961-9aad-dbf5d35f2f7a-home\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:53:14.066605 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.066496 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3f5b23b3-20a2-4783-8213-bc6d78b40f1e-dshm\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:53:14.066605 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.066510 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3f5b23b3-20a2-4783-8213-bc6d78b40f1e-kserve-provision-location\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:53:14.066605 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.066523 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/131857a1-bbae-4961-9aad-dbf5d35f2f7a-dshm\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:53:14.066605 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.066538 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/131857a1-bbae-4961-9aad-dbf5d35f2f7a-kserve-provision-location\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:53:14.066605 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.066556 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3f5b23b3-20a2-4783-8213-bc6d78b40f1e-model-cache\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:53:14.066605 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.066570 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mk4xs\" (UniqueName: \"kubernetes.io/projected/131857a1-bbae-4961-9aad-dbf5d35f2f7a-kube-api-access-mk4xs\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:53:14.066605 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.066584 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/131857a1-bbae-4961-9aad-dbf5d35f2f7a-model-cache\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:53:14.066605 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.066596 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/131857a1-bbae-4961-9aad-dbf5d35f2f7a-tls-certs\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:53:14.068169 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.068132 2576 generic.go:358] "Generic (PLEG): container finished" podID="3f5b23b3-20a2-4783-8213-bc6d78b40f1e" containerID="65aa909332d6e80a3cc4985043643fc37a0838468c0d92bb464427ec21f95e1f" exitCode=137
Apr 16 18:53:14.068290 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.068221 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" event={"ID":"3f5b23b3-20a2-4783-8213-bc6d78b40f1e","Type":"ContainerDied","Data":"65aa909332d6e80a3cc4985043643fc37a0838468c0d92bb464427ec21f95e1f"}
Apr 16 18:53:14.068290 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.068248 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf" event={"ID":"3f5b23b3-20a2-4783-8213-bc6d78b40f1e","Type":"ContainerDied","Data":"9151c495e53f65ffd1fdb9f68020dd4735a4b6ace2208003d1a5b1ca08192a27"}
Apr 16 18:53:14.068411 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.068356 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf"
Apr 16 18:53:14.113665 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.113297 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r"]
Apr 16 18:53:14.115852 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.115824 2576 scope.go:117] "RemoveContainer" containerID="379fa597695df33d4416d786cbe473c09599175ef6e8a301c5e837eb10480f46"
Apr 16 18:53:14.118176 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.118145 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-779d88d-9nb2r"]
Apr 16 18:53:14.129238 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.129195 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf"]
Apr 16 18:53:14.136156 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.136098 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6fxhqlf"]
Apr 16 18:53:14.163063 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.163038 2576 scope.go:117] "RemoveContainer" containerID="b76954e1e4b2bb4cfd5537e71815c470386303b96b76bcd31136c71cfb34e765"
Apr 16 18:53:14.175612 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.175574 2576 scope.go:117] "RemoveContainer" containerID="a1601a7fe1a9cd3d7f06301b20c477f2bb094428ee65329ac3e82b9d95796d08"
Apr 16 18:53:14.176071 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:53:14.176042 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1601a7fe1a9cd3d7f06301b20c477f2bb094428ee65329ac3e82b9d95796d08\": container with ID starting with a1601a7fe1a9cd3d7f06301b20c477f2bb094428ee65329ac3e82b9d95796d08 not found: ID does not exist" containerID="a1601a7fe1a9cd3d7f06301b20c477f2bb094428ee65329ac3e82b9d95796d08"
Apr 16 18:53:14.176198 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.176083 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1601a7fe1a9cd3d7f06301b20c477f2bb094428ee65329ac3e82b9d95796d08"} err="failed to get container status \"a1601a7fe1a9cd3d7f06301b20c477f2bb094428ee65329ac3e82b9d95796d08\": rpc error: code = NotFound desc = could not find container \"a1601a7fe1a9cd3d7f06301b20c477f2bb094428ee65329ac3e82b9d95796d08\": container with ID starting with a1601a7fe1a9cd3d7f06301b20c477f2bb094428ee65329ac3e82b9d95796d08 not found: ID does not exist"
Apr 16 18:53:14.176198 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.176125 2576 scope.go:117] "RemoveContainer" containerID="379fa597695df33d4416d786cbe473c09599175ef6e8a301c5e837eb10480f46"
Apr 16 18:53:14.176523 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:53:14.176494 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"379fa597695df33d4416d786cbe473c09599175ef6e8a301c5e837eb10480f46\": container with ID starting with 379fa597695df33d4416d786cbe473c09599175ef6e8a301c5e837eb10480f46 not found: ID does not exist" containerID="379fa597695df33d4416d786cbe473c09599175ef6e8a301c5e837eb10480f46"
Apr 16 18:53:14.176649 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.176540 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"379fa597695df33d4416d786cbe473c09599175ef6e8a301c5e837eb10480f46"} err="failed to get container status \"379fa597695df33d4416d786cbe473c09599175ef6e8a301c5e837eb10480f46\": rpc error: code = NotFound desc = could not find container \"379fa597695df33d4416d786cbe473c09599175ef6e8a301c5e837eb10480f46\": container with ID starting with 379fa597695df33d4416d786cbe473c09599175ef6e8a301c5e837eb10480f46 not found: ID does not exist"
Apr 16 18:53:14.176649 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.176564 2576 scope.go:117] "RemoveContainer" containerID="b76954e1e4b2bb4cfd5537e71815c470386303b96b76bcd31136c71cfb34e765"
Apr 16 18:53:14.176908 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:53:14.176880 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b76954e1e4b2bb4cfd5537e71815c470386303b96b76bcd31136c71cfb34e765\": container with ID starting with b76954e1e4b2bb4cfd5537e71815c470386303b96b76bcd31136c71cfb34e765 not found: ID does not exist" containerID="b76954e1e4b2bb4cfd5537e71815c470386303b96b76bcd31136c71cfb34e765"
Apr 16 18:53:14.177070 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.177038 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b76954e1e4b2bb4cfd5537e71815c470386303b96b76bcd31136c71cfb34e765"} err="failed to get container status \"b76954e1e4b2bb4cfd5537e71815c470386303b96b76bcd31136c71cfb34e765\": rpc error: code = NotFound desc = could not find container \"b76954e1e4b2bb4cfd5537e71815c470386303b96b76bcd31136c71cfb34e765\": container with ID starting with b76954e1e4b2bb4cfd5537e71815c470386303b96b76bcd31136c71cfb34e765 not found: ID does not exist"
Apr 16 18:53:14.177178 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.177072 2576 scope.go:117] "RemoveContainer" containerID="a1601a7fe1a9cd3d7f06301b20c477f2bb094428ee65329ac3e82b9d95796d08"
Apr 16 18:53:14.177388 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.177360 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1601a7fe1a9cd3d7f06301b20c477f2bb094428ee65329ac3e82b9d95796d08"} err="failed to get container status \"a1601a7fe1a9cd3d7f06301b20c477f2bb094428ee65329ac3e82b9d95796d08\": rpc error: code = NotFound desc = could not find container \"a1601a7fe1a9cd3d7f06301b20c477f2bb094428ee65329ac3e82b9d95796d08\": container with ID starting with a1601a7fe1a9cd3d7f06301b20c477f2bb094428ee65329ac3e82b9d95796d08 not found: ID does not exist"
Apr 16 18:53:14.177478 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.177392 2576 scope.go:117] "RemoveContainer" containerID="379fa597695df33d4416d786cbe473c09599175ef6e8a301c5e837eb10480f46"
Apr 16 18:53:14.177835 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.177808 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"379fa597695df33d4416d786cbe473c09599175ef6e8a301c5e837eb10480f46"} err="failed to get container status \"379fa597695df33d4416d786cbe473c09599175ef6e8a301c5e837eb10480f46\": rpc error: code = NotFound desc = could not find container \"379fa597695df33d4416d786cbe473c09599175ef6e8a301c5e837eb10480f46\": container with ID starting with 379fa597695df33d4416d786cbe473c09599175ef6e8a301c5e837eb10480f46 not found: ID does not exist"
Apr 16 18:53:14.177943 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.177836 2576 scope.go:117] "RemoveContainer" containerID="b76954e1e4b2bb4cfd5537e71815c470386303b96b76bcd31136c71cfb34e765"
Apr 16 18:53:14.178124 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.178099 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b76954e1e4b2bb4cfd5537e71815c470386303b96b76bcd31136c71cfb34e765"} err="failed to get container status \"b76954e1e4b2bb4cfd5537e71815c470386303b96b76bcd31136c71cfb34e765\": rpc error: code = NotFound desc = could not find container \"b76954e1e4b2bb4cfd5537e71815c470386303b96b76bcd31136c71cfb34e765\": container with ID starting with b76954e1e4b2bb4cfd5537e71815c470386303b96b76bcd31136c71cfb34e765 not found: ID does not exist"
Apr 16 18:53:14.178216 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.178127 2576 scope.go:117] "RemoveContainer" containerID="65aa909332d6e80a3cc4985043643fc37a0838468c0d92bb464427ec21f95e1f"
Apr 16 18:53:14.210863 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.210836 2576 scope.go:117] "RemoveContainer" containerID="7e4829fade3a4652b85490a0c17a85381b1644383ad0b11d9351f19e8165c9bd"
Apr 16 18:53:14.302280 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.302246 2576 scope.go:117] "RemoveContainer" containerID="65aa909332d6e80a3cc4985043643fc37a0838468c0d92bb464427ec21f95e1f"
Apr 16 18:53:14.302724 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:53:14.302697 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65aa909332d6e80a3cc4985043643fc37a0838468c0d92bb464427ec21f95e1f\": container with ID starting with 65aa909332d6e80a3cc4985043643fc37a0838468c0d92bb464427ec21f95e1f not found: ID does not exist" containerID="65aa909332d6e80a3cc4985043643fc37a0838468c0d92bb464427ec21f95e1f"
Apr 16 18:53:14.302912 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.302735 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65aa909332d6e80a3cc4985043643fc37a0838468c0d92bb464427ec21f95e1f"} err="failed to get container status \"65aa909332d6e80a3cc4985043643fc37a0838468c0d92bb464427ec21f95e1f\": rpc error: code = NotFound desc = could not find container \"65aa909332d6e80a3cc4985043643fc37a0838468c0d92bb464427ec21f95e1f\": container with ID starting with 65aa909332d6e80a3cc4985043643fc37a0838468c0d92bb464427ec21f95e1f not found: ID does not exist"
Apr 16 18:53:14.302912 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.302793 2576 scope.go:117] "RemoveContainer" containerID="7e4829fade3a4652b85490a0c17a85381b1644383ad0b11d9351f19e8165c9bd"
Apr 16 18:53:14.303361 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:53:14.303320 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e4829fade3a4652b85490a0c17a85381b1644383ad0b11d9351f19e8165c9bd\": container with ID starting with 7e4829fade3a4652b85490a0c17a85381b1644383ad0b11d9351f19e8165c9bd
not found: ID does not exist" containerID="7e4829fade3a4652b85490a0c17a85381b1644383ad0b11d9351f19e8165c9bd" Apr 16 18:53:14.303475 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.303357 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e4829fade3a4652b85490a0c17a85381b1644383ad0b11d9351f19e8165c9bd"} err="failed to get container status \"7e4829fade3a4652b85490a0c17a85381b1644383ad0b11d9351f19e8165c9bd\": rpc error: code = NotFound desc = could not find container \"7e4829fade3a4652b85490a0c17a85381b1644383ad0b11d9351f19e8165c9bd\": container with ID starting with 7e4829fade3a4652b85490a0c17a85381b1644383ad0b11d9351f19e8165c9bd not found: ID does not exist" Apr 16 18:53:14.509941 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.509900 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="131857a1-bbae-4961-9aad-dbf5d35f2f7a" path="/var/lib/kubelet/pods/131857a1-bbae-4961-9aad-dbf5d35f2f7a/volumes" Apr 16 18:53:14.510661 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:14.510634 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f5b23b3-20a2-4783-8213-bc6d78b40f1e" path="/var/lib/kubelet/pods/3f5b23b3-20a2-4783-8213-bc6d78b40f1e/volumes" Apr 16 18:53:15.055776 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:15.055712 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" Apr 16 18:53:15.055776 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:15.055777 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" Apr 16 18:53:15.057480 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:15.057436 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" podUID="c2aca90b-c1f9-4902-bc21-d2d6fdb992b7" 
containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8001/health\": dial tcp 10.134.0.56:8001: connect: connection refused" Apr 16 18:53:15.078169 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:15.078134 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" Apr 16 18:53:15.085510 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:15.085480 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" Apr 16 18:53:15.085768 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:15.085753 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" Apr 16 18:53:15.087040 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:15.087005 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" podUID="f1ac7c66-cb5a-4c19-b56c-8e797b883e5c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8000/health\": dial tcp 10.134.0.57:8000: connect: connection refused" Apr 16 18:53:23.320696 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:23.320644 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="8e5697f0-bd88-45a8-a73b-2b20ff90bd29" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8000/health\": dial tcp 10.134.0.54:8000: connect: connection refused" Apr 16 18:53:25.055643 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:25.055588 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" podUID="c2aca90b-c1f9-4902-bc21-d2d6fdb992b7" containerName="main" probeResult="failure" 
output="Get \"https://10.134.0.56:8001/health\": dial tcp 10.134.0.56:8001: connect: connection refused" Apr 16 18:53:25.085590 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:25.085536 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" podUID="f1ac7c66-cb5a-4c19-b56c-8e797b883e5c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8000/health\": dial tcp 10.134.0.57:8000: connect: connection refused" Apr 16 18:53:33.321033 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:33.320925 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="8e5697f0-bd88-45a8-a73b-2b20ff90bd29" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8000/health\": dial tcp 10.134.0.54:8000: connect: connection refused" Apr 16 18:53:35.057158 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:35.057110 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" podUID="c2aca90b-c1f9-4902-bc21-d2d6fdb992b7" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8001/health\": dial tcp 10.134.0.56:8001: connect: connection refused" Apr 16 18:53:35.085688 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:35.085637 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" podUID="f1ac7c66-cb5a-4c19-b56c-8e797b883e5c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8000/health\": dial tcp 10.134.0.57:8000: connect: connection refused" Apr 16 18:53:43.321229 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:43.321167 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" 
podUID="8e5697f0-bd88-45a8-a73b-2b20ff90bd29" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8000/health\": dial tcp 10.134.0.54:8000: connect: connection refused" Apr 16 18:53:45.056137 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:45.056084 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" podUID="c2aca90b-c1f9-4902-bc21-d2d6fdb992b7" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8001/health\": dial tcp 10.134.0.56:8001: connect: connection refused" Apr 16 18:53:45.086200 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:45.086142 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" podUID="f1ac7c66-cb5a-4c19-b56c-8e797b883e5c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8000/health\": dial tcp 10.134.0.57:8000: connect: connection refused" Apr 16 18:53:53.321408 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:53.321360 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="8e5697f0-bd88-45a8-a73b-2b20ff90bd29" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8000/health\": dial tcp 10.134.0.54:8000: connect: connection refused" Apr 16 18:53:55.056152 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:55.056105 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" podUID="c2aca90b-c1f9-4902-bc21-d2d6fdb992b7" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8001/health\": dial tcp 10.134.0.56:8001: connect: connection refused" Apr 16 18:53:55.085518 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:53:55.085474 2576 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" podUID="f1ac7c66-cb5a-4c19-b56c-8e797b883e5c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8000/health\": dial tcp 10.134.0.57:8000: connect: connection refused" Apr 16 18:54:03.321158 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:54:03.321104 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="8e5697f0-bd88-45a8-a73b-2b20ff90bd29" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8000/health\": dial tcp 10.134.0.54:8000: connect: connection refused" Apr 16 18:54:05.056427 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:54:05.056363 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" podUID="c2aca90b-c1f9-4902-bc21-d2d6fdb992b7" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8001/health\": dial tcp 10.134.0.56:8001: connect: connection refused" Apr 16 18:54:05.086158 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:54:05.086106 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" podUID="f1ac7c66-cb5a-4c19-b56c-8e797b883e5c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8000/health\": dial tcp 10.134.0.57:8000: connect: connection refused" Apr 16 18:54:13.321421 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:54:13.321363 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="8e5697f0-bd88-45a8-a73b-2b20ff90bd29" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8000/health\": dial tcp 10.134.0.54:8000: connect: connection refused" Apr 16 18:54:15.055734 ip-10-0-140-154 kubenswrapper[2576]: I0416 
18:54:15.055682 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" podUID="c2aca90b-c1f9-4902-bc21-d2d6fdb992b7" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8001/health\": dial tcp 10.134.0.56:8001: connect: connection refused" Apr 16 18:54:15.085983 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:54:15.085934 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" podUID="f1ac7c66-cb5a-4c19-b56c-8e797b883e5c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8000/health\": dial tcp 10.134.0.57:8000: connect: connection refused" Apr 16 18:54:23.320801 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:54:23.320759 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="8e5697f0-bd88-45a8-a73b-2b20ff90bd29" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8000/health\": dial tcp 10.134.0.54:8000: connect: connection refused" Apr 16 18:54:25.055723 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:54:25.055674 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" podUID="c2aca90b-c1f9-4902-bc21-d2d6fdb992b7" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8001/health\": dial tcp 10.134.0.56:8001: connect: connection refused" Apr 16 18:54:25.085652 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:54:25.085612 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" podUID="f1ac7c66-cb5a-4c19-b56c-8e797b883e5c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8000/health\": dial tcp 10.134.0.57:8000: connect: connection refused" Apr 16 
18:54:33.320819 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:54:33.320769 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="8e5697f0-bd88-45a8-a73b-2b20ff90bd29" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8000/health\": dial tcp 10.134.0.54:8000: connect: connection refused" Apr 16 18:54:35.056441 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:54:35.056385 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" podUID="c2aca90b-c1f9-4902-bc21-d2d6fdb992b7" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8001/health\": dial tcp 10.134.0.56:8001: connect: connection refused" Apr 16 18:54:35.085553 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:54:35.085509 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" podUID="f1ac7c66-cb5a-4c19-b56c-8e797b883e5c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8000/health\": dial tcp 10.134.0.57:8000: connect: connection refused" Apr 16 18:54:43.320732 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:54:43.320683 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="8e5697f0-bd88-45a8-a73b-2b20ff90bd29" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8000/health\": dial tcp 10.134.0.54:8000: connect: connection refused" Apr 16 18:54:45.056555 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:54:45.056501 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" podUID="c2aca90b-c1f9-4902-bc21-d2d6fdb992b7" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8001/health\": dial tcp 
10.134.0.56:8001: connect: connection refused" Apr 16 18:54:45.086123 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:54:45.086075 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" podUID="f1ac7c66-cb5a-4c19-b56c-8e797b883e5c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8000/health\": dial tcp 10.134.0.57:8000: connect: connection refused" Apr 16 18:54:53.321018 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:54:53.320967 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="8e5697f0-bd88-45a8-a73b-2b20ff90bd29" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8000/health\": dial tcp 10.134.0.54:8000: connect: connection refused" Apr 16 18:54:55.056181 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:54:55.056137 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" podUID="c2aca90b-c1f9-4902-bc21-d2d6fdb992b7" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8001/health\": dial tcp 10.134.0.56:8001: connect: connection refused" Apr 16 18:54:55.086381 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:54:55.086343 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" podUID="f1ac7c66-cb5a-4c19-b56c-8e797b883e5c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8000/health\": dial tcp 10.134.0.57:8000: connect: connection refused" Apr 16 18:55:03.320922 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:03.320815 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="8e5697f0-bd88-45a8-a73b-2b20ff90bd29" containerName="main" 
probeResult="failure" output="Get \"https://10.134.0.54:8000/health\": dial tcp 10.134.0.54:8000: connect: connection refused" Apr 16 18:55:05.056125 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:05.056068 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" podUID="c2aca90b-c1f9-4902-bc21-d2d6fdb992b7" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8001/health\": dial tcp 10.134.0.56:8001: connect: connection refused" Apr 16 18:55:05.086457 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:05.086406 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" podUID="f1ac7c66-cb5a-4c19-b56c-8e797b883e5c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8000/health\": dial tcp 10.134.0.57:8000: connect: connection refused" Apr 16 18:55:13.320794 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:13.320730 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="8e5697f0-bd88-45a8-a73b-2b20ff90bd29" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8000/health\": dial tcp 10.134.0.54:8000: connect: connection refused" Apr 16 18:55:15.056439 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:15.056397 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" podUID="c2aca90b-c1f9-4902-bc21-d2d6fdb992b7" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8001/health\": dial tcp 10.134.0.56:8001: connect: connection refused" Apr 16 18:55:15.086145 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:15.086099 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" 
podUID="f1ac7c66-cb5a-4c19-b56c-8e797b883e5c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8000/health\": dial tcp 10.134.0.57:8000: connect: connection refused" Apr 16 18:55:23.320728 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:23.320684 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="8e5697f0-bd88-45a8-a73b-2b20ff90bd29" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8000/health\": dial tcp 10.134.0.54:8000: connect: connection refused" Apr 16 18:55:25.055836 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:25.055784 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" podUID="c2aca90b-c1f9-4902-bc21-d2d6fdb992b7" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8001/health\": dial tcp 10.134.0.56:8001: connect: connection refused" Apr 16 18:55:25.086149 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:25.086090 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" podUID="f1ac7c66-cb5a-4c19-b56c-8e797b883e5c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8000/health\": dial tcp 10.134.0.57:8000: connect: connection refused" Apr 16 18:55:33.330567 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:33.330530 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:55:33.338757 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:33.338722 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:55:35.056392 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:35.056331 2576 prober.go:120] 
"Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" podUID="c2aca90b-c1f9-4902-bc21-d2d6fdb992b7" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8001/health\": dial tcp 10.134.0.56:8001: connect: connection refused" Apr 16 18:55:35.086412 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:35.086363 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" podUID="f1ac7c66-cb5a-4c19-b56c-8e797b883e5c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8000/health\": dial tcp 10.134.0.57:8000: connect: connection refused" Apr 16 18:55:45.056412 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:45.056351 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" podUID="c2aca90b-c1f9-4902-bc21-d2d6fdb992b7" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8001/health\": dial tcp 10.134.0.56:8001: connect: connection refused" Apr 16 18:55:45.085945 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:45.085890 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" podUID="f1ac7c66-cb5a-4c19-b56c-8e797b883e5c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8000/health\": dial tcp 10.134.0.57:8000: connect: connection refused" Apr 16 18:55:49.526834 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:49.526797 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 16 18:55:49.527309 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:49.527090 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" 
podUID="8e5697f0-bd88-45a8-a73b-2b20ff90bd29" containerName="main" containerID="cri-o://8c4ce71fdad3ccf6521fbe2aebbaae7ed22a7435d0ac4dff0bd9fdbac975628b" gracePeriod=30 Apr 16 18:55:50.797229 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:50.797192 2576 generic.go:358] "Generic (PLEG): container finished" podID="8e5697f0-bd88-45a8-a73b-2b20ff90bd29" containerID="8c4ce71fdad3ccf6521fbe2aebbaae7ed22a7435d0ac4dff0bd9fdbac975628b" exitCode=0 Apr 16 18:55:50.797680 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:50.797268 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"8e5697f0-bd88-45a8-a73b-2b20ff90bd29","Type":"ContainerDied","Data":"8c4ce71fdad3ccf6521fbe2aebbaae7ed22a7435d0ac4dff0bd9fdbac975628b"} Apr 16 18:55:50.890863 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:50.890838 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:55:50.947505 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:50.947473 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8e5697f0-bd88-45a8-a73b-2b20ff90bd29-tls-certs\") pod \"8e5697f0-bd88-45a8-a73b-2b20ff90bd29\" (UID: \"8e5697f0-bd88-45a8-a73b-2b20ff90bd29\") " Apr 16 18:55:50.947697 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:50.947541 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8e5697f0-bd88-45a8-a73b-2b20ff90bd29-dshm\") pod \"8e5697f0-bd88-45a8-a73b-2b20ff90bd29\" (UID: \"8e5697f0-bd88-45a8-a73b-2b20ff90bd29\") " Apr 16 18:55:50.947697 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:50.947570 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5k6c\" (UniqueName: 
\"kubernetes.io/projected/8e5697f0-bd88-45a8-a73b-2b20ff90bd29-kube-api-access-q5k6c\") pod \"8e5697f0-bd88-45a8-a73b-2b20ff90bd29\" (UID: \"8e5697f0-bd88-45a8-a73b-2b20ff90bd29\") " Apr 16 18:55:50.947829 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:50.947690 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8e5697f0-bd88-45a8-a73b-2b20ff90bd29-kserve-provision-location\") pod \"8e5697f0-bd88-45a8-a73b-2b20ff90bd29\" (UID: \"8e5697f0-bd88-45a8-a73b-2b20ff90bd29\") " Apr 16 18:55:50.947829 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:50.947751 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8e5697f0-bd88-45a8-a73b-2b20ff90bd29-model-cache\") pod \"8e5697f0-bd88-45a8-a73b-2b20ff90bd29\" (UID: \"8e5697f0-bd88-45a8-a73b-2b20ff90bd29\") " Apr 16 18:55:50.947829 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:50.947798 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8e5697f0-bd88-45a8-a73b-2b20ff90bd29-home\") pod \"8e5697f0-bd88-45a8-a73b-2b20ff90bd29\" (UID: \"8e5697f0-bd88-45a8-a73b-2b20ff90bd29\") " Apr 16 18:55:50.948057 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:50.948032 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e5697f0-bd88-45a8-a73b-2b20ff90bd29-model-cache" (OuterVolumeSpecName: "model-cache") pod "8e5697f0-bd88-45a8-a73b-2b20ff90bd29" (UID: "8e5697f0-bd88-45a8-a73b-2b20ff90bd29"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:55:50.948145 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:50.948131 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8e5697f0-bd88-45a8-a73b-2b20ff90bd29-model-cache\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:55:50.948300 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:50.948275 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e5697f0-bd88-45a8-a73b-2b20ff90bd29-home" (OuterVolumeSpecName: "home") pod "8e5697f0-bd88-45a8-a73b-2b20ff90bd29" (UID: "8e5697f0-bd88-45a8-a73b-2b20ff90bd29"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:55:50.949978 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:50.949950 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e5697f0-bd88-45a8-a73b-2b20ff90bd29-kube-api-access-q5k6c" (OuterVolumeSpecName: "kube-api-access-q5k6c") pod "8e5697f0-bd88-45a8-a73b-2b20ff90bd29" (UID: "8e5697f0-bd88-45a8-a73b-2b20ff90bd29"). InnerVolumeSpecName "kube-api-access-q5k6c". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:55:50.950230 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:50.950196 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e5697f0-bd88-45a8-a73b-2b20ff90bd29-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "8e5697f0-bd88-45a8-a73b-2b20ff90bd29" (UID: "8e5697f0-bd88-45a8-a73b-2b20ff90bd29"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:55:50.950230 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:50.950204 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e5697f0-bd88-45a8-a73b-2b20ff90bd29-dshm" (OuterVolumeSpecName: "dshm") pod "8e5697f0-bd88-45a8-a73b-2b20ff90bd29" (UID: "8e5697f0-bd88-45a8-a73b-2b20ff90bd29"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:55:51.009416 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:51.009370 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e5697f0-bd88-45a8-a73b-2b20ff90bd29-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8e5697f0-bd88-45a8-a73b-2b20ff90bd29" (UID: "8e5697f0-bd88-45a8-a73b-2b20ff90bd29"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:55:51.049374 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:51.049339 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8e5697f0-bd88-45a8-a73b-2b20ff90bd29-tls-certs\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:55:51.049374 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:51.049370 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8e5697f0-bd88-45a8-a73b-2b20ff90bd29-dshm\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:55:51.049374 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:51.049380 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q5k6c\" (UniqueName: \"kubernetes.io/projected/8e5697f0-bd88-45a8-a73b-2b20ff90bd29-kube-api-access-q5k6c\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:55:51.049593 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:51.049391 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8e5697f0-bd88-45a8-a73b-2b20ff90bd29-kserve-provision-location\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:55:51.049593 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:51.049400 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8e5697f0-bd88-45a8-a73b-2b20ff90bd29-home\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:55:51.803003 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:51.802965 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"8e5697f0-bd88-45a8-a73b-2b20ff90bd29","Type":"ContainerDied","Data":"802ef8bfdb4d30fc010b988457be4193834f591bc79858c9bb21d3d396122396"}
Apr 16 18:55:51.803450 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:51.803014 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 18:55:51.803450 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:51.803023 2576 scope.go:117] "RemoveContainer" containerID="8c4ce71fdad3ccf6521fbe2aebbaae7ed22a7435d0ac4dff0bd9fdbac975628b"
Apr 16 18:55:51.826401 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:51.826356 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 16 18:55:51.830438 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:51.830406 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 16 18:55:51.833070 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:51.833043 2576 scope.go:117] "RemoveContainer" containerID="600833f62324e0968b744b8522d97b5ec7872fd5381940aadacaa759850986a7"
Apr 16 18:55:51.848667 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:51.848633 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"]
Apr 16 18:55:51.848992 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:51.848965 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podUID="71e8f634-277f-41a1-9c62-78279d199716" containerName="main" containerID="cri-o://5a99dc081a5cadc239b43ade7cc4e3176772f82abe9bf180220e6c12e029d50e" gracePeriod=30
Apr 16 18:55:52.507243 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:52.507208 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e5697f0-bd88-45a8-a73b-2b20ff90bd29" path="/var/lib/kubelet/pods/8e5697f0-bd88-45a8-a73b-2b20ff90bd29/volumes"
Apr 16 18:55:52.845840 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:52.845725 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75"]
Apr 16 18:55:52.846820 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:52.846790 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="131857a1-bbae-4961-9aad-dbf5d35f2f7a" containerName="llm-d-routing-sidecar"
Apr 16 18:55:52.846820 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:52.846819 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="131857a1-bbae-4961-9aad-dbf5d35f2f7a" containerName="llm-d-routing-sidecar"
Apr 16 18:55:52.847021 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:52.846840 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f5b23b3-20a2-4783-8213-bc6d78b40f1e" containerName="main"
Apr 16 18:55:52.847021 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:52.846849 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f5b23b3-20a2-4783-8213-bc6d78b40f1e" containerName="main"
Apr 16 18:55:52.847021 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:52.846859 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e5697f0-bd88-45a8-a73b-2b20ff90bd29" containerName="main"
Apr 16 18:55:52.847021 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:52.846867 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e5697f0-bd88-45a8-a73b-2b20ff90bd29" containerName="main"
Apr 16 18:55:52.847021 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:52.846896 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="131857a1-bbae-4961-9aad-dbf5d35f2f7a" containerName="main"
Apr 16 18:55:52.847021 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:52.846904 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="131857a1-bbae-4961-9aad-dbf5d35f2f7a" containerName="main"
Apr 16 18:55:52.847021 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:52.846916 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f5b23b3-20a2-4783-8213-bc6d78b40f1e" containerName="storage-initializer"
Apr 16 18:55:52.847021 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:52.846924 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f5b23b3-20a2-4783-8213-bc6d78b40f1e" containerName="storage-initializer"
Apr 16 18:55:52.847021 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:52.846932 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="131857a1-bbae-4961-9aad-dbf5d35f2f7a" containerName="storage-initializer"
Apr 16 18:55:52.847021 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:52.846940 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="131857a1-bbae-4961-9aad-dbf5d35f2f7a" containerName="storage-initializer"
Apr 16 18:55:52.847021 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:52.846950 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e5697f0-bd88-45a8-a73b-2b20ff90bd29" containerName="storage-initializer"
Apr 16 18:55:52.847021 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:52.846958 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e5697f0-bd88-45a8-a73b-2b20ff90bd29" containerName="storage-initializer"
Apr 16 18:55:52.847596 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:52.847056 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="3f5b23b3-20a2-4783-8213-bc6d78b40f1e" containerName="main"
Apr 16 18:55:52.847596 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:52.847070 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="131857a1-bbae-4961-9aad-dbf5d35f2f7a" containerName="main"
Apr 16 18:55:52.847596 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:52.847083 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="131857a1-bbae-4961-9aad-dbf5d35f2f7a" containerName="llm-d-routing-sidecar"
Apr 16 18:55:52.847596 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:52.847093 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e5697f0-bd88-45a8-a73b-2b20ff90bd29" containerName="main"
Apr 16 18:55:52.852710 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:52.852684 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75"
Apr 16 18:55:52.856223 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:52.856198 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\""
Apr 16 18:55:52.856962 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:52.856936 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-epp-sa-dockercfg-fn5wr\""
Apr 16 18:55:52.875399 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:52.875369 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75"]
Apr 16 18:55:52.969495 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:52.969458 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/97a90d5c-e8b3-421d-bba8-75cc3e2d7e32-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75\" (UID: \"97a90d5c-e8b3-421d-bba8-75cc3e2d7e32\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75"
Apr 16 18:55:52.969693 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:52.969505 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97a90d5c-e8b3-421d-bba8-75cc3e2d7e32-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75\" (UID: \"97a90d5c-e8b3-421d-bba8-75cc3e2d7e32\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75"
Apr 16 18:55:52.969693 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:52.969576 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/97a90d5c-e8b3-421d-bba8-75cc3e2d7e32-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75\" (UID: \"97a90d5c-e8b3-421d-bba8-75cc3e2d7e32\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75"
Apr 16 18:55:52.969859 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:52.969729 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/97a90d5c-e8b3-421d-bba8-75cc3e2d7e32-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75\" (UID: \"97a90d5c-e8b3-421d-bba8-75cc3e2d7e32\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75"
Apr 16 18:55:52.969859 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:52.969815 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kbgn\" (UniqueName: \"kubernetes.io/projected/97a90d5c-e8b3-421d-bba8-75cc3e2d7e32-kube-api-access-4kbgn\") pod \"scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75\" (UID: \"97a90d5c-e8b3-421d-bba8-75cc3e2d7e32\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75"
Apr 16 18:55:52.969859 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:52.969849 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/97a90d5c-e8b3-421d-bba8-75cc3e2d7e32-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75\" (UID: \"97a90d5c-e8b3-421d-bba8-75cc3e2d7e32\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75"
Apr 16 18:55:53.071044 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.071007 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/97a90d5c-e8b3-421d-bba8-75cc3e2d7e32-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75\" (UID: \"97a90d5c-e8b3-421d-bba8-75cc3e2d7e32\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75"
Apr 16 18:55:53.071237 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.071055 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97a90d5c-e8b3-421d-bba8-75cc3e2d7e32-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75\" (UID: \"97a90d5c-e8b3-421d-bba8-75cc3e2d7e32\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75"
Apr 16 18:55:53.071237 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.071082 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/97a90d5c-e8b3-421d-bba8-75cc3e2d7e32-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75\" (UID: \"97a90d5c-e8b3-421d-bba8-75cc3e2d7e32\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75"
Apr 16 18:55:53.071237 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.071151 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/97a90d5c-e8b3-421d-bba8-75cc3e2d7e32-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75\" (UID: \"97a90d5c-e8b3-421d-bba8-75cc3e2d7e32\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75"
Apr 16 18:55:53.071237 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.071198 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4kbgn\" (UniqueName: \"kubernetes.io/projected/97a90d5c-e8b3-421d-bba8-75cc3e2d7e32-kube-api-access-4kbgn\") pod \"scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75\" (UID: \"97a90d5c-e8b3-421d-bba8-75cc3e2d7e32\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75"
Apr 16 18:55:53.071237 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.071228 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/97a90d5c-e8b3-421d-bba8-75cc3e2d7e32-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75\" (UID: \"97a90d5c-e8b3-421d-bba8-75cc3e2d7e32\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75"
Apr 16 18:55:53.071520 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.071420 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/97a90d5c-e8b3-421d-bba8-75cc3e2d7e32-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75\" (UID: \"97a90d5c-e8b3-421d-bba8-75cc3e2d7e32\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75"
Apr 16 18:55:53.071520 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.071506 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97a90d5c-e8b3-421d-bba8-75cc3e2d7e32-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75\" (UID: \"97a90d5c-e8b3-421d-bba8-75cc3e2d7e32\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75"
Apr 16 18:55:53.071628 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.071584 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/97a90d5c-e8b3-421d-bba8-75cc3e2d7e32-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75\" (UID: \"97a90d5c-e8b3-421d-bba8-75cc3e2d7e32\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75"
Apr 16 18:55:53.071834 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.071809 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/97a90d5c-e8b3-421d-bba8-75cc3e2d7e32-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75\" (UID: \"97a90d5c-e8b3-421d-bba8-75cc3e2d7e32\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75"
Apr 16 18:55:53.073854 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.073825 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/97a90d5c-e8b3-421d-bba8-75cc3e2d7e32-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75\" (UID: \"97a90d5c-e8b3-421d-bba8-75cc3e2d7e32\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75"
Apr 16 18:55:53.082555 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.082522 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kbgn\" (UniqueName: \"kubernetes.io/projected/97a90d5c-e8b3-421d-bba8-75cc3e2d7e32-kube-api-access-4kbgn\") pod \"scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75\" (UID: \"97a90d5c-e8b3-421d-bba8-75cc3e2d7e32\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75"
Apr 16 18:55:53.166351 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.166305 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75"
Apr 16 18:55:53.198798 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.198772 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 18:55:53.273342 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.273313 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/71e8f634-277f-41a1-9c62-78279d199716-model-cache\") pod \"71e8f634-277f-41a1-9c62-78279d199716\" (UID: \"71e8f634-277f-41a1-9c62-78279d199716\") "
Apr 16 18:55:53.273501 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.273349 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71e8f634-277f-41a1-9c62-78279d199716-kserve-provision-location\") pod \"71e8f634-277f-41a1-9c62-78279d199716\" (UID: \"71e8f634-277f-41a1-9c62-78279d199716\") "
Apr 16 18:55:53.273501 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.273389 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2fmp\" (UniqueName: \"kubernetes.io/projected/71e8f634-277f-41a1-9c62-78279d199716-kube-api-access-b2fmp\") pod \"71e8f634-277f-41a1-9c62-78279d199716\" (UID: \"71e8f634-277f-41a1-9c62-78279d199716\") "
Apr 16 18:55:53.273501 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.273420 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/71e8f634-277f-41a1-9c62-78279d199716-home\") pod \"71e8f634-277f-41a1-9c62-78279d199716\" (UID: \"71e8f634-277f-41a1-9c62-78279d199716\") "
Apr 16 18:55:53.273501 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.273493 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/71e8f634-277f-41a1-9c62-78279d199716-dshm\") pod \"71e8f634-277f-41a1-9c62-78279d199716\" (UID: \"71e8f634-277f-41a1-9c62-78279d199716\") "
Apr 16 18:55:53.273766 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.273523 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/71e8f634-277f-41a1-9c62-78279d199716-tls-certs\") pod \"71e8f634-277f-41a1-9c62-78279d199716\" (UID: \"71e8f634-277f-41a1-9c62-78279d199716\") "
Apr 16 18:55:53.274183 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.273960 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71e8f634-277f-41a1-9c62-78279d199716-model-cache" (OuterVolumeSpecName: "model-cache") pod "71e8f634-277f-41a1-9c62-78279d199716" (UID: "71e8f634-277f-41a1-9c62-78279d199716"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:55:53.274183 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.274164 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71e8f634-277f-41a1-9c62-78279d199716-home" (OuterVolumeSpecName: "home") pod "71e8f634-277f-41a1-9c62-78279d199716" (UID: "71e8f634-277f-41a1-9c62-78279d199716"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:55:53.276977 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.276936 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71e8f634-277f-41a1-9c62-78279d199716-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "71e8f634-277f-41a1-9c62-78279d199716" (UID: "71e8f634-277f-41a1-9c62-78279d199716"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:55:53.277099 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.277079 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71e8f634-277f-41a1-9c62-78279d199716-dshm" (OuterVolumeSpecName: "dshm") pod "71e8f634-277f-41a1-9c62-78279d199716" (UID: "71e8f634-277f-41a1-9c62-78279d199716"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:55:53.277167 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.277124 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71e8f634-277f-41a1-9c62-78279d199716-kube-api-access-b2fmp" (OuterVolumeSpecName: "kube-api-access-b2fmp") pod "71e8f634-277f-41a1-9c62-78279d199716" (UID: "71e8f634-277f-41a1-9c62-78279d199716"). InnerVolumeSpecName "kube-api-access-b2fmp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:55:53.307669 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.307623 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71e8f634-277f-41a1-9c62-78279d199716-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "71e8f634-277f-41a1-9c62-78279d199716" (UID: "71e8f634-277f-41a1-9c62-78279d199716"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:55:53.313583 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.313443 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75"]
Apr 16 18:55:53.316317 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:55:53.316287 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97a90d5c_e8b3_421d_bba8_75cc3e2d7e32.slice/crio-49c764a98964a541b57523268818de6b078866117e861ca49cddf4a41ac2a1f0 WatchSource:0}: Error finding container 49c764a98964a541b57523268818de6b078866117e861ca49cddf4a41ac2a1f0: Status 404 returned error can't find the container with id 49c764a98964a541b57523268818de6b078866117e861ca49cddf4a41ac2a1f0
Apr 16 18:55:53.374842 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.374816 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/71e8f634-277f-41a1-9c62-78279d199716-model-cache\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:55:53.374842 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.374846 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71e8f634-277f-41a1-9c62-78279d199716-kserve-provision-location\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:55:53.375034 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.374862 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b2fmp\" (UniqueName: \"kubernetes.io/projected/71e8f634-277f-41a1-9c62-78279d199716-kube-api-access-b2fmp\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:55:53.375034 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.374878 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/71e8f634-277f-41a1-9c62-78279d199716-home\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:55:53.375034 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.374892 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/71e8f634-277f-41a1-9c62-78279d199716-dshm\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:55:53.375034 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.374902 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/71e8f634-277f-41a1-9c62-78279d199716-tls-certs\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\""
Apr 16 18:55:53.815213 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.815179 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75" event={"ID":"97a90d5c-e8b3-421d-bba8-75cc3e2d7e32","Type":"ContainerStarted","Data":"77f09cf3038f46f90742b94b696e2242fe98d7d90859e52ef2306ff73bc8eebc"}
Apr 16 18:55:53.815213 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.815221 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75" event={"ID":"97a90d5c-e8b3-421d-bba8-75cc3e2d7e32","Type":"ContainerStarted","Data":"49c764a98964a541b57523268818de6b078866117e861ca49cddf4a41ac2a1f0"}
Apr 16 18:55:53.816934 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.816900 2576 generic.go:358] "Generic (PLEG): container finished" podID="71e8f634-277f-41a1-9c62-78279d199716" containerID="5a99dc081a5cadc239b43ade7cc4e3176772f82abe9bf180220e6c12e029d50e" exitCode=0
Apr 16 18:55:53.817073 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.816962 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 18:55:53.817073 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.816975 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"71e8f634-277f-41a1-9c62-78279d199716","Type":"ContainerDied","Data":"5a99dc081a5cadc239b43ade7cc4e3176772f82abe9bf180220e6c12e029d50e"}
Apr 16 18:55:53.817073 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.817012 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"71e8f634-277f-41a1-9c62-78279d199716","Type":"ContainerDied","Data":"86d7aa5361092873fe5834ef59bec53dadb1642bb37cf5ff93a6b112ed094473"}
Apr 16 18:55:53.817073 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.817035 2576 scope.go:117] "RemoveContainer" containerID="5a99dc081a5cadc239b43ade7cc4e3176772f82abe9bf180220e6c12e029d50e"
Apr 16 18:55:53.839236 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.839189 2576 scope.go:117] "RemoveContainer" containerID="787d43c0b22a235203bcd35fb634d83e6cbfc28ec0628a7dc9322d46dc289fb2"
Apr 16 18:55:53.854942 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.854901 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"]
Apr 16 18:55:53.859781 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.859733 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"]
Apr 16 18:55:53.870915 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.870888 2576 scope.go:117] "RemoveContainer" containerID="5a99dc081a5cadc239b43ade7cc4e3176772f82abe9bf180220e6c12e029d50e"
Apr 16 18:55:53.871272 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:55:53.871245 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a99dc081a5cadc239b43ade7cc4e3176772f82abe9bf180220e6c12e029d50e\": container with ID starting with 5a99dc081a5cadc239b43ade7cc4e3176772f82abe9bf180220e6c12e029d50e not found: ID does not exist" containerID="5a99dc081a5cadc239b43ade7cc4e3176772f82abe9bf180220e6c12e029d50e"
Apr 16 18:55:53.871369 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.871283 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a99dc081a5cadc239b43ade7cc4e3176772f82abe9bf180220e6c12e029d50e"} err="failed to get container status \"5a99dc081a5cadc239b43ade7cc4e3176772f82abe9bf180220e6c12e029d50e\": rpc error: code = NotFound desc = could not find container \"5a99dc081a5cadc239b43ade7cc4e3176772f82abe9bf180220e6c12e029d50e\": container with ID starting with 5a99dc081a5cadc239b43ade7cc4e3176772f82abe9bf180220e6c12e029d50e not found: ID does not exist"
Apr 16 18:55:53.871369 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.871313 2576 scope.go:117] "RemoveContainer" containerID="787d43c0b22a235203bcd35fb634d83e6cbfc28ec0628a7dc9322d46dc289fb2"
Apr 16 18:55:53.871614 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:55:53.871596 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"787d43c0b22a235203bcd35fb634d83e6cbfc28ec0628a7dc9322d46dc289fb2\": container with ID starting with 787d43c0b22a235203bcd35fb634d83e6cbfc28ec0628a7dc9322d46dc289fb2 not found: ID does not exist" containerID="787d43c0b22a235203bcd35fb634d83e6cbfc28ec0628a7dc9322d46dc289fb2"
Apr 16 18:55:53.871654 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:53.871620 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"787d43c0b22a235203bcd35fb634d83e6cbfc28ec0628a7dc9322d46dc289fb2"} err="failed to get container status \"787d43c0b22a235203bcd35fb634d83e6cbfc28ec0628a7dc9322d46dc289fb2\": rpc error: code = NotFound desc = could not find container \"787d43c0b22a235203bcd35fb634d83e6cbfc28ec0628a7dc9322d46dc289fb2\": container with ID starting with 787d43c0b22a235203bcd35fb634d83e6cbfc28ec0628a7dc9322d46dc289fb2 not found: ID does not exist"
Apr 16 18:55:54.506510 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:54.506431 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71e8f634-277f-41a1-9c62-78279d199716" path="/var/lib/kubelet/pods/71e8f634-277f-41a1-9c62-78279d199716/volumes"
Apr 16 18:55:54.823283 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:54.823192 2576 generic.go:358] "Generic (PLEG): container finished" podID="97a90d5c-e8b3-421d-bba8-75cc3e2d7e32" containerID="77f09cf3038f46f90742b94b696e2242fe98d7d90859e52ef2306ff73bc8eebc" exitCode=0
Apr 16 18:55:54.823283 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:54.823265 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75" event={"ID":"97a90d5c-e8b3-421d-bba8-75cc3e2d7e32","Type":"ContainerDied","Data":"77f09cf3038f46f90742b94b696e2242fe98d7d90859e52ef2306ff73bc8eebc"}
Apr 16 18:55:55.055986 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:55.055935 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" podUID="c2aca90b-c1f9-4902-bc21-d2d6fdb992b7" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8001/health\": dial tcp 10.134.0.56:8001: connect: connection refused"
Apr 16 18:55:55.085991 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:55.085889 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" podUID="f1ac7c66-cb5a-4c19-b56c-8e797b883e5c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8000/health\": dial tcp 10.134.0.57:8000: connect: connection refused"
Apr 16 18:55:55.829824 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:55.829790 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75" event={"ID":"97a90d5c-e8b3-421d-bba8-75cc3e2d7e32","Type":"ContainerStarted","Data":"fc8996fa4c6ea733a0fec0257008b965eb6209643b9e3d0b4626f457cf4fa622"}
Apr 16 18:55:55.830010 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:55.829831 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75" event={"ID":"97a90d5c-e8b3-421d-bba8-75cc3e2d7e32","Type":"ContainerStarted","Data":"045d9f95fde1200dc3f36972869417bf5655874fe8a5403b4494b4caaffd4b8a"}
Apr 16 18:55:55.830010 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:55.829916 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75"
Apr 16 18:55:55.852688 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:55:55.852638 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75" podStartSLOduration=3.852620131 podStartE2EDuration="3.852620131s" podCreationTimestamp="2026-04-16 18:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:55:55.850265436 +0000 UTC m=+1481.974793101" watchObservedRunningTime="2026-04-16 18:55:55.852620131 +0000 UTC m=+1481.977147794"
Apr 16 18:56:03.166819 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:03.166767 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75"
Apr 16 18:56:03.166819 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:03.166823 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75"
Apr 16 18:56:03.169676 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:03.169647 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75"
Apr 16 18:56:03.867448 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:03.867416 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75"
Apr 16 18:56:05.056550 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:05.056504 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" podUID="c2aca90b-c1f9-4902-bc21-d2d6fdb992b7" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8001/health\": dial tcp 10.134.0.56:8001: connect: connection refused"
Apr 16 18:56:05.086379 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:05.086327 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" podUID="f1ac7c66-cb5a-4c19-b56c-8e797b883e5c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8000/health\": dial tcp 10.134.0.57:8000: connect: connection refused"
Apr 16 18:56:14.548718 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:14.548691 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal_28c07a342a30c0e354482d7284dcbb2c/kube-rbac-proxy-crio/2.log"
Apr 16 18:56:14.555134 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:14.555110 2576 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-140-154.ec2.internal_28c07a342a30c0e354482d7284dcbb2c/kube-rbac-proxy-crio/2.log" Apr 16 18:56:15.056668 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:15.056617 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" podUID="c2aca90b-c1f9-4902-bc21-d2d6fdb992b7" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8001/health\": dial tcp 10.134.0.56:8001: connect: connection refused" Apr 16 18:56:15.085835 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:15.085785 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" podUID="f1ac7c66-cb5a-4c19-b56c-8e797b883e5c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8000/health\": dial tcp 10.134.0.57:8000: connect: connection refused" Apr 16 18:56:24.872683 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:24.872648 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75" Apr 16 18:56:25.065394 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:25.065355 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" Apr 16 18:56:25.083938 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:25.083911 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" Apr 16 18:56:25.097200 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:25.097163 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" Apr 16 18:56:25.105146 ip-10-0-140-154 
kubenswrapper[2576]: I0416 18:56:25.105120 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" Apr 16 18:56:26.636690 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:26.636650 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75"] Apr 16 18:56:26.637131 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:26.637035 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75" podUID="97a90d5c-e8b3-421d-bba8-75cc3e2d7e32" containerName="main" containerID="cri-o://045d9f95fde1200dc3f36972869417bf5655874fe8a5403b4494b4caaffd4b8a" gracePeriod=30 Apr 16 18:56:26.637208 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:26.637113 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75" podUID="97a90d5c-e8b3-421d-bba8-75cc3e2d7e32" containerName="tokenizer" containerID="cri-o://fc8996fa4c6ea733a0fec0257008b965eb6209643b9e3d0b4626f457cf4fa622" gracePeriod=30 Apr 16 18:56:26.964952 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:26.964918 2576 generic.go:358] "Generic (PLEG): container finished" podID="97a90d5c-e8b3-421d-bba8-75cc3e2d7e32" containerID="045d9f95fde1200dc3f36972869417bf5655874fe8a5403b4494b4caaffd4b8a" exitCode=0 Apr 16 18:56:26.965144 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:26.964998 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75" event={"ID":"97a90d5c-e8b3-421d-bba8-75cc3e2d7e32","Type":"ContainerDied","Data":"045d9f95fde1200dc3f36972869417bf5655874fe8a5403b4494b4caaffd4b8a"} Apr 16 18:56:28.108518 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:28.108478 
2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75" Apr 16 18:56:28.198562 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:28.198469 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/97a90d5c-e8b3-421d-bba8-75cc3e2d7e32-tokenizer-cache\") pod \"97a90d5c-e8b3-421d-bba8-75cc3e2d7e32\" (UID: \"97a90d5c-e8b3-421d-bba8-75cc3e2d7e32\") " Apr 16 18:56:28.198562 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:28.198511 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/97a90d5c-e8b3-421d-bba8-75cc3e2d7e32-tokenizer-uds\") pod \"97a90d5c-e8b3-421d-bba8-75cc3e2d7e32\" (UID: \"97a90d5c-e8b3-421d-bba8-75cc3e2d7e32\") " Apr 16 18:56:28.198562 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:28.198555 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97a90d5c-e8b3-421d-bba8-75cc3e2d7e32-kserve-provision-location\") pod \"97a90d5c-e8b3-421d-bba8-75cc3e2d7e32\" (UID: \"97a90d5c-e8b3-421d-bba8-75cc3e2d7e32\") " Apr 16 18:56:28.198877 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:28.198578 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/97a90d5c-e8b3-421d-bba8-75cc3e2d7e32-tls-certs\") pod \"97a90d5c-e8b3-421d-bba8-75cc3e2d7e32\" (UID: \"97a90d5c-e8b3-421d-bba8-75cc3e2d7e32\") " Apr 16 18:56:28.198877 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:28.198656 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kbgn\" (UniqueName: \"kubernetes.io/projected/97a90d5c-e8b3-421d-bba8-75cc3e2d7e32-kube-api-access-4kbgn\") pod 
\"97a90d5c-e8b3-421d-bba8-75cc3e2d7e32\" (UID: \"97a90d5c-e8b3-421d-bba8-75cc3e2d7e32\") " Apr 16 18:56:28.198877 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:28.198688 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/97a90d5c-e8b3-421d-bba8-75cc3e2d7e32-tokenizer-tmp\") pod \"97a90d5c-e8b3-421d-bba8-75cc3e2d7e32\" (UID: \"97a90d5c-e8b3-421d-bba8-75cc3e2d7e32\") " Apr 16 18:56:28.198877 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:28.198716 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97a90d5c-e8b3-421d-bba8-75cc3e2d7e32-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "97a90d5c-e8b3-421d-bba8-75cc3e2d7e32" (UID: "97a90d5c-e8b3-421d-bba8-75cc3e2d7e32"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:56:28.199014 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:28.198887 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97a90d5c-e8b3-421d-bba8-75cc3e2d7e32-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "97a90d5c-e8b3-421d-bba8-75cc3e2d7e32" (UID: "97a90d5c-e8b3-421d-bba8-75cc3e2d7e32"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:56:28.199056 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:28.199009 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/97a90d5c-e8b3-421d-bba8-75cc3e2d7e32-tokenizer-cache\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:56:28.199104 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:28.199058 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/97a90d5c-e8b3-421d-bba8-75cc3e2d7e32-tokenizer-uds\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:56:28.199189 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:28.199163 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97a90d5c-e8b3-421d-bba8-75cc3e2d7e32-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "97a90d5c-e8b3-421d-bba8-75cc3e2d7e32" (UID: "97a90d5c-e8b3-421d-bba8-75cc3e2d7e32"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:56:28.199375 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:28.199350 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97a90d5c-e8b3-421d-bba8-75cc3e2d7e32-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "97a90d5c-e8b3-421d-bba8-75cc3e2d7e32" (UID: "97a90d5c-e8b3-421d-bba8-75cc3e2d7e32"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:56:28.200901 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:28.200879 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97a90d5c-e8b3-421d-bba8-75cc3e2d7e32-kube-api-access-4kbgn" (OuterVolumeSpecName: "kube-api-access-4kbgn") pod "97a90d5c-e8b3-421d-bba8-75cc3e2d7e32" (UID: "97a90d5c-e8b3-421d-bba8-75cc3e2d7e32"). InnerVolumeSpecName "kube-api-access-4kbgn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:56:28.200901 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:28.200878 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97a90d5c-e8b3-421d-bba8-75cc3e2d7e32-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "97a90d5c-e8b3-421d-bba8-75cc3e2d7e32" (UID: "97a90d5c-e8b3-421d-bba8-75cc3e2d7e32"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:56:28.299934 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:28.299890 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97a90d5c-e8b3-421d-bba8-75cc3e2d7e32-kserve-provision-location\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:56:28.299934 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:28.299939 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/97a90d5c-e8b3-421d-bba8-75cc3e2d7e32-tls-certs\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:56:28.300139 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:28.299959 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4kbgn\" (UniqueName: \"kubernetes.io/projected/97a90d5c-e8b3-421d-bba8-75cc3e2d7e32-kube-api-access-4kbgn\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:56:28.300139 ip-10-0-140-154 kubenswrapper[2576]: 
I0416 18:56:28.299975 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/97a90d5c-e8b3-421d-bba8-75cc3e2d7e32-tokenizer-tmp\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:56:28.978610 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:28.978571 2576 generic.go:358] "Generic (PLEG): container finished" podID="97a90d5c-e8b3-421d-bba8-75cc3e2d7e32" containerID="fc8996fa4c6ea733a0fec0257008b965eb6209643b9e3d0b4626f457cf4fa622" exitCode=0 Apr 16 18:56:28.978610 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:28.978608 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75" event={"ID":"97a90d5c-e8b3-421d-bba8-75cc3e2d7e32","Type":"ContainerDied","Data":"fc8996fa4c6ea733a0fec0257008b965eb6209643b9e3d0b4626f457cf4fa622"} Apr 16 18:56:28.978897 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:28.978654 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75" event={"ID":"97a90d5c-e8b3-421d-bba8-75cc3e2d7e32","Type":"ContainerDied","Data":"49c764a98964a541b57523268818de6b078866117e861ca49cddf4a41ac2a1f0"} Apr 16 18:56:28.978897 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:28.978665 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75" Apr 16 18:56:28.978897 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:28.978674 2576 scope.go:117] "RemoveContainer" containerID="fc8996fa4c6ea733a0fec0257008b965eb6209643b9e3d0b4626f457cf4fa622" Apr 16 18:56:28.987960 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:28.987932 2576 scope.go:117] "RemoveContainer" containerID="045d9f95fde1200dc3f36972869417bf5655874fe8a5403b4494b4caaffd4b8a" Apr 16 18:56:28.997039 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:28.997010 2576 scope.go:117] "RemoveContainer" containerID="77f09cf3038f46f90742b94b696e2242fe98d7d90859e52ef2306ff73bc8eebc" Apr 16 18:56:28.998851 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:28.998828 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75"] Apr 16 18:56:29.006776 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:29.006723 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8pkb75"] Apr 16 18:56:29.007559 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:29.007535 2576 scope.go:117] "RemoveContainer" containerID="fc8996fa4c6ea733a0fec0257008b965eb6209643b9e3d0b4626f457cf4fa622" Apr 16 18:56:29.007938 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:56:29.007911 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc8996fa4c6ea733a0fec0257008b965eb6209643b9e3d0b4626f457cf4fa622\": container with ID starting with fc8996fa4c6ea733a0fec0257008b965eb6209643b9e3d0b4626f457cf4fa622 not found: ID does not exist" containerID="fc8996fa4c6ea733a0fec0257008b965eb6209643b9e3d0b4626f457cf4fa622" Apr 16 18:56:29.008017 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:29.007949 2576 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fc8996fa4c6ea733a0fec0257008b965eb6209643b9e3d0b4626f457cf4fa622"} err="failed to get container status \"fc8996fa4c6ea733a0fec0257008b965eb6209643b9e3d0b4626f457cf4fa622\": rpc error: code = NotFound desc = could not find container \"fc8996fa4c6ea733a0fec0257008b965eb6209643b9e3d0b4626f457cf4fa622\": container with ID starting with fc8996fa4c6ea733a0fec0257008b965eb6209643b9e3d0b4626f457cf4fa622 not found: ID does not exist" Apr 16 18:56:29.008017 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:29.007972 2576 scope.go:117] "RemoveContainer" containerID="045d9f95fde1200dc3f36972869417bf5655874fe8a5403b4494b4caaffd4b8a" Apr 16 18:56:29.008279 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:56:29.008262 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"045d9f95fde1200dc3f36972869417bf5655874fe8a5403b4494b4caaffd4b8a\": container with ID starting with 045d9f95fde1200dc3f36972869417bf5655874fe8a5403b4494b4caaffd4b8a not found: ID does not exist" containerID="045d9f95fde1200dc3f36972869417bf5655874fe8a5403b4494b4caaffd4b8a" Apr 16 18:56:29.008331 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:29.008288 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"045d9f95fde1200dc3f36972869417bf5655874fe8a5403b4494b4caaffd4b8a"} err="failed to get container status \"045d9f95fde1200dc3f36972869417bf5655874fe8a5403b4494b4caaffd4b8a\": rpc error: code = NotFound desc = could not find container \"045d9f95fde1200dc3f36972869417bf5655874fe8a5403b4494b4caaffd4b8a\": container with ID starting with 045d9f95fde1200dc3f36972869417bf5655874fe8a5403b4494b4caaffd4b8a not found: ID does not exist" Apr 16 18:56:29.008331 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:29.008306 2576 scope.go:117] "RemoveContainer" containerID="77f09cf3038f46f90742b94b696e2242fe98d7d90859e52ef2306ff73bc8eebc" Apr 16 18:56:29.008565 ip-10-0-140-154 
kubenswrapper[2576]: E0416 18:56:29.008548 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77f09cf3038f46f90742b94b696e2242fe98d7d90859e52ef2306ff73bc8eebc\": container with ID starting with 77f09cf3038f46f90742b94b696e2242fe98d7d90859e52ef2306ff73bc8eebc not found: ID does not exist" containerID="77f09cf3038f46f90742b94b696e2242fe98d7d90859e52ef2306ff73bc8eebc" Apr 16 18:56:29.008610 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:29.008571 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77f09cf3038f46f90742b94b696e2242fe98d7d90859e52ef2306ff73bc8eebc"} err="failed to get container status \"77f09cf3038f46f90742b94b696e2242fe98d7d90859e52ef2306ff73bc8eebc\": rpc error: code = NotFound desc = could not find container \"77f09cf3038f46f90742b94b696e2242fe98d7d90859e52ef2306ff73bc8eebc\": container with ID starting with 77f09cf3038f46f90742b94b696e2242fe98d7d90859e52ef2306ff73bc8eebc not found: ID does not exist" Apr 16 18:56:30.508580 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:30.508538 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97a90d5c-e8b3-421d-bba8-75cc3e2d7e32" path="/var/lib/kubelet/pods/97a90d5c-e8b3-421d-bba8-75cc3e2d7e32/volumes" Apr 16 18:56:37.911443 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:37.911356 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t"] Apr 16 18:56:37.911977 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:37.911723 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" podUID="f1ac7c66-cb5a-4c19-b56c-8e797b883e5c" containerName="main" containerID="cri-o://5cab1c4239a39c1cce4e58bdff9b71adc09e0e9a8d530e0f23579ab4849f7d8b" gracePeriod=30 Apr 16 18:56:37.917772 
ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:37.917724 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk"] Apr 16 18:56:37.918188 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:37.918136 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" podUID="c2aca90b-c1f9-4902-bc21-d2d6fdb992b7" containerName="main" containerID="cri-o://6fa46587cc664b3a0d7a9e85bc8437e7e5e60477666b9f6723979a3579955a10" gracePeriod=30 Apr 16 18:56:37.923782 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:37.923734 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-dltw9"] Apr 16 18:56:37.924141 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:37.924128 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97a90d5c-e8b3-421d-bba8-75cc3e2d7e32" containerName="main" Apr 16 18:56:37.924187 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:37.924143 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="97a90d5c-e8b3-421d-bba8-75cc3e2d7e32" containerName="main" Apr 16 18:56:37.924187 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:37.924157 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="71e8f634-277f-41a1-9c62-78279d199716" containerName="storage-initializer" Apr 16 18:56:37.924187 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:37.924163 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="71e8f634-277f-41a1-9c62-78279d199716" containerName="storage-initializer" Apr 16 18:56:37.924187 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:37.924172 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="71e8f634-277f-41a1-9c62-78279d199716" containerName="main" Apr 16 18:56:37.924187 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:37.924178 
2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="71e8f634-277f-41a1-9c62-78279d199716" containerName="main" Apr 16 18:56:37.924187 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:37.924183 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97a90d5c-e8b3-421d-bba8-75cc3e2d7e32" containerName="storage-initializer" Apr 16 18:56:37.924187 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:37.924189 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="97a90d5c-e8b3-421d-bba8-75cc3e2d7e32" containerName="storage-initializer" Apr 16 18:56:37.924400 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:37.924199 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97a90d5c-e8b3-421d-bba8-75cc3e2d7e32" containerName="tokenizer" Apr 16 18:56:37.924400 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:37.924204 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="97a90d5c-e8b3-421d-bba8-75cc3e2d7e32" containerName="tokenizer" Apr 16 18:56:37.924400 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:37.924265 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="71e8f634-277f-41a1-9c62-78279d199716" containerName="main" Apr 16 18:56:37.924400 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:37.924274 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="97a90d5c-e8b3-421d-bba8-75cc3e2d7e32" containerName="tokenizer" Apr 16 18:56:37.924400 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:37.924284 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="97a90d5c-e8b3-421d-bba8-75cc3e2d7e32" containerName="main" Apr 16 18:56:37.929278 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:37.929249 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-dltw9" Apr 16 18:56:37.932736 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:37.932711 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-2-openshift-default-dockercfg-gdbp5\"" Apr 16 18:56:37.943379 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:37.943348 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-dltw9"] Apr 16 18:56:37.992080 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:37.992050 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/0866b228-7117-4c13-9d4a-1ca7ba74346b-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-dltw9\" (UID: \"0866b228-7117-4c13-9d4a-1ca7ba74346b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-dltw9" Apr 16 18:56:37.992080 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:37.992082 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/0866b228-7117-4c13-9d4a-1ca7ba74346b-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-dltw9\" (UID: \"0866b228-7117-4c13-9d4a-1ca7ba74346b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-dltw9" Apr 16 18:56:37.992293 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:37.992117 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/0866b228-7117-4c13-9d4a-1ca7ba74346b-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-dltw9\" (UID: \"0866b228-7117-4c13-9d4a-1ca7ba74346b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-dltw9" Apr 16 
18:56:37.992293 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:37.992180 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/0866b228-7117-4c13-9d4a-1ca7ba74346b-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-dltw9\" (UID: \"0866b228-7117-4c13-9d4a-1ca7ba74346b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-dltw9" Apr 16 18:56:37.992293 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:37.992206 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/0866b228-7117-4c13-9d4a-1ca7ba74346b-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-dltw9\" (UID: \"0866b228-7117-4c13-9d4a-1ca7ba74346b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-dltw9" Apr 16 18:56:37.992293 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:37.992230 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/0866b228-7117-4c13-9d4a-1ca7ba74346b-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-dltw9\" (UID: \"0866b228-7117-4c13-9d4a-1ca7ba74346b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-dltw9" Apr 16 18:56:37.992293 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:37.992251 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/0866b228-7117-4c13-9d4a-1ca7ba74346b-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-dltw9\" (UID: \"0866b228-7117-4c13-9d4a-1ca7ba74346b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-dltw9" Apr 16 18:56:37.992293 ip-10-0-140-154 kubenswrapper[2576]: I0416 
18:56:37.992271 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf6c2\" (UniqueName: \"kubernetes.io/projected/0866b228-7117-4c13-9d4a-1ca7ba74346b-kube-api-access-sf6c2\") pod \"router-gateway-2-openshift-default-6866b85949-dltw9\" (UID: \"0866b228-7117-4c13-9d4a-1ca7ba74346b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-dltw9" Apr 16 18:56:37.992519 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:37.992319 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/0866b228-7117-4c13-9d4a-1ca7ba74346b-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-dltw9\" (UID: \"0866b228-7117-4c13-9d4a-1ca7ba74346b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-dltw9" Apr 16 18:56:38.092965 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:38.092922 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/0866b228-7117-4c13-9d4a-1ca7ba74346b-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-dltw9\" (UID: \"0866b228-7117-4c13-9d4a-1ca7ba74346b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-dltw9" Apr 16 18:56:38.092965 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:38.092967 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/0866b228-7117-4c13-9d4a-1ca7ba74346b-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-dltw9\" (UID: \"0866b228-7117-4c13-9d4a-1ca7ba74346b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-dltw9" Apr 16 18:56:38.093237 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:38.092986 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-sf6c2\" (UniqueName: \"kubernetes.io/projected/0866b228-7117-4c13-9d4a-1ca7ba74346b-kube-api-access-sf6c2\") pod \"router-gateway-2-openshift-default-6866b85949-dltw9\" (UID: \"0866b228-7117-4c13-9d4a-1ca7ba74346b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-dltw9" Apr 16 18:56:38.093237 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:38.093005 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/0866b228-7117-4c13-9d4a-1ca7ba74346b-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-dltw9\" (UID: \"0866b228-7117-4c13-9d4a-1ca7ba74346b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-dltw9" Apr 16 18:56:38.093237 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:38.093060 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/0866b228-7117-4c13-9d4a-1ca7ba74346b-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-dltw9\" (UID: \"0866b228-7117-4c13-9d4a-1ca7ba74346b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-dltw9" Apr 16 18:56:38.093237 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:38.093075 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/0866b228-7117-4c13-9d4a-1ca7ba74346b-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-dltw9\" (UID: \"0866b228-7117-4c13-9d4a-1ca7ba74346b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-dltw9" Apr 16 18:56:38.093237 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:38.093105 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/0866b228-7117-4c13-9d4a-1ca7ba74346b-istiod-ca-cert\") pod 
\"router-gateway-2-openshift-default-6866b85949-dltw9\" (UID: \"0866b228-7117-4c13-9d4a-1ca7ba74346b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-dltw9" Apr 16 18:56:38.093486 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:38.093282 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/0866b228-7117-4c13-9d4a-1ca7ba74346b-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-dltw9\" (UID: \"0866b228-7117-4c13-9d4a-1ca7ba74346b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-dltw9" Apr 16 18:56:38.093486 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:38.093347 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/0866b228-7117-4c13-9d4a-1ca7ba74346b-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-dltw9\" (UID: \"0866b228-7117-4c13-9d4a-1ca7ba74346b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-dltw9" Apr 16 18:56:38.093576 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:38.093507 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/0866b228-7117-4c13-9d4a-1ca7ba74346b-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-dltw9\" (UID: \"0866b228-7117-4c13-9d4a-1ca7ba74346b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-dltw9" Apr 16 18:56:38.093633 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:38.093580 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/0866b228-7117-4c13-9d4a-1ca7ba74346b-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-dltw9\" (UID: \"0866b228-7117-4c13-9d4a-1ca7ba74346b\") " 
pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-dltw9" Apr 16 18:56:38.093709 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:38.093676 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/0866b228-7117-4c13-9d4a-1ca7ba74346b-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-dltw9\" (UID: \"0866b228-7117-4c13-9d4a-1ca7ba74346b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-dltw9" Apr 16 18:56:38.093785 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:38.093682 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/0866b228-7117-4c13-9d4a-1ca7ba74346b-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-dltw9\" (UID: \"0866b228-7117-4c13-9d4a-1ca7ba74346b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-dltw9" Apr 16 18:56:38.093868 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:38.093852 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/0866b228-7117-4c13-9d4a-1ca7ba74346b-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-dltw9\" (UID: \"0866b228-7117-4c13-9d4a-1ca7ba74346b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-dltw9" Apr 16 18:56:38.095817 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:38.095796 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/0866b228-7117-4c13-9d4a-1ca7ba74346b-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-dltw9\" (UID: \"0866b228-7117-4c13-9d4a-1ca7ba74346b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-dltw9" Apr 16 18:56:38.096073 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:38.096054 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/0866b228-7117-4c13-9d4a-1ca7ba74346b-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-dltw9\" (UID: \"0866b228-7117-4c13-9d4a-1ca7ba74346b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-dltw9" Apr 16 18:56:38.102148 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:38.102119 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf6c2\" (UniqueName: \"kubernetes.io/projected/0866b228-7117-4c13-9d4a-1ca7ba74346b-kube-api-access-sf6c2\") pod \"router-gateway-2-openshift-default-6866b85949-dltw9\" (UID: \"0866b228-7117-4c13-9d4a-1ca7ba74346b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-dltw9" Apr 16 18:56:38.102280 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:38.102163 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/0866b228-7117-4c13-9d4a-1ca7ba74346b-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-dltw9\" (UID: \"0866b228-7117-4c13-9d4a-1ca7ba74346b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-dltw9" Apr 16 18:56:38.244466 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:38.244377 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-dltw9" Apr 16 18:56:38.391041 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:38.390966 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-dltw9"] Apr 16 18:56:38.393777 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:56:38.393711 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0866b228_7117_4c13_9d4a_1ca7ba74346b.slice/crio-31b59706944de4daf2717cee5c63ef85c7b0a82d64f4f2913f1506d48d1af3c6 WatchSource:0}: Error finding container 31b59706944de4daf2717cee5c63ef85c7b0a82d64f4f2913f1506d48d1af3c6: Status 404 returned error can't find the container with id 31b59706944de4daf2717cee5c63ef85c7b0a82d64f4f2913f1506d48d1af3c6 Apr 16 18:56:38.396965 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:38.396929 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 18:56:38.397090 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:38.397015 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 18:56:38.397090 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:38.397061 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 18:56:39.026296 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:39.026254 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-dltw9" 
event={"ID":"0866b228-7117-4c13-9d4a-1ca7ba74346b","Type":"ContainerStarted","Data":"6b541e87339361369d120c7e50001e0360a844d8b2729d099884bdf1afa5384d"} Apr 16 18:56:39.026296 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:39.026300 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-dltw9" event={"ID":"0866b228-7117-4c13-9d4a-1ca7ba74346b","Type":"ContainerStarted","Data":"31b59706944de4daf2717cee5c63ef85c7b0a82d64f4f2913f1506d48d1af3c6"} Apr 16 18:56:39.052756 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:39.052676 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-dltw9" podStartSLOduration=2.052649523 podStartE2EDuration="2.052649523s" podCreationTimestamp="2026-04-16 18:56:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:56:39.050137515 +0000 UTC m=+1525.174665176" watchObservedRunningTime="2026-04-16 18:56:39.052649523 +0000 UTC m=+1525.177177247" Apr 16 18:56:39.245188 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:39.245143 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-dltw9" Apr 16 18:56:39.246671 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:39.246637 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-dltw9" podUID="0866b228-7117-4c13-9d4a-1ca7ba74346b" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.134.0.59:15021/healthz/ready\": dial tcp 10.134.0.59:15021: connect: connection refused" Apr 16 18:56:40.245055 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:40.245013 2576 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-dltw9" podUID="0866b228-7117-4c13-9d4a-1ca7ba74346b" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.134.0.59:15021/healthz/ready\": dial tcp 10.134.0.59:15021: connect: connection refused" Apr 16 18:56:41.245717 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:41.245677 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-dltw9" podUID="0866b228-7117-4c13-9d4a-1ca7ba74346b" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.134.0.59:15021/healthz/ready\": dial tcp 10.134.0.59:15021: connect: connection refused" Apr 16 18:56:42.248606 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:42.248574 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-dltw9" Apr 16 18:56:42.249013 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:42.248987 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-dltw9" Apr 16 18:56:42.249599 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:42.249582 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-dltw9" Apr 16 18:56:43.120471 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.120433 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj"] Apr 16 18:56:43.124104 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.124077 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" Apr 16 18:56:43.126828 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.126797 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-dockercfg-r6852\"" Apr 16 18:56:43.127320 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.127302 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 16 18:56:43.136035 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.136010 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj"] Apr 16 18:56:43.141900 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.141864 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88"] Apr 16 18:56:43.145971 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.145943 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" Apr 16 18:56:43.163869 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.163834 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88"] Apr 16 18:56:43.243014 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.242958 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b24b99b6-ff16-48f2-b45b-723c793fa32a-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88\" (UID: \"b24b99b6-ff16-48f2-b45b-723c793fa32a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" Apr 16 18:56:43.243014 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.243018 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtjgq\" (UniqueName: \"kubernetes.io/projected/b24b99b6-ff16-48f2-b45b-723c793fa32a-kube-api-access-jtjgq\") pod \"router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88\" (UID: \"b24b99b6-ff16-48f2-b45b-723c793fa32a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" Apr 16 18:56:43.243266 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.243110 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a670ea36-0015-4862-96d8-f8307904f1d8-tls-certs\") pod \"router-with-refs-pd-test-kserve-6cb9df684f-w8ntj\" (UID: \"a670ea36-0015-4862-96d8-f8307904f1d8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" Apr 16 18:56:43.243266 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.243152 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b24b99b6-ff16-48f2-b45b-723c793fa32a-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88\" (UID: \"b24b99b6-ff16-48f2-b45b-723c793fa32a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" Apr 16 18:56:43.243266 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.243177 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpbqd\" (UniqueName: \"kubernetes.io/projected/a670ea36-0015-4862-96d8-f8307904f1d8-kube-api-access-vpbqd\") pod \"router-with-refs-pd-test-kserve-6cb9df684f-w8ntj\" (UID: \"a670ea36-0015-4862-96d8-f8307904f1d8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" Apr 16 18:56:43.243266 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.243229 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a670ea36-0015-4862-96d8-f8307904f1d8-home\") pod \"router-with-refs-pd-test-kserve-6cb9df684f-w8ntj\" (UID: \"a670ea36-0015-4862-96d8-f8307904f1d8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" Apr 16 18:56:43.243466 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.243281 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b24b99b6-ff16-48f2-b45b-723c793fa32a-home\") pod \"router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88\" (UID: \"b24b99b6-ff16-48f2-b45b-723c793fa32a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" Apr 16 18:56:43.243466 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.243319 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b24b99b6-ff16-48f2-b45b-723c793fa32a-tls-certs\") pod 
\"router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88\" (UID: \"b24b99b6-ff16-48f2-b45b-723c793fa32a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" Apr 16 18:56:43.243466 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.243346 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a670ea36-0015-4862-96d8-f8307904f1d8-model-cache\") pod \"router-with-refs-pd-test-kserve-6cb9df684f-w8ntj\" (UID: \"a670ea36-0015-4862-96d8-f8307904f1d8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" Apr 16 18:56:43.243466 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.243391 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b24b99b6-ff16-48f2-b45b-723c793fa32a-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88\" (UID: \"b24b99b6-ff16-48f2-b45b-723c793fa32a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" Apr 16 18:56:43.243466 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.243422 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a670ea36-0015-4862-96d8-f8307904f1d8-dshm\") pod \"router-with-refs-pd-test-kserve-6cb9df684f-w8ntj\" (UID: \"a670ea36-0015-4862-96d8-f8307904f1d8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" Apr 16 18:56:43.243636 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.243498 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a670ea36-0015-4862-96d8-f8307904f1d8-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-6cb9df684f-w8ntj\" (UID: 
\"a670ea36-0015-4862-96d8-f8307904f1d8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" Apr 16 18:56:43.344820 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.344764 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b24b99b6-ff16-48f2-b45b-723c793fa32a-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88\" (UID: \"b24b99b6-ff16-48f2-b45b-723c793fa32a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" Apr 16 18:56:43.344820 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.344828 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a670ea36-0015-4862-96d8-f8307904f1d8-dshm\") pod \"router-with-refs-pd-test-kserve-6cb9df684f-w8ntj\" (UID: \"a670ea36-0015-4862-96d8-f8307904f1d8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" Apr 16 18:56:43.345356 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.344867 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a670ea36-0015-4862-96d8-f8307904f1d8-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-6cb9df684f-w8ntj\" (UID: \"a670ea36-0015-4862-96d8-f8307904f1d8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" Apr 16 18:56:43.345356 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.344905 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b24b99b6-ff16-48f2-b45b-723c793fa32a-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88\" (UID: \"b24b99b6-ff16-48f2-b45b-723c793fa32a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" Apr 
16 18:56:43.345356 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.345037 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jtjgq\" (UniqueName: \"kubernetes.io/projected/b24b99b6-ff16-48f2-b45b-723c793fa32a-kube-api-access-jtjgq\") pod \"router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88\" (UID: \"b24b99b6-ff16-48f2-b45b-723c793fa32a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" Apr 16 18:56:43.345356 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.345102 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a670ea36-0015-4862-96d8-f8307904f1d8-tls-certs\") pod \"router-with-refs-pd-test-kserve-6cb9df684f-w8ntj\" (UID: \"a670ea36-0015-4862-96d8-f8307904f1d8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" Apr 16 18:56:43.345356 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.345142 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b24b99b6-ff16-48f2-b45b-723c793fa32a-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88\" (UID: \"b24b99b6-ff16-48f2-b45b-723c793fa32a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" Apr 16 18:56:43.345356 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.345167 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vpbqd\" (UniqueName: \"kubernetes.io/projected/a670ea36-0015-4862-96d8-f8307904f1d8-kube-api-access-vpbqd\") pod \"router-with-refs-pd-test-kserve-6cb9df684f-w8ntj\" (UID: \"a670ea36-0015-4862-96d8-f8307904f1d8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" Apr 16 18:56:43.345356 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.345205 2576 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a670ea36-0015-4862-96d8-f8307904f1d8-home\") pod \"router-with-refs-pd-test-kserve-6cb9df684f-w8ntj\" (UID: \"a670ea36-0015-4862-96d8-f8307904f1d8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" Apr 16 18:56:43.345356 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.345217 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b24b99b6-ff16-48f2-b45b-723c793fa32a-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88\" (UID: \"b24b99b6-ff16-48f2-b45b-723c793fa32a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" Apr 16 18:56:43.345356 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.345249 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b24b99b6-ff16-48f2-b45b-723c793fa32a-home\") pod \"router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88\" (UID: \"b24b99b6-ff16-48f2-b45b-723c793fa32a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" Apr 16 18:56:43.345356 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.345260 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b24b99b6-ff16-48f2-b45b-723c793fa32a-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88\" (UID: \"b24b99b6-ff16-48f2-b45b-723c793fa32a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" Apr 16 18:56:43.345356 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.345289 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a670ea36-0015-4862-96d8-f8307904f1d8-kserve-provision-location\") pod 
\"router-with-refs-pd-test-kserve-6cb9df684f-w8ntj\" (UID: \"a670ea36-0015-4862-96d8-f8307904f1d8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" Apr 16 18:56:43.345356 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.345296 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b24b99b6-ff16-48f2-b45b-723c793fa32a-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88\" (UID: \"b24b99b6-ff16-48f2-b45b-723c793fa32a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" Apr 16 18:56:43.345960 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.345515 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b24b99b6-ff16-48f2-b45b-723c793fa32a-home\") pod \"router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88\" (UID: \"b24b99b6-ff16-48f2-b45b-723c793fa32a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" Apr 16 18:56:43.345960 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.345559 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a670ea36-0015-4862-96d8-f8307904f1d8-model-cache\") pod \"router-with-refs-pd-test-kserve-6cb9df684f-w8ntj\" (UID: \"a670ea36-0015-4862-96d8-f8307904f1d8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" Apr 16 18:56:43.345960 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.345674 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a670ea36-0015-4862-96d8-f8307904f1d8-home\") pod \"router-with-refs-pd-test-kserve-6cb9df684f-w8ntj\" (UID: \"a670ea36-0015-4862-96d8-f8307904f1d8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" Apr 16 18:56:43.345960 
ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.345874 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a670ea36-0015-4862-96d8-f8307904f1d8-model-cache\") pod \"router-with-refs-pd-test-kserve-6cb9df684f-w8ntj\" (UID: \"a670ea36-0015-4862-96d8-f8307904f1d8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" Apr 16 18:56:43.347512 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.347478 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b24b99b6-ff16-48f2-b45b-723c793fa32a-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88\" (UID: \"b24b99b6-ff16-48f2-b45b-723c793fa32a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" Apr 16 18:56:43.347674 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.347649 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a670ea36-0015-4862-96d8-f8307904f1d8-tls-certs\") pod \"router-with-refs-pd-test-kserve-6cb9df684f-w8ntj\" (UID: \"a670ea36-0015-4862-96d8-f8307904f1d8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" Apr 16 18:56:43.347790 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.347699 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a670ea36-0015-4862-96d8-f8307904f1d8-dshm\") pod \"router-with-refs-pd-test-kserve-6cb9df684f-w8ntj\" (UID: \"a670ea36-0015-4862-96d8-f8307904f1d8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" Apr 16 18:56:43.348028 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.348007 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b24b99b6-ff16-48f2-b45b-723c793fa32a-tls-certs\") pod 
\"router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88\" (UID: \"b24b99b6-ff16-48f2-b45b-723c793fa32a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" Apr 16 18:56:43.354437 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.354409 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtjgq\" (UniqueName: \"kubernetes.io/projected/b24b99b6-ff16-48f2-b45b-723c793fa32a-kube-api-access-jtjgq\") pod \"router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88\" (UID: \"b24b99b6-ff16-48f2-b45b-723c793fa32a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" Apr 16 18:56:43.354551 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.354449 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpbqd\" (UniqueName: \"kubernetes.io/projected/a670ea36-0015-4862-96d8-f8307904f1d8-kube-api-access-vpbqd\") pod \"router-with-refs-pd-test-kserve-6cb9df684f-w8ntj\" (UID: \"a670ea36-0015-4862-96d8-f8307904f1d8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" Apr 16 18:56:43.435496 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.435455 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" Apr 16 18:56:43.460115 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.460076 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" Apr 16 18:56:43.593618 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.593490 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj"] Apr 16 18:56:43.594831 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:56:43.594803 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda670ea36_0015_4862_96d8_f8307904f1d8.slice/crio-61eb7646a3c51816efc8cea13e65d8d67139d2d61a479578d36b8a9043b337d0 WatchSource:0}: Error finding container 61eb7646a3c51816efc8cea13e65d8d67139d2d61a479578d36b8a9043b337d0: Status 404 returned error can't find the container with id 61eb7646a3c51816efc8cea13e65d8d67139d2d61a479578d36b8a9043b337d0 Apr 16 18:56:43.616393 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:43.616367 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88"] Apr 16 18:56:43.618553 ip-10-0-140-154 kubenswrapper[2576]: W0416 18:56:43.618526 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb24b99b6_ff16_48f2_b45b_723c793fa32a.slice/crio-c419a06455128ffaa4f691780114b82d7c7f5e10399bb6e693a217feac7306ad WatchSource:0}: Error finding container c419a06455128ffaa4f691780114b82d7c7f5e10399bb6e693a217feac7306ad: Status 404 returned error can't find the container with id c419a06455128ffaa4f691780114b82d7c7f5e10399bb6e693a217feac7306ad Apr 16 18:56:44.050521 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:44.050437 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" 
event={"ID":"b24b99b6-ff16-48f2-b45b-723c793fa32a","Type":"ContainerStarted","Data":"177915605fc3a02415fd0bff82ba361a6a2bbc0639050a93e49ed29729dff286"} Apr 16 18:56:44.050521 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:44.050469 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" event={"ID":"b24b99b6-ff16-48f2-b45b-723c793fa32a","Type":"ContainerStarted","Data":"c419a06455128ffaa4f691780114b82d7c7f5e10399bb6e693a217feac7306ad"} Apr 16 18:56:44.051830 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:44.051797 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" event={"ID":"a670ea36-0015-4862-96d8-f8307904f1d8","Type":"ContainerStarted","Data":"e328363462c06fb12811d6ca02f0b21cfc22d5bbe8ea88de886974e1e6d8484b"} Apr 16 18:56:44.051937 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:44.051836 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" event={"ID":"a670ea36-0015-4862-96d8-f8307904f1d8","Type":"ContainerStarted","Data":"61eb7646a3c51816efc8cea13e65d8d67139d2d61a479578d36b8a9043b337d0"} Apr 16 18:56:45.058596 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:45.058547 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" event={"ID":"a670ea36-0015-4862-96d8-f8307904f1d8","Type":"ContainerStarted","Data":"e25e4f22f3b52906635ae8be432cff0819d68ca794d056468e015df06e7ce875"} Apr 16 18:56:45.059098 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:45.058899 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" Apr 16 18:56:49.079020 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:49.078987 2576 generic.go:358] "Generic (PLEG): container finished" 
podID="a670ea36-0015-4862-96d8-f8307904f1d8" containerID="e25e4f22f3b52906635ae8be432cff0819d68ca794d056468e015df06e7ce875" exitCode=0 Apr 16 18:56:49.079468 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:49.079070 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" event={"ID":"a670ea36-0015-4862-96d8-f8307904f1d8","Type":"ContainerDied","Data":"e25e4f22f3b52906635ae8be432cff0819d68ca794d056468e015df06e7ce875"} Apr 16 18:56:49.080561 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:49.080536 2576 generic.go:358] "Generic (PLEG): container finished" podID="b24b99b6-ff16-48f2-b45b-723c793fa32a" containerID="177915605fc3a02415fd0bff82ba361a6a2bbc0639050a93e49ed29729dff286" exitCode=0 Apr 16 18:56:49.080680 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:49.080589 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" event={"ID":"b24b99b6-ff16-48f2-b45b-723c793fa32a","Type":"ContainerDied","Data":"177915605fc3a02415fd0bff82ba361a6a2bbc0639050a93e49ed29729dff286"} Apr 16 18:56:50.087661 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:50.087621 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" event={"ID":"b24b99b6-ff16-48f2-b45b-723c793fa32a","Type":"ContainerStarted","Data":"ea6fe1d1269b04409a8706e29b7fab7c417fdb1a1e7a2db1c1abf8a1f7da4863"} Apr 16 18:56:50.089799 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:50.089769 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" event={"ID":"a670ea36-0015-4862-96d8-f8307904f1d8","Type":"ContainerStarted","Data":"0310a76ac6de0ca4e6d3415bd25449c048c02a8651db05394c8dec92b3636ac4"} Apr 16 18:56:50.113014 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:50.112947 2576 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" podStartSLOduration=7.112930819 podStartE2EDuration="7.112930819s" podCreationTimestamp="2026-04-16 18:56:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:56:50.108841616 +0000 UTC m=+1536.233369279" watchObservedRunningTime="2026-04-16 18:56:50.112930819 +0000 UTC m=+1536.237458485" Apr 16 18:56:50.131013 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:50.130953 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" podStartSLOduration=7.13093177 podStartE2EDuration="7.13093177s" podCreationTimestamp="2026-04-16 18:56:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:56:50.129955408 +0000 UTC m=+1536.254483114" watchObservedRunningTime="2026-04-16 18:56:50.13093177 +0000 UTC m=+1536.255459436" Apr 16 18:56:53.436080 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:53.436040 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" Apr 16 18:56:53.436080 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:53.436088 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" Apr 16 18:56:53.437560 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:53.437485 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" podUID="a670ea36-0015-4862-96d8-f8307904f1d8" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused" Apr 16 
18:56:53.449439 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:53.449402 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" Apr 16 18:56:53.461161 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:53.461129 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" Apr 16 18:56:53.461161 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:53.461174 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" Apr 16 18:56:53.463013 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:56:53.462971 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" podUID="b24b99b6-ff16-48f2-b45b-723c793fa32a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused" Apr 16 18:57:03.436697 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:03.436646 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" podUID="a670ea36-0015-4862-96d8-f8307904f1d8" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused" Apr 16 18:57:03.461141 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:03.461092 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" podUID="b24b99b6-ff16-48f2-b45b-723c793fa32a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused" Apr 16 18:57:07.918583 ip-10-0-140-154 kubenswrapper[2576]: 
I0416 18:57:07.918495 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" podUID="c2aca90b-c1f9-4902-bc21-d2d6fdb992b7" containerName="llm-d-routing-sidecar" containerID="cri-o://c7c9531344e9629619fc8783aed021702cc9fe1ba7592c8011080106c4d8c52b" gracePeriod=2 Apr 16 18:57:08.178304 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.178165 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-996756845-ljdqk_c2aca90b-c1f9-4902-bc21-d2d6fdb992b7/main/0.log" Apr 16 18:57:08.179122 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.178910 2576 generic.go:358] "Generic (PLEG): container finished" podID="c2aca90b-c1f9-4902-bc21-d2d6fdb992b7" containerID="6fa46587cc664b3a0d7a9e85bc8437e7e5e60477666b9f6723979a3579955a10" exitCode=137 Apr 16 18:57:08.179122 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.178941 2576 generic.go:358] "Generic (PLEG): container finished" podID="c2aca90b-c1f9-4902-bc21-d2d6fdb992b7" containerID="c7c9531344e9629619fc8783aed021702cc9fe1ba7592c8011080106c4d8c52b" exitCode=0 Apr 16 18:57:08.179122 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.179043 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" event={"ID":"c2aca90b-c1f9-4902-bc21-d2d6fdb992b7","Type":"ContainerDied","Data":"6fa46587cc664b3a0d7a9e85bc8437e7e5e60477666b9f6723979a3579955a10"} Apr 16 18:57:08.179122 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.179074 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" event={"ID":"c2aca90b-c1f9-4902-bc21-d2d6fdb992b7","Type":"ContainerDied","Data":"c7c9531344e9629619fc8783aed021702cc9fe1ba7592c8011080106c4d8c52b"} Apr 16 18:57:08.181597 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.181426 2576 
generic.go:358] "Generic (PLEG): container finished" podID="f1ac7c66-cb5a-4c19-b56c-8e797b883e5c" containerID="5cab1c4239a39c1cce4e58bdff9b71adc09e0e9a8d530e0f23579ab4849f7d8b" exitCode=137 Apr 16 18:57:08.181597 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.181508 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" event={"ID":"f1ac7c66-cb5a-4c19-b56c-8e797b883e5c","Type":"ContainerDied","Data":"5cab1c4239a39c1cce4e58bdff9b71adc09e0e9a8d530e0f23579ab4849f7d8b"} Apr 16 18:57:08.327786 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.327718 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" Apr 16 18:57:08.341259 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.341231 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-996756845-ljdqk_c2aca90b-c1f9-4902-bc21-d2d6fdb992b7/main/0.log" Apr 16 18:57:08.342078 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.342057 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" Apr 16 18:57:08.392183 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.392154 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f1ac7c66-cb5a-4c19-b56c-8e797b883e5c-tls-certs\") pod \"f1ac7c66-cb5a-4c19-b56c-8e797b883e5c\" (UID: \"f1ac7c66-cb5a-4c19-b56c-8e797b883e5c\") " Apr 16 18:57:08.392391 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.392196 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f1ac7c66-cb5a-4c19-b56c-8e797b883e5c-home\") pod \"f1ac7c66-cb5a-4c19-b56c-8e797b883e5c\" (UID: \"f1ac7c66-cb5a-4c19-b56c-8e797b883e5c\") " Apr 16 18:57:08.392391 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.392224 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c2aca90b-c1f9-4902-bc21-d2d6fdb992b7-tls-certs\") pod \"c2aca90b-c1f9-4902-bc21-d2d6fdb992b7\" (UID: \"c2aca90b-c1f9-4902-bc21-d2d6fdb992b7\") " Apr 16 18:57:08.392391 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.392278 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c2aca90b-c1f9-4902-bc21-d2d6fdb992b7-model-cache\") pod \"c2aca90b-c1f9-4902-bc21-d2d6fdb992b7\" (UID: \"c2aca90b-c1f9-4902-bc21-d2d6fdb992b7\") " Apr 16 18:57:08.392391 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.392315 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ts4nd\" (UniqueName: \"kubernetes.io/projected/c2aca90b-c1f9-4902-bc21-d2d6fdb992b7-kube-api-access-ts4nd\") pod \"c2aca90b-c1f9-4902-bc21-d2d6fdb992b7\" (UID: \"c2aca90b-c1f9-4902-bc21-d2d6fdb992b7\") " Apr 16 18:57:08.392391 ip-10-0-140-154 kubenswrapper[2576]: I0416 
18:57:08.392357 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f1ac7c66-cb5a-4c19-b56c-8e797b883e5c-model-cache\") pod \"f1ac7c66-cb5a-4c19-b56c-8e797b883e5c\" (UID: \"f1ac7c66-cb5a-4c19-b56c-8e797b883e5c\") " Apr 16 18:57:08.392391 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.392380 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f1ac7c66-cb5a-4c19-b56c-8e797b883e5c-dshm\") pod \"f1ac7c66-cb5a-4c19-b56c-8e797b883e5c\" (UID: \"f1ac7c66-cb5a-4c19-b56c-8e797b883e5c\") " Apr 16 18:57:08.392713 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.392426 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdrkz\" (UniqueName: \"kubernetes.io/projected/f1ac7c66-cb5a-4c19-b56c-8e797b883e5c-kube-api-access-pdrkz\") pod \"f1ac7c66-cb5a-4c19-b56c-8e797b883e5c\" (UID: \"f1ac7c66-cb5a-4c19-b56c-8e797b883e5c\") " Apr 16 18:57:08.392713 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.392460 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2aca90b-c1f9-4902-bc21-d2d6fdb992b7-kserve-provision-location\") pod \"c2aca90b-c1f9-4902-bc21-d2d6fdb992b7\" (UID: \"c2aca90b-c1f9-4902-bc21-d2d6fdb992b7\") " Apr 16 18:57:08.392713 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.392485 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c2aca90b-c1f9-4902-bc21-d2d6fdb992b7-dshm\") pod \"c2aca90b-c1f9-4902-bc21-d2d6fdb992b7\" (UID: \"c2aca90b-c1f9-4902-bc21-d2d6fdb992b7\") " Apr 16 18:57:08.392713 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.392611 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/f1ac7c66-cb5a-4c19-b56c-8e797b883e5c-kserve-provision-location\") pod \"f1ac7c66-cb5a-4c19-b56c-8e797b883e5c\" (UID: \"f1ac7c66-cb5a-4c19-b56c-8e797b883e5c\") " Apr 16 18:57:08.392713 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.392639 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c2aca90b-c1f9-4902-bc21-d2d6fdb992b7-home\") pod \"c2aca90b-c1f9-4902-bc21-d2d6fdb992b7\" (UID: \"c2aca90b-c1f9-4902-bc21-d2d6fdb992b7\") " Apr 16 18:57:08.392713 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.392647 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1ac7c66-cb5a-4c19-b56c-8e797b883e5c-home" (OuterVolumeSpecName: "home") pod "f1ac7c66-cb5a-4c19-b56c-8e797b883e5c" (UID: "f1ac7c66-cb5a-4c19-b56c-8e797b883e5c"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:57:08.393019 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.392942 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f1ac7c66-cb5a-4c19-b56c-8e797b883e5c-home\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:57:08.393019 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.392983 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1ac7c66-cb5a-4c19-b56c-8e797b883e5c-model-cache" (OuterVolumeSpecName: "model-cache") pod "f1ac7c66-cb5a-4c19-b56c-8e797b883e5c" (UID: "f1ac7c66-cb5a-4c19-b56c-8e797b883e5c"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:57:08.395251 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.393162 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2aca90b-c1f9-4902-bc21-d2d6fdb992b7-model-cache" (OuterVolumeSpecName: "model-cache") pod "c2aca90b-c1f9-4902-bc21-d2d6fdb992b7" (UID: "c2aca90b-c1f9-4902-bc21-d2d6fdb992b7"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:57:08.395251 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.393436 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2aca90b-c1f9-4902-bc21-d2d6fdb992b7-home" (OuterVolumeSpecName: "home") pod "c2aca90b-c1f9-4902-bc21-d2d6fdb992b7" (UID: "c2aca90b-c1f9-4902-bc21-d2d6fdb992b7"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:57:08.397554 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.397372 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1ac7c66-cb5a-4c19-b56c-8e797b883e5c-kube-api-access-pdrkz" (OuterVolumeSpecName: "kube-api-access-pdrkz") pod "f1ac7c66-cb5a-4c19-b56c-8e797b883e5c" (UID: "f1ac7c66-cb5a-4c19-b56c-8e797b883e5c"). InnerVolumeSpecName "kube-api-access-pdrkz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:57:08.397554 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.397488 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2aca90b-c1f9-4902-bc21-d2d6fdb992b7-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c2aca90b-c1f9-4902-bc21-d2d6fdb992b7" (UID: "c2aca90b-c1f9-4902-bc21-d2d6fdb992b7"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:57:08.397847 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.397647 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1ac7c66-cb5a-4c19-b56c-8e797b883e5c-dshm" (OuterVolumeSpecName: "dshm") pod "f1ac7c66-cb5a-4c19-b56c-8e797b883e5c" (UID: "f1ac7c66-cb5a-4c19-b56c-8e797b883e5c"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:57:08.397847 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.397708 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2aca90b-c1f9-4902-bc21-d2d6fdb992b7-dshm" (OuterVolumeSpecName: "dshm") pod "c2aca90b-c1f9-4902-bc21-d2d6fdb992b7" (UID: "c2aca90b-c1f9-4902-bc21-d2d6fdb992b7"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:57:08.398176 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.398132 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2aca90b-c1f9-4902-bc21-d2d6fdb992b7-kube-api-access-ts4nd" (OuterVolumeSpecName: "kube-api-access-ts4nd") pod "c2aca90b-c1f9-4902-bc21-d2d6fdb992b7" (UID: "c2aca90b-c1f9-4902-bc21-d2d6fdb992b7"). InnerVolumeSpecName "kube-api-access-ts4nd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:57:08.399100 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.399066 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1ac7c66-cb5a-4c19-b56c-8e797b883e5c-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f1ac7c66-cb5a-4c19-b56c-8e797b883e5c" (UID: "f1ac7c66-cb5a-4c19-b56c-8e797b883e5c"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:57:08.476550 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.476501 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1ac7c66-cb5a-4c19-b56c-8e797b883e5c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f1ac7c66-cb5a-4c19-b56c-8e797b883e5c" (UID: "f1ac7c66-cb5a-4c19-b56c-8e797b883e5c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:57:08.481083 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.481051 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2aca90b-c1f9-4902-bc21-d2d6fdb992b7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c2aca90b-c1f9-4902-bc21-d2d6fdb992b7" (UID: "c2aca90b-c1f9-4902-bc21-d2d6fdb992b7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:57:08.493589 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.493553 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c2aca90b-c1f9-4902-bc21-d2d6fdb992b7-model-cache\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:57:08.493589 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.493584 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ts4nd\" (UniqueName: \"kubernetes.io/projected/c2aca90b-c1f9-4902-bc21-d2d6fdb992b7-kube-api-access-ts4nd\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:57:08.493836 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.493599 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f1ac7c66-cb5a-4c19-b56c-8e797b883e5c-model-cache\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 
18:57:08.493836 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.493611 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f1ac7c66-cb5a-4c19-b56c-8e797b883e5c-dshm\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:57:08.493836 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.493622 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pdrkz\" (UniqueName: \"kubernetes.io/projected/f1ac7c66-cb5a-4c19-b56c-8e797b883e5c-kube-api-access-pdrkz\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:57:08.493836 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.493637 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2aca90b-c1f9-4902-bc21-d2d6fdb992b7-kserve-provision-location\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:57:08.493836 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.493649 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c2aca90b-c1f9-4902-bc21-d2d6fdb992b7-dshm\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:57:08.493836 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.493663 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1ac7c66-cb5a-4c19-b56c-8e797b883e5c-kserve-provision-location\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:57:08.493836 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.493676 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c2aca90b-c1f9-4902-bc21-d2d6fdb992b7-home\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:57:08.493836 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.493688 2576 reconciler_common.go:299] "Volume detached for 
volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f1ac7c66-cb5a-4c19-b56c-8e797b883e5c-tls-certs\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:57:08.493836 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:08.493700 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c2aca90b-c1f9-4902-bc21-d2d6fdb992b7-tls-certs\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:57:09.187492 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:09.187445 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" event={"ID":"f1ac7c66-cb5a-4c19-b56c-8e797b883e5c","Type":"ContainerDied","Data":"b9a5bb598c096fcbe37130739f1802424004f95949cb411289cb9d14eabc32cd"} Apr 16 18:57:09.187492 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:09.187486 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t" Apr 16 18:57:09.188666 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:09.187507 2576 scope.go:117] "RemoveContainer" containerID="5cab1c4239a39c1cce4e58bdff9b71adc09e0e9a8d530e0f23579ab4849f7d8b" Apr 16 18:57:09.189804 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:09.189637 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-996756845-ljdqk_c2aca90b-c1f9-4902-bc21-d2d6fdb992b7/main/0.log" Apr 16 18:57:09.190872 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:09.190843 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" event={"ID":"c2aca90b-c1f9-4902-bc21-d2d6fdb992b7","Type":"ContainerDied","Data":"5d404b1d631a95f8515dddd51341e425e6f40265bea965185759fc762b45341a"} Apr 16 18:57:09.190986 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:09.190916 2576 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk" Apr 16 18:57:09.213000 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:09.212963 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t"] Apr 16 18:57:09.214834 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:09.214814 2576 scope.go:117] "RemoveContainer" containerID="c835358d5140a8b00b0fa20d0ebda004ff67634e0ce17902c1be153279aad22d" Apr 16 18:57:09.215085 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:09.215055 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7577b8f6f8-w5g9t"] Apr 16 18:57:09.230978 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:09.230571 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk"] Apr 16 18:57:09.233376 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:09.233348 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-996756845-ljdqk"] Apr 16 18:57:09.271639 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:09.271615 2576 scope.go:117] "RemoveContainer" containerID="6fa46587cc664b3a0d7a9e85bc8437e7e5e60477666b9f6723979a3579955a10" Apr 16 18:57:09.298346 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:09.298320 2576 scope.go:117] "RemoveContainer" containerID="36e1fd131f1db01e0e74404c8960937bc7c31f2772a15d28f2618cacfcfcfd56" Apr 16 18:57:09.365526 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:09.365501 2576 scope.go:117] "RemoveContainer" containerID="c7c9531344e9629619fc8783aed021702cc9fe1ba7592c8011080106c4d8c52b" Apr 16 18:57:10.507025 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:10.506992 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="c2aca90b-c1f9-4902-bc21-d2d6fdb992b7" path="/var/lib/kubelet/pods/c2aca90b-c1f9-4902-bc21-d2d6fdb992b7/volumes" Apr 16 18:57:10.507485 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:10.507472 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1ac7c66-cb5a-4c19-b56c-8e797b883e5c" path="/var/lib/kubelet/pods/f1ac7c66-cb5a-4c19-b56c-8e797b883e5c/volumes" Apr 16 18:57:13.436206 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:13.436144 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" podUID="a670ea36-0015-4862-96d8-f8307904f1d8" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused" Apr 16 18:57:13.461179 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:13.461135 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" podUID="b24b99b6-ff16-48f2-b45b-723c793fa32a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused" Apr 16 18:57:23.436897 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:23.436846 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" podUID="a670ea36-0015-4862-96d8-f8307904f1d8" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused" Apr 16 18:57:23.461439 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:23.461400 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" podUID="b24b99b6-ff16-48f2-b45b-723c793fa32a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: 
connect: connection refused" Apr 16 18:57:33.436634 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:33.436572 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" podUID="a670ea36-0015-4862-96d8-f8307904f1d8" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused" Apr 16 18:57:33.460936 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:33.460882 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" podUID="b24b99b6-ff16-48f2-b45b-723c793fa32a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused" Apr 16 18:57:43.436019 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:43.435969 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" podUID="a670ea36-0015-4862-96d8-f8307904f1d8" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused" Apr 16 18:57:43.461020 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:43.460974 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" podUID="b24b99b6-ff16-48f2-b45b-723c793fa32a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused" Apr 16 18:57:53.436114 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:53.436070 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" podUID="a670ea36-0015-4862-96d8-f8307904f1d8" containerName="main" probeResult="failure" output="Get 
\"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused" Apr 16 18:57:53.462988 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:57:53.462935 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" podUID="b24b99b6-ff16-48f2-b45b-723c793fa32a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused" Apr 16 18:58:03.435981 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:58:03.435874 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" podUID="a670ea36-0015-4862-96d8-f8307904f1d8" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused" Apr 16 18:58:03.460625 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:58:03.460587 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" podUID="b24b99b6-ff16-48f2-b45b-723c793fa32a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused" Apr 16 18:58:13.436812 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:58:13.436765 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" podUID="a670ea36-0015-4862-96d8-f8307904f1d8" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused" Apr 16 18:58:13.460628 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:58:13.460587 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" podUID="b24b99b6-ff16-48f2-b45b-723c793fa32a" 
containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused" Apr 16 18:58:23.436573 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:58:23.436499 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" podUID="a670ea36-0015-4862-96d8-f8307904f1d8" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused" Apr 16 18:58:23.461303 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:58:23.461264 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" podUID="b24b99b6-ff16-48f2-b45b-723c793fa32a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused" Apr 16 18:58:33.436788 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:58:33.436721 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" podUID="a670ea36-0015-4862-96d8-f8307904f1d8" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused" Apr 16 18:58:33.461379 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:58:33.461322 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" podUID="b24b99b6-ff16-48f2-b45b-723c793fa32a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused" Apr 16 18:58:43.436613 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:58:43.436559 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" 
podUID="a670ea36-0015-4862-96d8-f8307904f1d8" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused" Apr 16 18:58:43.460940 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:58:43.460894 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" podUID="b24b99b6-ff16-48f2-b45b-723c793fa32a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused" Apr 16 18:58:53.436627 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:58:53.436563 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" podUID="a670ea36-0015-4862-96d8-f8307904f1d8" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused" Apr 16 18:58:53.460659 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:58:53.460623 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" podUID="b24b99b6-ff16-48f2-b45b-723c793fa32a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused" Apr 16 18:59:03.436387 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:03.436338 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" podUID="a670ea36-0015-4862-96d8-f8307904f1d8" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused" Apr 16 18:59:03.460977 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:03.460929 2576 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" podUID="b24b99b6-ff16-48f2-b45b-723c793fa32a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused" Apr 16 18:59:13.446245 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:13.446214 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" Apr 16 18:59:13.458562 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:13.458537 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" Apr 16 18:59:13.470535 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:13.470506 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" Apr 16 18:59:13.479202 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:13.479168 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" Apr 16 18:59:25.444097 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:25.444061 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88"] Apr 16 18:59:25.444686 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:25.444453 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" podUID="b24b99b6-ff16-48f2-b45b-723c793fa32a" containerName="main" containerID="cri-o://ea6fe1d1269b04409a8706e29b7fab7c417fdb1a1e7a2db1c1abf8a1f7da4863" gracePeriod=30 Apr 16 18:59:25.448714 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:25.448681 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj"] Apr 16 18:59:25.449073 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:25.449030 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" podUID="a670ea36-0015-4862-96d8-f8307904f1d8" containerName="main" containerID="cri-o://0310a76ac6de0ca4e6d3415bd25449c048c02a8651db05394c8dec92b3636ac4" gracePeriod=30 Apr 16 18:59:40.929151 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:40.929076 2576 ???:1] "http2: server: error reading preface from client 10.0.137.47:38474: read tcp 10.0.140.154:10250->10.0.137.47:38474: read: connection reset by peer" Apr 16 18:59:40.934625 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:40.934589 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-7zf2t_432c6f05-8a90-4233-b480-3acc0d695596/istio-proxy/0.log" Apr 16 18:59:40.960566 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:40.960525 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-dltw9_0866b228-7117-4c13-9d4a-1ca7ba74346b/istio-proxy/0.log" Apr 16 18:59:40.985916 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:40.985884 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/main/0.log" Apr 16 18:59:40.993579 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:40.993552 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/llm-d-routing-sidecar/0.log" Apr 16 18:59:41.004654 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:41.004620 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/storage-initializer/0.log" Apr 16 18:59:41.025418 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:41.025378 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88_b24b99b6-ff16-48f2-b45b-723c793fa32a/main/0.log" Apr 16 18:59:41.032555 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:41.032529 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88_b24b99b6-ff16-48f2-b45b-723c793fa32a/storage-initializer/0.log" Apr 16 18:59:42.011486 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:42.011461 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-7zf2t_432c6f05-8a90-4233-b480-3acc0d695596/istio-proxy/0.log" Apr 16 18:59:42.025795 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:42.025769 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-dltw9_0866b228-7117-4c13-9d4a-1ca7ba74346b/istio-proxy/0.log" Apr 16 18:59:42.051950 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:42.051911 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/main/0.log" Apr 16 18:59:42.059314 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:42.059292 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/llm-d-routing-sidecar/0.log" Apr 16 18:59:42.069205 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:42.069178 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/storage-initializer/0.log" Apr 16 18:59:42.090667 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:42.090634 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88_b24b99b6-ff16-48f2-b45b-723c793fa32a/main/0.log" Apr 16 18:59:42.097930 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:42.097896 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88_b24b99b6-ff16-48f2-b45b-723c793fa32a/storage-initializer/0.log" Apr 16 18:59:43.054542 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:43.054498 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-7zf2t_432c6f05-8a90-4233-b480-3acc0d695596/istio-proxy/0.log" Apr 16 18:59:43.068487 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:43.068463 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-dltw9_0866b228-7117-4c13-9d4a-1ca7ba74346b/istio-proxy/0.log" Apr 16 18:59:43.093729 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:43.093684 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/main/0.log" Apr 16 18:59:43.101508 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:43.101483 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/llm-d-routing-sidecar/0.log" Apr 16 18:59:43.112214 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:43.112179 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/storage-initializer/0.log" Apr 16 18:59:43.132466 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:43.132438 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88_b24b99b6-ff16-48f2-b45b-723c793fa32a/main/0.log" Apr 16 18:59:43.139301 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:43.139276 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88_b24b99b6-ff16-48f2-b45b-723c793fa32a/storage-initializer/0.log" Apr 16 18:59:44.054393 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:44.054357 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-7zf2t_432c6f05-8a90-4233-b480-3acc0d695596/istio-proxy/0.log" Apr 16 18:59:44.071436 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:44.071407 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-dltw9_0866b228-7117-4c13-9d4a-1ca7ba74346b/istio-proxy/0.log" Apr 16 18:59:44.096977 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:44.096943 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/main/0.log" Apr 16 18:59:44.104206 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:44.104180 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/llm-d-routing-sidecar/0.log" Apr 16 18:59:44.114966 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:44.114943 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/storage-initializer/0.log" Apr 16 18:59:44.136454 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:44.136425 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88_b24b99b6-ff16-48f2-b45b-723c793fa32a/main/0.log" Apr 16 18:59:44.143034 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:44.143009 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88_b24b99b6-ff16-48f2-b45b-723c793fa32a/storage-initializer/0.log" Apr 16 18:59:45.070780 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:45.070735 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-7zf2t_432c6f05-8a90-4233-b480-3acc0d695596/istio-proxy/0.log" Apr 16 18:59:45.085232 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:45.085206 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-dltw9_0866b228-7117-4c13-9d4a-1ca7ba74346b/istio-proxy/0.log" Apr 16 18:59:45.111434 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:45.111396 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/main/0.log" Apr 16 18:59:45.123562 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:45.123531 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/llm-d-routing-sidecar/0.log" Apr 16 18:59:45.135342 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:45.135317 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/storage-initializer/0.log" Apr 16 18:59:45.155627 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:45.155603 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88_b24b99b6-ff16-48f2-b45b-723c793fa32a/main/0.log" Apr 16 18:59:45.161564 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:45.161543 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88_b24b99b6-ff16-48f2-b45b-723c793fa32a/storage-initializer/0.log" Apr 16 18:59:46.084127 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:46.084099 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-7zf2t_432c6f05-8a90-4233-b480-3acc0d695596/istio-proxy/0.log" Apr 16 18:59:46.099725 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:46.099699 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-dltw9_0866b228-7117-4c13-9d4a-1ca7ba74346b/istio-proxy/0.log" Apr 16 18:59:46.125387 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:46.125360 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/main/0.log" Apr 16 18:59:46.133507 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:46.133482 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/llm-d-routing-sidecar/0.log" Apr 16 18:59:46.145433 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:46.145408 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/storage-initializer/0.log" Apr 16 18:59:46.168610 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:46.168580 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88_b24b99b6-ff16-48f2-b45b-723c793fa32a/main/0.log" Apr 16 18:59:46.176575 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:46.176551 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88_b24b99b6-ff16-48f2-b45b-723c793fa32a/storage-initializer/0.log" Apr 16 18:59:47.118166 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:47.118132 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-7zf2t_432c6f05-8a90-4233-b480-3acc0d695596/istio-proxy/0.log" Apr 16 18:59:47.132301 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:47.132271 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-dltw9_0866b228-7117-4c13-9d4a-1ca7ba74346b/istio-proxy/0.log" Apr 16 18:59:47.160139 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:47.160114 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/main/0.log" Apr 16 18:59:47.168223 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:47.168198 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/llm-d-routing-sidecar/0.log" Apr 16 18:59:47.179292 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:47.179262 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/storage-initializer/0.log" Apr 16 18:59:47.198970 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:47.198941 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88_b24b99b6-ff16-48f2-b45b-723c793fa32a/main/0.log" Apr 16 18:59:47.204946 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:47.204922 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88_b24b99b6-ff16-48f2-b45b-723c793fa32a/storage-initializer/0.log" Apr 16 18:59:48.116499 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:48.116462 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-7zf2t_432c6f05-8a90-4233-b480-3acc0d695596/istio-proxy/0.log" Apr 16 18:59:48.130356 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:48.130332 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-dltw9_0866b228-7117-4c13-9d4a-1ca7ba74346b/istio-proxy/0.log" Apr 16 18:59:48.155539 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:48.155504 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/main/0.log" Apr 16 18:59:48.163540 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:48.163511 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/llm-d-routing-sidecar/0.log" Apr 16 18:59:48.176660 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:48.176636 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/storage-initializer/0.log" Apr 16 18:59:48.199780 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:48.199733 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88_b24b99b6-ff16-48f2-b45b-723c793fa32a/main/0.log" Apr 16 18:59:48.207240 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:48.207207 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88_b24b99b6-ff16-48f2-b45b-723c793fa32a/storage-initializer/0.log" Apr 16 18:59:49.140907 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:49.140876 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-7zf2t_432c6f05-8a90-4233-b480-3acc0d695596/istio-proxy/0.log" Apr 16 18:59:49.154577 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:49.154542 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-dltw9_0866b228-7117-4c13-9d4a-1ca7ba74346b/istio-proxy/0.log" Apr 16 18:59:49.182633 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:49.182559 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/main/0.log" Apr 16 18:59:49.190735 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:49.190710 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/llm-d-routing-sidecar/0.log" Apr 16 18:59:49.201812 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:49.201789 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/storage-initializer/0.log" Apr 16 18:59:49.221532 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:49.221501 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88_b24b99b6-ff16-48f2-b45b-723c793fa32a/main/0.log" Apr 16 18:59:49.227970 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:49.227931 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88_b24b99b6-ff16-48f2-b45b-723c793fa32a/storage-initializer/0.log" Apr 16 18:59:50.162270 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:50.162236 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-7zf2t_432c6f05-8a90-4233-b480-3acc0d695596/istio-proxy/0.log" Apr 16 18:59:50.176422 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:50.176397 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-dltw9_0866b228-7117-4c13-9d4a-1ca7ba74346b/istio-proxy/0.log" Apr 16 18:59:50.204355 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:50.204317 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/main/0.log" Apr 16 18:59:50.211624 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:50.211583 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/llm-d-routing-sidecar/0.log" Apr 16 18:59:50.222408 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:50.222380 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/storage-initializer/0.log" Apr 16 18:59:50.242968 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:50.242930 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88_b24b99b6-ff16-48f2-b45b-723c793fa32a/main/0.log" Apr 16 18:59:50.249799 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:50.249775 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88_b24b99b6-ff16-48f2-b45b-723c793fa32a/storage-initializer/0.log" Apr 16 18:59:51.184838 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:51.184808 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-7zf2t_432c6f05-8a90-4233-b480-3acc0d695596/istio-proxy/0.log" Apr 16 18:59:51.199427 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:51.199394 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-dltw9_0866b228-7117-4c13-9d4a-1ca7ba74346b/istio-proxy/0.log" Apr 16 18:59:51.224755 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:51.224680 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/main/0.log" Apr 16 18:59:51.232301 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:51.232274 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/llm-d-routing-sidecar/0.log" Apr 16 18:59:51.242921 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:51.242893 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/storage-initializer/0.log" Apr 16 18:59:51.264552 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:51.264527 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88_b24b99b6-ff16-48f2-b45b-723c793fa32a/main/0.log" Apr 16 18:59:51.273073 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:51.273048 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88_b24b99b6-ff16-48f2-b45b-723c793fa32a/storage-initializer/0.log" Apr 16 18:59:52.272085 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:52.272044 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-7zf2t_432c6f05-8a90-4233-b480-3acc0d695596/istio-proxy/0.log" Apr 16 18:59:52.285537 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:52.285509 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-dltw9_0866b228-7117-4c13-9d4a-1ca7ba74346b/istio-proxy/0.log" Apr 16 18:59:52.312410 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:52.312378 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/main/0.log" Apr 16 18:59:52.322921 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:52.322899 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/llm-d-routing-sidecar/0.log" Apr 16 18:59:52.332386 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:52.332362 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/storage-initializer/0.log" Apr 16 18:59:52.354169 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:52.354138 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88_b24b99b6-ff16-48f2-b45b-723c793fa32a/main/0.log" Apr 16 18:59:52.361117 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:52.361094 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88_b24b99b6-ff16-48f2-b45b-723c793fa32a/storage-initializer/0.log" Apr 16 18:59:53.340957 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:53.340920 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-7zf2t_432c6f05-8a90-4233-b480-3acc0d695596/istio-proxy/0.log" Apr 16 18:59:53.357136 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:53.357109 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-dltw9_0866b228-7117-4c13-9d4a-1ca7ba74346b/istio-proxy/0.log" Apr 16 18:59:53.388106 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:53.388078 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/main/0.log" Apr 16 18:59:53.395579 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:53.395560 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/llm-d-routing-sidecar/0.log" Apr 16 18:59:53.407289 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:53.407263 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/storage-initializer/0.log" Apr 16 18:59:53.426310 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:53.426277 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88_b24b99b6-ff16-48f2-b45b-723c793fa32a/main/0.log" Apr 16 18:59:53.433657 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:53.433636 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88_b24b99b6-ff16-48f2-b45b-723c793fa32a/storage-initializer/0.log" Apr 16 18:59:54.372762 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:54.372709 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-7zf2t_432c6f05-8a90-4233-b480-3acc0d695596/istio-proxy/0.log" Apr 16 18:59:54.387311 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:54.387285 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-dltw9_0866b228-7117-4c13-9d4a-1ca7ba74346b/istio-proxy/0.log" Apr 16 18:59:54.415154 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:54.415125 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/main/0.log" Apr 16 18:59:54.437089 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:54.437055 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/llm-d-routing-sidecar/0.log" Apr 16 18:59:54.447699 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:54.447673 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/storage-initializer/0.log" Apr 16 18:59:54.467990 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:54.467964 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88_b24b99b6-ff16-48f2-b45b-723c793fa32a/main/0.log" Apr 16 18:59:54.475469 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:54.475432 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88_b24b99b6-ff16-48f2-b45b-723c793fa32a/storage-initializer/0.log" Apr 16 18:59:55.449903 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.449867 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" podUID="a670ea36-0015-4862-96d8-f8307904f1d8" containerName="llm-d-routing-sidecar" containerID="cri-o://e328363462c06fb12811d6ca02f0b21cfc22d5bbe8ea88de886974e1e6d8484b" gracePeriod=2 Apr 16 18:59:55.464508 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.464481 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-fhvkb_406e9ecb-57cc-43df-aaa2-16fd037884da/istio-proxy/0.log" Apr 16 18:59:55.731800 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.731773 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/main/0.log" Apr 16 18:59:55.732398 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.732383 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" Apr 16 18:59:55.753560 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.753524 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" Apr 16 18:59:55.838584 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.838541 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpbqd\" (UniqueName: \"kubernetes.io/projected/a670ea36-0015-4862-96d8-f8307904f1d8-kube-api-access-vpbqd\") pod \"a670ea36-0015-4862-96d8-f8307904f1d8\" (UID: \"a670ea36-0015-4862-96d8-f8307904f1d8\") " Apr 16 18:59:55.838584 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.838585 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a670ea36-0015-4862-96d8-f8307904f1d8-model-cache\") pod \"a670ea36-0015-4862-96d8-f8307904f1d8\" (UID: \"a670ea36-0015-4862-96d8-f8307904f1d8\") " Apr 16 18:59:55.838871 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.838623 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a670ea36-0015-4862-96d8-f8307904f1d8-kserve-provision-location\") pod \"a670ea36-0015-4862-96d8-f8307904f1d8\" (UID: \"a670ea36-0015-4862-96d8-f8307904f1d8\") " Apr 16 18:59:55.838871 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.838791 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a670ea36-0015-4862-96d8-f8307904f1d8-dshm\") pod \"a670ea36-0015-4862-96d8-f8307904f1d8\" (UID: \"a670ea36-0015-4862-96d8-f8307904f1d8\") " Apr 16 18:59:55.838995 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.838885 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume 
started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a670ea36-0015-4862-96d8-f8307904f1d8-tls-certs\") pod \"a670ea36-0015-4862-96d8-f8307904f1d8\" (UID: \"a670ea36-0015-4862-96d8-f8307904f1d8\") " Apr 16 18:59:55.838995 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.838912 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a670ea36-0015-4862-96d8-f8307904f1d8-home\") pod \"a670ea36-0015-4862-96d8-f8307904f1d8\" (UID: \"a670ea36-0015-4862-96d8-f8307904f1d8\") " Apr 16 18:59:55.838995 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.838920 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a670ea36-0015-4862-96d8-f8307904f1d8-model-cache" (OuterVolumeSpecName: "model-cache") pod "a670ea36-0015-4862-96d8-f8307904f1d8" (UID: "a670ea36-0015-4862-96d8-f8307904f1d8"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:59:55.839203 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.839167 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a670ea36-0015-4862-96d8-f8307904f1d8-model-cache\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:59:55.839381 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.839357 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a670ea36-0015-4862-96d8-f8307904f1d8-home" (OuterVolumeSpecName: "home") pod "a670ea36-0015-4862-96d8-f8307904f1d8" (UID: "a670ea36-0015-4862-96d8-f8307904f1d8"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:59:55.840858 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.840835 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a670ea36-0015-4862-96d8-f8307904f1d8-kube-api-access-vpbqd" (OuterVolumeSpecName: "kube-api-access-vpbqd") pod "a670ea36-0015-4862-96d8-f8307904f1d8" (UID: "a670ea36-0015-4862-96d8-f8307904f1d8"). InnerVolumeSpecName "kube-api-access-vpbqd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:59:55.841265 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.841243 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a670ea36-0015-4862-96d8-f8307904f1d8-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "a670ea36-0015-4862-96d8-f8307904f1d8" (UID: "a670ea36-0015-4862-96d8-f8307904f1d8"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:59:55.841332 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.841247 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a670ea36-0015-4862-96d8-f8307904f1d8-dshm" (OuterVolumeSpecName: "dshm") pod "a670ea36-0015-4862-96d8-f8307904f1d8" (UID: "a670ea36-0015-4862-96d8-f8307904f1d8"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:59:55.901065 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.901022 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a670ea36-0015-4862-96d8-f8307904f1d8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a670ea36-0015-4862-96d8-f8307904f1d8" (UID: "a670ea36-0015-4862-96d8-f8307904f1d8"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:59:55.927635 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.927602 2576 generic.go:358] "Generic (PLEG): container finished" podID="b24b99b6-ff16-48f2-b45b-723c793fa32a" containerID="ea6fe1d1269b04409a8706e29b7fab7c417fdb1a1e7a2db1c1abf8a1f7da4863" exitCode=137 Apr 16 18:59:55.927796 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.927688 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" Apr 16 18:59:55.927796 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.927688 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" event={"ID":"b24b99b6-ff16-48f2-b45b-723c793fa32a","Type":"ContainerDied","Data":"ea6fe1d1269b04409a8706e29b7fab7c417fdb1a1e7a2db1c1abf8a1f7da4863"} Apr 16 18:59:55.927796 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.927731 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88" event={"ID":"b24b99b6-ff16-48f2-b45b-723c793fa32a","Type":"ContainerDied","Data":"c419a06455128ffaa4f691780114b82d7c7f5e10399bb6e693a217feac7306ad"} Apr 16 18:59:55.927796 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.927773 2576 scope.go:117] "RemoveContainer" containerID="ea6fe1d1269b04409a8706e29b7fab7c417fdb1a1e7a2db1c1abf8a1f7da4863" Apr 16 18:59:55.929035 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.929018 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6cb9df684f-w8ntj_a670ea36-0015-4862-96d8-f8307904f1d8/main/0.log" Apr 16 18:59:55.929566 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.929547 2576 generic.go:358] "Generic (PLEG): container finished" podID="a670ea36-0015-4862-96d8-f8307904f1d8" 
containerID="0310a76ac6de0ca4e6d3415bd25449c048c02a8651db05394c8dec92b3636ac4" exitCode=137 Apr 16 18:59:55.929566 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.929565 2576 generic.go:358] "Generic (PLEG): container finished" podID="a670ea36-0015-4862-96d8-f8307904f1d8" containerID="e328363462c06fb12811d6ca02f0b21cfc22d5bbe8ea88de886974e1e6d8484b" exitCode=0 Apr 16 18:59:55.929692 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.929574 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" event={"ID":"a670ea36-0015-4862-96d8-f8307904f1d8","Type":"ContainerDied","Data":"0310a76ac6de0ca4e6d3415bd25449c048c02a8651db05394c8dec92b3636ac4"} Apr 16 18:59:55.929692 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.929610 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" event={"ID":"a670ea36-0015-4862-96d8-f8307904f1d8","Type":"ContainerDied","Data":"e328363462c06fb12811d6ca02f0b21cfc22d5bbe8ea88de886974e1e6d8484b"} Apr 16 18:59:55.929692 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.929621 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" event={"ID":"a670ea36-0015-4862-96d8-f8307904f1d8","Type":"ContainerDied","Data":"61eb7646a3c51816efc8cea13e65d8d67139d2d61a479578d36b8a9043b337d0"} Apr 16 18:59:55.929692 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.929621 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj" Apr 16 18:59:55.939891 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.939870 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b24b99b6-ff16-48f2-b45b-723c793fa32a-home\") pod \"b24b99b6-ff16-48f2-b45b-723c793fa32a\" (UID: \"b24b99b6-ff16-48f2-b45b-723c793fa32a\") " Apr 16 18:59:55.940006 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.939924 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b24b99b6-ff16-48f2-b45b-723c793fa32a-dshm\") pod \"b24b99b6-ff16-48f2-b45b-723c793fa32a\" (UID: \"b24b99b6-ff16-48f2-b45b-723c793fa32a\") " Apr 16 18:59:55.940006 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.939945 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b24b99b6-ff16-48f2-b45b-723c793fa32a-kserve-provision-location\") pod \"b24b99b6-ff16-48f2-b45b-723c793fa32a\" (UID: \"b24b99b6-ff16-48f2-b45b-723c793fa32a\") " Apr 16 18:59:55.940006 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.939962 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b24b99b6-ff16-48f2-b45b-723c793fa32a-model-cache\") pod \"b24b99b6-ff16-48f2-b45b-723c793fa32a\" (UID: \"b24b99b6-ff16-48f2-b45b-723c793fa32a\") " Apr 16 18:59:55.940151 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.940007 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtjgq\" (UniqueName: \"kubernetes.io/projected/b24b99b6-ff16-48f2-b45b-723c793fa32a-kube-api-access-jtjgq\") pod \"b24b99b6-ff16-48f2-b45b-723c793fa32a\" (UID: \"b24b99b6-ff16-48f2-b45b-723c793fa32a\") " Apr 16 18:59:55.940151 ip-10-0-140-154 
kubenswrapper[2576]: I0416 18:59:55.940040 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b24b99b6-ff16-48f2-b45b-723c793fa32a-tls-certs\") pod \"b24b99b6-ff16-48f2-b45b-723c793fa32a\" (UID: \"b24b99b6-ff16-48f2-b45b-723c793fa32a\") " Apr 16 18:59:55.940257 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.940233 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b24b99b6-ff16-48f2-b45b-723c793fa32a-model-cache" (OuterVolumeSpecName: "model-cache") pod "b24b99b6-ff16-48f2-b45b-723c793fa32a" (UID: "b24b99b6-ff16-48f2-b45b-723c793fa32a"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:59:55.940310 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.940254 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a670ea36-0015-4862-96d8-f8307904f1d8-dshm\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:59:55.940310 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.940274 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a670ea36-0015-4862-96d8-f8307904f1d8-tls-certs\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:59:55.940310 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.940290 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a670ea36-0015-4862-96d8-f8307904f1d8-home\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:59:55.940310 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.940288 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b24b99b6-ff16-48f2-b45b-723c793fa32a-home" (OuterVolumeSpecName: "home") pod "b24b99b6-ff16-48f2-b45b-723c793fa32a" (UID: 
"b24b99b6-ff16-48f2-b45b-723c793fa32a"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:59:55.940310 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.940304 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vpbqd\" (UniqueName: \"kubernetes.io/projected/a670ea36-0015-4862-96d8-f8307904f1d8-kube-api-access-vpbqd\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:59:55.940557 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.940320 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a670ea36-0015-4862-96d8-f8307904f1d8-kserve-provision-location\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:59:55.942130 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.942104 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b24b99b6-ff16-48f2-b45b-723c793fa32a-kube-api-access-jtjgq" (OuterVolumeSpecName: "kube-api-access-jtjgq") pod "b24b99b6-ff16-48f2-b45b-723c793fa32a" (UID: "b24b99b6-ff16-48f2-b45b-723c793fa32a"). InnerVolumeSpecName "kube-api-access-jtjgq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:59:55.942613 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.942590 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b24b99b6-ff16-48f2-b45b-723c793fa32a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "b24b99b6-ff16-48f2-b45b-723c793fa32a" (UID: "b24b99b6-ff16-48f2-b45b-723c793fa32a"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:59:55.942676 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.942598 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b24b99b6-ff16-48f2-b45b-723c793fa32a-dshm" (OuterVolumeSpecName: "dshm") pod "b24b99b6-ff16-48f2-b45b-723c793fa32a" (UID: "b24b99b6-ff16-48f2-b45b-723c793fa32a"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:59:55.952383 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.952359 2576 scope.go:117] "RemoveContainer" containerID="177915605fc3a02415fd0bff82ba361a6a2bbc0639050a93e49ed29729dff286" Apr 16 18:59:55.954304 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.954279 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj"] Apr 16 18:59:55.959873 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:55.959851 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6cb9df684f-w8ntj"] Apr 16 18:59:56.018352 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:56.018257 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b24b99b6-ff16-48f2-b45b-723c793fa32a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b24b99b6-ff16-48f2-b45b-723c793fa32a" (UID: "b24b99b6-ff16-48f2-b45b-723c793fa32a"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:59:56.031346 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:56.031326 2576 scope.go:117] "RemoveContainer" containerID="ea6fe1d1269b04409a8706e29b7fab7c417fdb1a1e7a2db1c1abf8a1f7da4863" Apr 16 18:59:56.031641 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:59:56.031619 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea6fe1d1269b04409a8706e29b7fab7c417fdb1a1e7a2db1c1abf8a1f7da4863\": container with ID starting with ea6fe1d1269b04409a8706e29b7fab7c417fdb1a1e7a2db1c1abf8a1f7da4863 not found: ID does not exist" containerID="ea6fe1d1269b04409a8706e29b7fab7c417fdb1a1e7a2db1c1abf8a1f7da4863" Apr 16 18:59:56.031703 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:56.031652 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea6fe1d1269b04409a8706e29b7fab7c417fdb1a1e7a2db1c1abf8a1f7da4863"} err="failed to get container status \"ea6fe1d1269b04409a8706e29b7fab7c417fdb1a1e7a2db1c1abf8a1f7da4863\": rpc error: code = NotFound desc = could not find container \"ea6fe1d1269b04409a8706e29b7fab7c417fdb1a1e7a2db1c1abf8a1f7da4863\": container with ID starting with ea6fe1d1269b04409a8706e29b7fab7c417fdb1a1e7a2db1c1abf8a1f7da4863 not found: ID does not exist" Apr 16 18:59:56.031703 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:56.031670 2576 scope.go:117] "RemoveContainer" containerID="177915605fc3a02415fd0bff82ba361a6a2bbc0639050a93e49ed29729dff286" Apr 16 18:59:56.031984 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:59:56.031958 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"177915605fc3a02415fd0bff82ba361a6a2bbc0639050a93e49ed29729dff286\": container with ID starting with 177915605fc3a02415fd0bff82ba361a6a2bbc0639050a93e49ed29729dff286 not found: ID does not exist" 
containerID="177915605fc3a02415fd0bff82ba361a6a2bbc0639050a93e49ed29729dff286" Apr 16 18:59:56.032039 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:56.031992 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"177915605fc3a02415fd0bff82ba361a6a2bbc0639050a93e49ed29729dff286"} err="failed to get container status \"177915605fc3a02415fd0bff82ba361a6a2bbc0639050a93e49ed29729dff286\": rpc error: code = NotFound desc = could not find container \"177915605fc3a02415fd0bff82ba361a6a2bbc0639050a93e49ed29729dff286\": container with ID starting with 177915605fc3a02415fd0bff82ba361a6a2bbc0639050a93e49ed29729dff286 not found: ID does not exist" Apr 16 18:59:56.032039 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:56.032010 2576 scope.go:117] "RemoveContainer" containerID="0310a76ac6de0ca4e6d3415bd25449c048c02a8651db05394c8dec92b3636ac4" Apr 16 18:59:56.041255 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:56.041224 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b24b99b6-ff16-48f2-b45b-723c793fa32a-kserve-provision-location\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:59:56.041255 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:56.041248 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b24b99b6-ff16-48f2-b45b-723c793fa32a-model-cache\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:59:56.041396 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:56.041262 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jtjgq\" (UniqueName: \"kubernetes.io/projected/b24b99b6-ff16-48f2-b45b-723c793fa32a-kube-api-access-jtjgq\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:59:56.041396 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:56.041271 2576 reconciler_common.go:299] "Volume detached for volume 
\"tls-certs\" (UniqueName: \"kubernetes.io/secret/b24b99b6-ff16-48f2-b45b-723c793fa32a-tls-certs\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:59:56.041396 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:56.041279 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b24b99b6-ff16-48f2-b45b-723c793fa32a-home\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:59:56.041396 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:56.041287 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b24b99b6-ff16-48f2-b45b-723c793fa32a-dshm\") on node \"ip-10-0-140-154.ec2.internal\" DevicePath \"\"" Apr 16 18:59:56.053608 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:56.053589 2576 scope.go:117] "RemoveContainer" containerID="e25e4f22f3b52906635ae8be432cff0819d68ca794d056468e015df06e7ce875" Apr 16 18:59:56.125354 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:56.125334 2576 scope.go:117] "RemoveContainer" containerID="e328363462c06fb12811d6ca02f0b21cfc22d5bbe8ea88de886974e1e6d8484b" Apr 16 18:59:56.133467 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:56.133447 2576 scope.go:117] "RemoveContainer" containerID="0310a76ac6de0ca4e6d3415bd25449c048c02a8651db05394c8dec92b3636ac4" Apr 16 18:59:56.133750 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:59:56.133711 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0310a76ac6de0ca4e6d3415bd25449c048c02a8651db05394c8dec92b3636ac4\": container with ID starting with 0310a76ac6de0ca4e6d3415bd25449c048c02a8651db05394c8dec92b3636ac4 not found: ID does not exist" containerID="0310a76ac6de0ca4e6d3415bd25449c048c02a8651db05394c8dec92b3636ac4" Apr 16 18:59:56.133860 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:56.133761 2576 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0310a76ac6de0ca4e6d3415bd25449c048c02a8651db05394c8dec92b3636ac4"} err="failed to get container status \"0310a76ac6de0ca4e6d3415bd25449c048c02a8651db05394c8dec92b3636ac4\": rpc error: code = NotFound desc = could not find container \"0310a76ac6de0ca4e6d3415bd25449c048c02a8651db05394c8dec92b3636ac4\": container with ID starting with 0310a76ac6de0ca4e6d3415bd25449c048c02a8651db05394c8dec92b3636ac4 not found: ID does not exist" Apr 16 18:59:56.133860 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:56.133795 2576 scope.go:117] "RemoveContainer" containerID="e25e4f22f3b52906635ae8be432cff0819d68ca794d056468e015df06e7ce875" Apr 16 18:59:56.134082 ip-10-0-140-154 kubenswrapper[2576]: E0416 18:59:56.134064 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e25e4f22f3b52906635ae8be432cff0819d68ca794d056468e015df06e7ce875\": container with ID starting with e25e4f22f3b52906635ae8be432cff0819d68ca794d056468e015df06e7ce875 not found: ID does not exist" containerID="e25e4f22f3b52906635ae8be432cff0819d68ca794d056468e015df06e7ce875" Apr 16 18:59:56.134137 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:56.134090 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e25e4f22f3b52906635ae8be432cff0819d68ca794d056468e015df06e7ce875"} err="failed to get container status \"e25e4f22f3b52906635ae8be432cff0819d68ca794d056468e015df06e7ce875\": rpc error: code = NotFound desc = could not find container \"e25e4f22f3b52906635ae8be432cff0819d68ca794d056468e015df06e7ce875\": container with ID starting with e25e4f22f3b52906635ae8be432cff0819d68ca794d056468e015df06e7ce875 not found: ID does not exist" Apr 16 18:59:56.134137 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:56.134104 2576 scope.go:117] "RemoveContainer" containerID="e328363462c06fb12811d6ca02f0b21cfc22d5bbe8ea88de886974e1e6d8484b" Apr 16 18:59:56.134322 ip-10-0-140-154 
kubenswrapper[2576]: E0416 18:59:56.134303 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e328363462c06fb12811d6ca02f0b21cfc22d5bbe8ea88de886974e1e6d8484b\": container with ID starting with e328363462c06fb12811d6ca02f0b21cfc22d5bbe8ea88de886974e1e6d8484b not found: ID does not exist" containerID="e328363462c06fb12811d6ca02f0b21cfc22d5bbe8ea88de886974e1e6d8484b" Apr 16 18:59:56.134392 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:56.134330 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e328363462c06fb12811d6ca02f0b21cfc22d5bbe8ea88de886974e1e6d8484b"} err="failed to get container status \"e328363462c06fb12811d6ca02f0b21cfc22d5bbe8ea88de886974e1e6d8484b\": rpc error: code = NotFound desc = could not find container \"e328363462c06fb12811d6ca02f0b21cfc22d5bbe8ea88de886974e1e6d8484b\": container with ID starting with e328363462c06fb12811d6ca02f0b21cfc22d5bbe8ea88de886974e1e6d8484b not found: ID does not exist" Apr 16 18:59:56.134392 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:56.134360 2576 scope.go:117] "RemoveContainer" containerID="0310a76ac6de0ca4e6d3415bd25449c048c02a8651db05394c8dec92b3636ac4" Apr 16 18:59:56.134578 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:56.134559 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0310a76ac6de0ca4e6d3415bd25449c048c02a8651db05394c8dec92b3636ac4"} err="failed to get container status \"0310a76ac6de0ca4e6d3415bd25449c048c02a8651db05394c8dec92b3636ac4\": rpc error: code = NotFound desc = could not find container \"0310a76ac6de0ca4e6d3415bd25449c048c02a8651db05394c8dec92b3636ac4\": container with ID starting with 0310a76ac6de0ca4e6d3415bd25449c048c02a8651db05394c8dec92b3636ac4 not found: ID does not exist" Apr 16 18:59:56.134624 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:56.134579 2576 scope.go:117] "RemoveContainer" 
containerID="e25e4f22f3b52906635ae8be432cff0819d68ca794d056468e015df06e7ce875" Apr 16 18:59:56.134847 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:56.134829 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e25e4f22f3b52906635ae8be432cff0819d68ca794d056468e015df06e7ce875"} err="failed to get container status \"e25e4f22f3b52906635ae8be432cff0819d68ca794d056468e015df06e7ce875\": rpc error: code = NotFound desc = could not find container \"e25e4f22f3b52906635ae8be432cff0819d68ca794d056468e015df06e7ce875\": container with ID starting with e25e4f22f3b52906635ae8be432cff0819d68ca794d056468e015df06e7ce875 not found: ID does not exist" Apr 16 18:59:56.134911 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:56.134848 2576 scope.go:117] "RemoveContainer" containerID="e328363462c06fb12811d6ca02f0b21cfc22d5bbe8ea88de886974e1e6d8484b" Apr 16 18:59:56.135085 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:56.135066 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e328363462c06fb12811d6ca02f0b21cfc22d5bbe8ea88de886974e1e6d8484b"} err="failed to get container status \"e328363462c06fb12811d6ca02f0b21cfc22d5bbe8ea88de886974e1e6d8484b\": rpc error: code = NotFound desc = could not find container \"e328363462c06fb12811d6ca02f0b21cfc22d5bbe8ea88de886974e1e6d8484b\": container with ID starting with e328363462c06fb12811d6ca02f0b21cfc22d5bbe8ea88de886974e1e6d8484b not found: ID does not exist" Apr 16 18:59:56.249931 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:56.249899 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88"] Apr 16 18:59:56.256027 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:56.256001 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-64dfc65446-pdj88"] Apr 16 18:59:56.310362 ip-10-0-140-154 
kubenswrapper[2576]: I0416 18:59:56.310278 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-fhvkb_406e9ecb-57cc-43df-aaa2-16fd037884da/istio-proxy/0.log"
Apr 16 18:59:56.506100 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:56.506065 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a670ea36-0015-4862-96d8-f8307904f1d8" path="/var/lib/kubelet/pods/a670ea36-0015-4862-96d8-f8307904f1d8/volumes"
Apr 16 18:59:56.506519 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:56.506508 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b24b99b6-ff16-48f2-b45b-723c793fa32a" path="/var/lib/kubelet/pods/b24b99b6-ff16-48f2-b45b-723c793fa32a/volumes"
Apr 16 18:59:57.126127 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:57.126080 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-fqp2f_757f884b-57ab-45d5-8be4-13b354df0096/manager/0.log"
Apr 16 18:59:57.137791 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:57.137760 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-pvggb_065cdea5-5e54-4008-b182-d1a8e64b8540/manager/0.log"
Apr 16 18:59:57.199169 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:57.199135 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-84cp9_af2ed6e9-5dfb-40b4-a482-f542057abd67/limitador/0.log"
Apr 16 18:59:57.215410 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:57.215384 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-q9cqp_0a59c445-4607-4fc4-ab27-7a8903901fd4/manager/0.log"
Apr 16 18:59:59.598568 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:59.598529 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-djjwc/must-gather-8z587"]
Apr 16 18:59:59.599039 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:59.598962 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2aca90b-c1f9-4902-bc21-d2d6fdb992b7" containerName="storage-initializer"
Apr 16 18:59:59.599039 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:59.598976 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2aca90b-c1f9-4902-bc21-d2d6fdb992b7" containerName="storage-initializer"
Apr 16 18:59:59.599039 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:59.598986 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1ac7c66-cb5a-4c19-b56c-8e797b883e5c" containerName="main"
Apr 16 18:59:59.599039 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:59.598994 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ac7c66-cb5a-4c19-b56c-8e797b883e5c" containerName="main"
Apr 16 18:59:59.599039 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:59.599005 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a670ea36-0015-4862-96d8-f8307904f1d8" containerName="llm-d-routing-sidecar"
Apr 16 18:59:59.599039 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:59.599013 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a670ea36-0015-4862-96d8-f8307904f1d8" containerName="llm-d-routing-sidecar"
Apr 16 18:59:59.599039 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:59.599021 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1ac7c66-cb5a-4c19-b56c-8e797b883e5c" containerName="storage-initializer"
Apr 16 18:59:59.599039 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:59.599027 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ac7c66-cb5a-4c19-b56c-8e797b883e5c" containerName="storage-initializer"
Apr 16 18:59:59.599039 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:59.599035 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2aca90b-c1f9-4902-bc21-d2d6fdb992b7" containerName="llm-d-routing-sidecar"
Apr 16 18:59:59.599039 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:59.599041 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2aca90b-c1f9-4902-bc21-d2d6fdb992b7" containerName="llm-d-routing-sidecar"
Apr 16 18:59:59.599365 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:59.599052 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b24b99b6-ff16-48f2-b45b-723c793fa32a" containerName="main"
Apr 16 18:59:59.599365 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:59.599057 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b24b99b6-ff16-48f2-b45b-723c793fa32a" containerName="main"
Apr 16 18:59:59.599365 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:59.599066 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a670ea36-0015-4862-96d8-f8307904f1d8" containerName="main"
Apr 16 18:59:59.599365 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:59.599073 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a670ea36-0015-4862-96d8-f8307904f1d8" containerName="main"
Apr 16 18:59:59.599365 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:59.599087 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b24b99b6-ff16-48f2-b45b-723c793fa32a" containerName="storage-initializer"
Apr 16 18:59:59.599365 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:59.599095 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b24b99b6-ff16-48f2-b45b-723c793fa32a" containerName="storage-initializer"
Apr 16 18:59:59.599365 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:59.599101 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a670ea36-0015-4862-96d8-f8307904f1d8" containerName="storage-initializer"
Apr 16 18:59:59.599365 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:59.599105 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a670ea36-0015-4862-96d8-f8307904f1d8" containerName="storage-initializer"
Apr 16 18:59:59.599365 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:59.599113 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2aca90b-c1f9-4902-bc21-d2d6fdb992b7" containerName="main"
Apr 16 18:59:59.599365 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:59.599117 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2aca90b-c1f9-4902-bc21-d2d6fdb992b7" containerName="main"
Apr 16 18:59:59.599365 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:59.599189 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a670ea36-0015-4862-96d8-f8307904f1d8" containerName="main"
Apr 16 18:59:59.599365 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:59.599198 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f1ac7c66-cb5a-4c19-b56c-8e797b883e5c" containerName="main"
Apr 16 18:59:59.599365 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:59.599214 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a670ea36-0015-4862-96d8-f8307904f1d8" containerName="llm-d-routing-sidecar"
Apr 16 18:59:59.599365 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:59.599222 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b24b99b6-ff16-48f2-b45b-723c793fa32a" containerName="main"
Apr 16 18:59:59.599365 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:59.599229 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c2aca90b-c1f9-4902-bc21-d2d6fdb992b7" containerName="llm-d-routing-sidecar"
Apr 16 18:59:59.599365 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:59.599237 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c2aca90b-c1f9-4902-bc21-d2d6fdb992b7" containerName="main"
Apr 16 18:59:59.606121 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:59.606102 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-djjwc/must-gather-8z587"
Apr 16 18:59:59.608966 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:59.608943 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-djjwc\"/\"kube-root-ca.crt\""
Apr 16 18:59:59.609310 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:59.609279 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-djjwc\"/\"openshift-service-ca.crt\""
Apr 16 18:59:59.609417 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:59.609400 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-djjwc\"/\"default-dockercfg-bxwpq\""
Apr 16 18:59:59.611151 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:59.611130 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-djjwc/must-gather-8z587"]
Apr 16 18:59:59.772618 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:59.772572 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/81bf4fcd-59b3-4c1d-8d16-f6f6595ef7bb-must-gather-output\") pod \"must-gather-8z587\" (UID: \"81bf4fcd-59b3-4c1d-8d16-f6f6595ef7bb\") " pod="openshift-must-gather-djjwc/must-gather-8z587"
Apr 16 18:59:59.772843 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:59.772632 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrmv9\" (UniqueName: \"kubernetes.io/projected/81bf4fcd-59b3-4c1d-8d16-f6f6595ef7bb-kube-api-access-rrmv9\") pod \"must-gather-8z587\" (UID: \"81bf4fcd-59b3-4c1d-8d16-f6f6595ef7bb\") " pod="openshift-must-gather-djjwc/must-gather-8z587"
Apr 16 18:59:59.873919 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:59.873830 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/81bf4fcd-59b3-4c1d-8d16-f6f6595ef7bb-must-gather-output\") pod \"must-gather-8z587\" (UID: \"81bf4fcd-59b3-4c1d-8d16-f6f6595ef7bb\") " pod="openshift-must-gather-djjwc/must-gather-8z587"
Apr 16 18:59:59.873919 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:59.873886 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rrmv9\" (UniqueName: \"kubernetes.io/projected/81bf4fcd-59b3-4c1d-8d16-f6f6595ef7bb-kube-api-access-rrmv9\") pod \"must-gather-8z587\" (UID: \"81bf4fcd-59b3-4c1d-8d16-f6f6595ef7bb\") " pod="openshift-must-gather-djjwc/must-gather-8z587"
Apr 16 18:59:59.874238 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:59.874217 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/81bf4fcd-59b3-4c1d-8d16-f6f6595ef7bb-must-gather-output\") pod \"must-gather-8z587\" (UID: \"81bf4fcd-59b3-4c1d-8d16-f6f6595ef7bb\") " pod="openshift-must-gather-djjwc/must-gather-8z587"
Apr 16 18:59:59.883446 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:59.883419 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrmv9\" (UniqueName: \"kubernetes.io/projected/81bf4fcd-59b3-4c1d-8d16-f6f6595ef7bb-kube-api-access-rrmv9\") pod \"must-gather-8z587\" (UID: \"81bf4fcd-59b3-4c1d-8d16-f6f6595ef7bb\") " pod="openshift-must-gather-djjwc/must-gather-8z587"
Apr 16 18:59:59.915393 ip-10-0-140-154 kubenswrapper[2576]: I0416 18:59:59.915356 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-djjwc/must-gather-8z587"
Apr 16 19:00:00.054331 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:00.054306 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-djjwc/must-gather-8z587"]
Apr 16 19:00:00.056159 ip-10-0-140-154 kubenswrapper[2576]: W0416 19:00:00.056129 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81bf4fcd_59b3_4c1d_8d16_f6f6595ef7bb.slice/crio-89e3f0049dff3668af7ea9a5c22332af73e5a1c688108120dd9aae74e44a0882 WatchSource:0}: Error finding container 89e3f0049dff3668af7ea9a5c22332af73e5a1c688108120dd9aae74e44a0882: Status 404 returned error can't find the container with id 89e3f0049dff3668af7ea9a5c22332af73e5a1c688108120dd9aae74e44a0882
Apr 16 19:00:00.058003 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:00.057985 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 19:00:00.958571 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:00.958545 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-djjwc/must-gather-8z587" event={"ID":"81bf4fcd-59b3-4c1d-8d16-f6f6595ef7bb","Type":"ContainerStarted","Data":"89e3f0049dff3668af7ea9a5c22332af73e5a1c688108120dd9aae74e44a0882"}
Apr 16 19:00:01.964652 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:01.964616 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-djjwc/must-gather-8z587" event={"ID":"81bf4fcd-59b3-4c1d-8d16-f6f6595ef7bb","Type":"ContainerStarted","Data":"6a2671746d47ea7b796650087af8277f2ff13f58d49db20052a1be56d144a5b1"}
Apr 16 19:00:01.964652 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:01.964655 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-djjwc/must-gather-8z587" event={"ID":"81bf4fcd-59b3-4c1d-8d16-f6f6595ef7bb","Type":"ContainerStarted","Data":"42ff3fe1bbc1c0e0dabf2faa31236df482928dc6c518022d20761c42f3e4dce6"}
Apr 16 19:00:01.984159 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:01.984104 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-djjwc/must-gather-8z587" podStartSLOduration=2.083437116 podStartE2EDuration="2.984087947s" podCreationTimestamp="2026-04-16 18:59:59 +0000 UTC" firstStartedPulling="2026-04-16 19:00:00.058123925 +0000 UTC m=+1726.182651567" lastFinishedPulling="2026-04-16 19:00:00.958774755 +0000 UTC m=+1727.083302398" observedRunningTime="2026-04-16 19:00:01.981984364 +0000 UTC m=+1728.106512044" watchObservedRunningTime="2026-04-16 19:00:01.984087947 +0000 UTC m=+1728.108615609"
Apr 16 19:00:02.731427 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:02.731391 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-gvvsb_a21bddd4-aa6a-4559-aa1a-16b654ad1b17/global-pull-secret-syncer/0.log"
Apr 16 19:00:02.840935 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:02.840852 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-p4gk6_389b3d12-4b63-4bc3-9047-fb35ce314e95/konnectivity-agent/0.log"
Apr 16 19:00:02.916701 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:02.916667 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-140-154.ec2.internal_c2e9dc4277c032ca5bf1e53fbceaa447/haproxy/0.log"
Apr 16 19:00:07.104672 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:07.104635 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-fqp2f_757f884b-57ab-45d5-8be4-13b354df0096/manager/0.log"
Apr 16 19:00:07.136624 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:07.136545 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-pvggb_065cdea5-5e54-4008-b182-d1a8e64b8540/manager/0.log"
Apr 16 19:00:07.230538 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:07.230508 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-84cp9_af2ed6e9-5dfb-40b4-a482-f542057abd67/limitador/0.log"
Apr 16 19:00:07.261236 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:07.261206 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-q9cqp_0a59c445-4607-4fc4-ab27-7a8903901fd4/manager/0.log"
Apr 16 19:00:08.645966 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:08.645934 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nsjd9_07ae8592-c64d-4565-9efc-bc0241db5258/node-exporter/0.log"
Apr 16 19:00:08.670866 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:08.670823 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nsjd9_07ae8592-c64d-4565-9efc-bc0241db5258/kube-rbac-proxy/0.log"
Apr 16 19:00:08.692637 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:08.692604 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nsjd9_07ae8592-c64d-4565-9efc-bc0241db5258/init-textfile/0.log"
Apr 16 19:00:09.277764 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:09.273569 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-d85f58867-lj8hr_d9a09d8f-414f-4cdb-9b64-6912ca4cfff7/telemeter-client/0.log"
Apr 16 19:00:09.302386 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:09.302303 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-d85f58867-lj8hr_d9a09d8f-414f-4cdb-9b64-6912ca4cfff7/reload/0.log"
Apr 16 19:00:09.328008 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:09.327899 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-d85f58867-lj8hr_d9a09d8f-414f-4cdb-9b64-6912ca4cfff7/kube-rbac-proxy/0.log"
Apr 16 19:00:11.750081 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:11.750042 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-djjwc/perf-node-gather-daemonset-phlxk"]
Apr 16 19:00:11.754504 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:11.754478 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-phlxk"
Apr 16 19:00:11.765101 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:11.765068 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-djjwc/perf-node-gather-daemonset-phlxk"]
Apr 16 19:00:11.787981 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:11.787952 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/30b5680c-e494-4a6e-aca0-381bd8c49a78-sys\") pod \"perf-node-gather-daemonset-phlxk\" (UID: \"30b5680c-e494-4a6e-aca0-381bd8c49a78\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-phlxk"
Apr 16 19:00:11.787981 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:11.787986 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/30b5680c-e494-4a6e-aca0-381bd8c49a78-podres\") pod \"perf-node-gather-daemonset-phlxk\" (UID: \"30b5680c-e494-4a6e-aca0-381bd8c49a78\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-phlxk"
Apr 16 19:00:11.788185 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:11.788011 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gjqh\" (UniqueName: \"kubernetes.io/projected/30b5680c-e494-4a6e-aca0-381bd8c49a78-kube-api-access-8gjqh\") pod \"perf-node-gather-daemonset-phlxk\" (UID: \"30b5680c-e494-4a6e-aca0-381bd8c49a78\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-phlxk"
Apr 16 19:00:11.788185 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:11.788097 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/30b5680c-e494-4a6e-aca0-381bd8c49a78-lib-modules\") pod \"perf-node-gather-daemonset-phlxk\" (UID: \"30b5680c-e494-4a6e-aca0-381bd8c49a78\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-phlxk"
Apr 16 19:00:11.788185 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:11.788137 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/30b5680c-e494-4a6e-aca0-381bd8c49a78-proc\") pod \"perf-node-gather-daemonset-phlxk\" (UID: \"30b5680c-e494-4a6e-aca0-381bd8c49a78\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-phlxk"
Apr 16 19:00:11.889215 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:11.889180 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/30b5680c-e494-4a6e-aca0-381bd8c49a78-sys\") pod \"perf-node-gather-daemonset-phlxk\" (UID: \"30b5680c-e494-4a6e-aca0-381bd8c49a78\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-phlxk"
Apr 16 19:00:11.889215 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:11.889219 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/30b5680c-e494-4a6e-aca0-381bd8c49a78-podres\") pod \"perf-node-gather-daemonset-phlxk\" (UID: \"30b5680c-e494-4a6e-aca0-381bd8c49a78\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-phlxk"
Apr 16 19:00:11.889433 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:11.889241 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8gjqh\" (UniqueName: \"kubernetes.io/projected/30b5680c-e494-4a6e-aca0-381bd8c49a78-kube-api-access-8gjqh\") pod \"perf-node-gather-daemonset-phlxk\" (UID: \"30b5680c-e494-4a6e-aca0-381bd8c49a78\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-phlxk"
Apr 16 19:00:11.889433 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:11.889280 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/30b5680c-e494-4a6e-aca0-381bd8c49a78-lib-modules\") pod \"perf-node-gather-daemonset-phlxk\" (UID: \"30b5680c-e494-4a6e-aca0-381bd8c49a78\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-phlxk"
Apr 16 19:00:11.889433 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:11.889311 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/30b5680c-e494-4a6e-aca0-381bd8c49a78-proc\") pod \"perf-node-gather-daemonset-phlxk\" (UID: \"30b5680c-e494-4a6e-aca0-381bd8c49a78\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-phlxk"
Apr 16 19:00:11.889433 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:11.889311 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/30b5680c-e494-4a6e-aca0-381bd8c49a78-sys\") pod \"perf-node-gather-daemonset-phlxk\" (UID: \"30b5680c-e494-4a6e-aca0-381bd8c49a78\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-phlxk"
Apr 16 19:00:11.889433 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:11.889369 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/30b5680c-e494-4a6e-aca0-381bd8c49a78-podres\") pod \"perf-node-gather-daemonset-phlxk\" (UID: \"30b5680c-e494-4a6e-aca0-381bd8c49a78\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-phlxk"
Apr 16 19:00:11.889433 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:11.889409 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/30b5680c-e494-4a6e-aca0-381bd8c49a78-proc\") pod \"perf-node-gather-daemonset-phlxk\" (UID: \"30b5680c-e494-4a6e-aca0-381bd8c49a78\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-phlxk"
Apr 16 19:00:11.889433 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:11.889424 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/30b5680c-e494-4a6e-aca0-381bd8c49a78-lib-modules\") pod \"perf-node-gather-daemonset-phlxk\" (UID: \"30b5680c-e494-4a6e-aca0-381bd8c49a78\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-phlxk"
Apr 16 19:00:11.898681 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:11.898648 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gjqh\" (UniqueName: \"kubernetes.io/projected/30b5680c-e494-4a6e-aca0-381bd8c49a78-kube-api-access-8gjqh\") pod \"perf-node-gather-daemonset-phlxk\" (UID: \"30b5680c-e494-4a6e-aca0-381bd8c49a78\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-phlxk"
Apr 16 19:00:12.066826 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:12.066716 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-phlxk"
Apr 16 19:00:12.223287 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:12.223256 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-djjwc/perf-node-gather-daemonset-phlxk"]
Apr 16 19:00:12.225159 ip-10-0-140-154 kubenswrapper[2576]: W0416 19:00:12.225125 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod30b5680c_e494_4a6e_aca0_381bd8c49a78.slice/crio-ab45fd8ce489f74dcc9af2c44dc90de9098fbb694f5c3cc32eead17f75df5bc4 WatchSource:0}: Error finding container ab45fd8ce489f74dcc9af2c44dc90de9098fbb694f5c3cc32eead17f75df5bc4: Status 404 returned error can't find the container with id ab45fd8ce489f74dcc9af2c44dc90de9098fbb694f5c3cc32eead17f75df5bc4
Apr 16 19:00:12.887428 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:12.887396 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-g54vn_49dc18c9-317a-4547-b453-9328583c7dce/dns/0.log"
Apr 16 19:00:12.909322 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:12.909294 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-g54vn_49dc18c9-317a-4547-b453-9328583c7dce/kube-rbac-proxy/0.log"
Apr 16 19:00:12.995643 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:12.995614 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-8pbpt_412ee020-08a8-48fa-a1be-e9b5ca0d1cb5/dns-node-resolver/0.log"
Apr 16 19:00:13.022613 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:13.022582 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-phlxk" event={"ID":"30b5680c-e494-4a6e-aca0-381bd8c49a78","Type":"ContainerStarted","Data":"2ea3868e7d3140f70306c881d57b742596133c0c76cb9974cd296699ff32c271"}
Apr 16 19:00:13.022613 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:13.022616 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-phlxk" event={"ID":"30b5680c-e494-4a6e-aca0-381bd8c49a78","Type":"ContainerStarted","Data":"ab45fd8ce489f74dcc9af2c44dc90de9098fbb694f5c3cc32eead17f75df5bc4"}
Apr 16 19:00:13.022908 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:13.022674 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-phlxk"
Apr 16 19:00:13.045830 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:13.045775 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-phlxk" podStartSLOduration=2.045730623 podStartE2EDuration="2.045730623s" podCreationTimestamp="2026-04-16 19:00:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:00:13.043030714 +0000 UTC m=+1739.167558380" watchObservedRunningTime="2026-04-16 19:00:13.045730623 +0000 UTC m=+1739.170258314"
Apr 16 19:00:13.559650 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:13.559600 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-bnn4v_5b4ead4a-0df1-4977-9c23-9828175001c0/node-ca/0.log"
Apr 16 19:00:14.469345 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:14.469315 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-fhvkb_406e9ecb-57cc-43df-aaa2-16fd037884da/istio-proxy/0.log"
Apr 16 19:00:14.967758 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:14.967711 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-8wfh2_4bfb19bf-a34c-40ab-8651-16b8064977ed/serve-healthcheck-canary/0.log"
Apr 16 19:00:15.555910 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:15.555877 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fphbb_06adbfd4-9e4a-4840-99c5-dcb8e483a0bd/kube-rbac-proxy/0.log"
Apr 16 19:00:15.578303 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:15.578276 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fphbb_06adbfd4-9e4a-4840-99c5-dcb8e483a0bd/exporter/0.log"
Apr 16 19:00:15.603225 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:15.603179 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fphbb_06adbfd4-9e4a-4840-99c5-dcb8e483a0bd/extractor/0.log"
Apr 16 19:00:18.296314 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:18.296276 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-2fs8s_a3102384-41f4-424e-969d-4e9dc666a9cf/openshift-lws-operator/0.log"
Apr 16 19:00:18.833067 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:18.833038 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-7c68cb4fc8-9b8lj_d0c3a4db-0a29-4ca7-bb64-1a36ca5b4b04/manager/0.log"
Apr 16 19:00:19.040098 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:19.040062 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-phlxk"
Apr 16 19:00:19.355989 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:19.355956 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-wbz6b_f4de7741-e917-4daa-8942-198dcf0f92ea/manager/0.log"
Apr 16 19:00:19.375954 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:19.375923 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-p545z_eeb7972a-7f87-4db5-a779-0dcd6a9804a3/s3-init/0.log"
Apr 16 19:00:19.404443 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:19.404413 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-mh7bj_45d14e57-1e63-45f6-aadc-bb15ad4ff226/seaweedfs/0.log"
Apr 16 19:00:25.518394 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:25.518366 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wn2l4_48a38ebc-2033-4b86-99f7-c22d4b6e6ccc/kube-multus-additional-cni-plugins/0.log"
Apr 16 19:00:25.539398 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:25.539371 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wn2l4_48a38ebc-2033-4b86-99f7-c22d4b6e6ccc/egress-router-binary-copy/0.log"
Apr 16 19:00:25.563890 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:25.563851 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wn2l4_48a38ebc-2033-4b86-99f7-c22d4b6e6ccc/cni-plugins/0.log"
Apr 16 19:00:25.585358 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:25.585327 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wn2l4_48a38ebc-2033-4b86-99f7-c22d4b6e6ccc/bond-cni-plugin/0.log"
Apr 16 19:00:25.605644 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:25.605617 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wn2l4_48a38ebc-2033-4b86-99f7-c22d4b6e6ccc/routeoverride-cni/0.log"
Apr 16 19:00:25.626178 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:25.626150 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wn2l4_48a38ebc-2033-4b86-99f7-c22d4b6e6ccc/whereabouts-cni-bincopy/0.log"
Apr 16 19:00:25.646718 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:25.646682 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wn2l4_48a38ebc-2033-4b86-99f7-c22d4b6e6ccc/whereabouts-cni/0.log"
Apr 16 19:00:25.855061 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:25.854966 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jktqd_a2547dac-9955-4de0-ba29-8ca57b537b69/kube-multus/0.log"
Apr 16 19:00:25.946843 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:25.946809 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-cl74p_697ddfb3-adc9-4a63-b5ca-b4b871946a33/network-metrics-daemon/0.log"
Apr 16 19:00:25.963958 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:25.963929 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-cl74p_697ddfb3-adc9-4a63-b5ca-b4b871946a33/kube-rbac-proxy/0.log"
Apr 16 19:00:27.154598 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:27.154546 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78v92_d9e01863-f0f0-4a5e-935f-50699365f569/ovn-controller/0.log"
Apr 16 19:00:27.188944 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:27.188909 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78v92_d9e01863-f0f0-4a5e-935f-50699365f569/ovn-acl-logging/0.log"
Apr 16 19:00:27.205894 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:27.205852 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78v92_d9e01863-f0f0-4a5e-935f-50699365f569/kube-rbac-proxy-node/0.log"
Apr 16 19:00:27.228603 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:27.228572 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78v92_d9e01863-f0f0-4a5e-935f-50699365f569/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 19:00:27.251183 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:27.251141 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78v92_d9e01863-f0f0-4a5e-935f-50699365f569/northd/0.log"
Apr 16 19:00:27.271852 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:27.271824 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78v92_d9e01863-f0f0-4a5e-935f-50699365f569/nbdb/0.log"
Apr 16 19:00:27.295188 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:27.295133 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78v92_d9e01863-f0f0-4a5e-935f-50699365f569/sbdb/0.log"
Apr 16 19:00:27.514449 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:27.514411 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78v92_d9e01863-f0f0-4a5e-935f-50699365f569/ovnkube-controller/0.log"
Apr 16 19:00:29.024697 ip-10-0-140-154 kubenswrapper[2576]: I0416 19:00:29.024663 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-q67qs_5f3c3f50-8ee2-4775-a3de-64e723a55361/network-check-target-container/0.log"